Modern (Linux) Desktop Environments

Yes. One method, often called “persistence prioritization”, is seen when they do things like the following (a rough sketch follows the list):

  • Promoting items you use a lot to the top of a menu.
  • A “Recent” list that shows your last-used commands, objects, or searches.
  • Offering only the relevant commands or items once you enter a particular mode or context.
  • Good defaults: reusing either the very last settings/options, or previously used ones based on your activity, e.g. different defaults depending on whether you create an email or reply to one.
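
None of this needs anything exotic. Here’s a minimal, hypothetical sketch of the first two bullets (promotion by usage plus a “Recent” list); the class and method names are mine, not any toolkit’s real API:

```python
from collections import Counter, deque

class AdaptiveMenu:
    """Toy menu model: promotes frequently used items and keeps a 'Recent' list.
    Purely illustrative; not based on any real toolkit."""

    def __init__(self, items, promoted_slots=3, recent_size=5):
        self.items = list(items)                  # canonical, stable menu order
        self.usage = Counter()                    # how often each item was chosen
        self.recent = deque(maxlen=recent_size)   # last-used items, newest first
        self.promoted_slots = promoted_slots

    def choose(self, item):
        """Record that the user activated a menu item."""
        self.usage[item] += 1
        if item in self.recent:
            self.recent.remove(item)
        self.recent.appendleft(item)

    def render_order(self):
        """Most-used items float to a small promoted section at the top;
        the rest keep their familiar order underneath."""
        promoted = [i for i, _ in self.usage.most_common(self.promoted_slots)]
        rest = [i for i in self.items if i not in promoted]
        return promoted + rest


menu = AdaptiveMenu(["New", "Open", "Save", "Export", "Print", "Quit"])
for action in ["Open", "Save", "Save", "Export", "Save", "Open"]:
    menu.choose(action)
print(menu.render_order())   # ['Save', 'Open', 'Export', 'New', 'Print', 'Quit']
print(list(menu.recent))     # ['Open', 'Save', 'Export']
```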

I’m expecting machine learning (ML) or statistical analysis to enhance this a bit.
But rearranging or removing items can be very confusing or frustrating to people, so you have to be very careful when you do these things. Netflix and Prime like to thrust things at you that they want you to see, and move stuff around; that certainly makes them a lot more tedious for me to use.

Unfortunately, many UIs are designed by copy-pasting usability patterns from one situation to another that is similar, but just different enough that they no longer make sense, and end up being very frustrating.

And don’t forget Microsoft’s famous Clippy :paperclip: Office Assistant, which tried to guess what you were doing and offer commands for that task. That example doesn’t mean these techniques are bad, just that some were premature.


It would be interesting if someone could come up with a learning UI: one that figures out where people are looking for something and then puts it there. Make sure there is more than one way to do something so users aren’t orphaned.
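
As a rough illustration of “figure out where people look and put it there”, here’s a hypothetical sketch that records which menus a user opens while hunting for a command and then mirrors the command into the place most people expected it, keeping the original entry so nobody is orphaned. All names are made up:

```python
from collections import defaultdict, Counter

class LearningPlacement:
    """Toy sketch: record where users *look* for a command before finding it,
    then also list the command where most people expected it.
    Purely illustrative; not based on any real toolkit."""

    def __init__(self, menus):
        # menus: dict of menu name -> list of commands (the designer's layout)
        self.menus = {name: list(cmds) for name, cmds in menus.items()}
        self.misses = defaultdict(Counter)   # command -> menus searched in vain

    def record_search(self, command, menus_opened, found_in):
        """User opened `menus_opened` (in order) before finding `command` in `found_in`."""
        for menu in menus_opened:
            if menu != found_in:
                self.misses[command][menu] += 1

    def adapt(self, threshold=3):
        """If a command is repeatedly hunted for in the wrong menu, also list it there.
        The original entry stays, so there is always more than one way to reach it."""
        for command, where in self.misses.items():
            menu, count = where.most_common(1)[0]
            if count >= threshold and command not in self.menus[menu]:
                self.menus[menu].append(command)


ui = LearningPlacement({"File": ["New", "Open"], "Tools": ["Export"]})
for _ in range(3):
    ui.record_search("Export", menus_opened=["File", "Tools"], found_in="Tools")
ui.adapt()
print(ui.menus["File"])   # ['New', 'Open', 'Export'] — Export now also appears where people looked
```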

I haven’t thought about this in forever and almost certainly don’t have the code anymore, but back in college (and on Windows 3.1), I tried to do something along these lines by remapping the screen geometry so that the mouse would move slightly faster through areas you generally didn’t go to and slower through higher-traffic areas. It was terrible to use, but that could’ve easily been because of my crappy programming…
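
For what it’s worth, the idea itself is easy to prototype today. Here’s a hypothetical sketch (not the lost original, obviously): keep a coarse “traffic” grid of screen regions and scale raw pointer deltas by the inverse popularity of the region the pointer is in:

```python
import numpy as np

# Toy sketch: slow the pointer in heavily used screen regions,
# speed it up in rarely visited ones. Hypothetical names throughout.

GRID = (8, 6)                 # coarse grid of screen regions (cols, rows)
SCREEN = (1920, 1080)
traffic = np.ones(GRID)       # visit counts per region (start uniform, avoid div-by-zero)

def _region(x, y):
    gx = min(int(x / SCREEN[0] * GRID[0]), GRID[0] - 1)
    gy = min(int(y / SCREEN[1] * GRID[1]), GRID[1] - 1)
    return gx, gy

def record_position(x, y):
    """Accumulate a heat map of where the pointer actually spends time."""
    traffic[_region(x, y)] += 1

def scaled_delta(x, y, dx, dy, base_gain=1.0):
    """Scale a raw mouse delta by the inverse popularity of the current region:
    low-traffic areas get a gain > 1 (pointer flies through them),
    high-traffic areas get a gain < 1 (pointer lingers, easier to hit targets)."""
    gain = base_gain * (traffic.mean() / traffic[_region(x, y)]) ** 0.5  # sqrt keeps it gentle
    return dx * gain, dy * gain
```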

But yeah, modern learning systems that go beyond a Most Recently Used (MRU) list would be great. If they understood context (e.g., at certain times of the day or week, I’m more likely to take certain actions), that’d be even better.
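
A context-aware step beyond a plain MRU list could be as simple as keying the usage counts by weekday and time-of-day bucket. A hypothetical sketch, with made-up names and bucketing:

```python
from collections import defaultdict, Counter
from datetime import datetime

class ContextualMRU:
    """Toy sketch: rank actions by how often they were used in the *current*
    context (weekday + 4-hour bucket), falling back to overall frequency."""

    def __init__(self):
        self.by_context = defaultdict(Counter)   # (weekday, hour_bucket) -> action counts
        self.overall = Counter()

    @staticmethod
    def _context(when):
        return (when.weekday(), when.hour // 4)  # six 4-hour buckets per day

    def record(self, action, when=None):
        when = when or datetime.now()
        self.by_context[self._context(when)][action] += 1
        self.overall[action] += 1

    def suggest(self, n=3, when=None):
        when = when or datetime.now()
        local = self.by_context[self._context(when)]
        ranked = local if local else self.overall   # fall back when context is unseen
        return [action for action, _ in ranked.most_common(n)]


mru = ContextualMRU()
mru.record("Open timesheet", when=datetime(2024, 1, 5, 9))   # Friday morning
mru.record("Open timesheet", when=datetime(2024, 1, 12, 9))  # another Friday morning
print(mru.suggest(when=datetime(2024, 1, 19, 9)))            # ['Open timesheet']
```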

This reminds me a bit of Dasher, which is predictive typing with a pointer. It predates Swype et al. by a long shot, but I think the technique could still be interesting and useful in the right context.

Maybe this is how we’ll all be typing when we’re all wearing AR glasses. Tomorrow’s version of walking off a cliff while staring at a phone.

If someone could come up with a probabilistic UI, that would be interesting.

Great discussion, guys! Valuable thread. Thanks.

I’m thinking a direct neuro interface.

With LXDE.

I am not an interface.


Aim high. You just might get there. LXDE after all!