Core Components of a User Interface?

Started by
4 comments, last by Tutorial Doctor 9 years, 4 months ago

I couldn't find a suitable place to post this, so I defaulted to this category.

What would you say are the core components of a user interface? I like to simplify things.

I have been using Editorial to learn Python, but it also has offered me some experience in UI design, and that is what I am focused on now. I really think that a UI can be separated into a set of core modules.

A complete user interface has windows/frames, text fields, buttons, menus (sub-frames), switches, sliders, etc.

Inside of Editorial there is a bunch of preset controls/widgets that you can add in the UI designer and then program, by associating buttons and switches with functions or adding delegates to text fields. In the program, these are associated with classes. The list of core components to work with in Editorial is:

Custom View

A view based on the main View class

All of the following inherit from a View Class:

Label

  • Holds text

Button

  • Triggers functions

Slider

  • Adjusts values (int, float)

Switch

  • Sets boolean values
  • Toggles functions

Segmented Control

  • Toggles views?

Text Field

  • Sets or edits strings

Text View

  • Sets or edits long strings (sentences and paragraphs)

Web View

  • Displays web content (URL)

Table View

  • List dialog

Date Picker

  • Picks dates (mainly for mobile)

Scroll View

  • Scrolls a view, or rather it is a scrollable view

Image View

  • Loads images

Navigation View

  • Navigates between views

Comment

  • Just like a comment in code, except it is like a sticky note in the UI designer
  • It never gets displayed

I suppose that a switch can be replaced with a checkbox in some cases.
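The "everything inherits from a View class" idea above can be sketched in a few lines of Python. This is a hypothetical mock-up of that hierarchy, not Editorial's actual `ui` module; all class and method names here are made up for illustration:

```python
# Hypothetical sketch of the widget hierarchy described above:
# every control inherits from a common View base class.

class View:
    """Base class: a named rectangle that can hold child views."""
    def __init__(self, name=""):
        self.name = name
        self.subviews = []

    def add_subview(self, view):
        self.subviews.append(view)

class Label(View):
    """Holds text."""
    def __init__(self, text=""):
        super().__init__()
        self.text = text

class Button(View):
    """Triggers a function when tapped."""
    def __init__(self, action=None):
        super().__init__()
        self.action = action

    def tap(self):
        if self.action:
            self.action(self)   # pass the sender, callback-style

class Switch(View):
    """Sets a boolean value; toggling can also trigger a function."""
    def __init__(self, value=False):
        super().__init__()
        self.value = value

    def toggle(self):
        self.value = not self.value
```

Because every control is a View, any control can be nested inside any other, which is what makes the designer modular.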

The reason I am asking this is because I really would like to know if there is a program that allows you to program in whichever language you want, but test that code inside of an actual modular User Interface environment. Or for that matter, use it to actually create the program.

This, to me, would be the ideal way to create a full program with user interactivity. It is a way of working I am growing accustomed to (Pythonista has it too).

I am sure there are libraries, but I am thinking of an actual Editor/IDE(maybe), that uses a modular (drag-and-drop) interface to set it all up. (Something simpler than the overgrown IDEs available.)

If I ever get to that level of proficiency, I would certainly work on it.

So, any other key components a UI should have? (It can be classes or actual widgets)

They call me the Tutorial Doctor.


The reason I am asking this is because I really would like to know if there is a program that allows you to program in whichever language you want, but test that code inside of an actual modular User Interface environment. Or for that matter, use it to actually create the program.

I imagine that the main obstacle to this is that, one way or another, this would call for integration with (or even implementation of) each language/engine/whatever that one wanted it to work with--not impossible, but quite a bit of work, I imagine.

Hmm... What I could see working would be a GUI-builder that has its own method of specifying interface elements, and then provided an API for export scripts to work with--that way the problem of integration with various languages/engines/etc. could become community-driven.

As to the main question of the thread, I think that I'd define UI components as a set of "containers" for other UI elements, each potentially with custom logic and responding to mouse- and keyboard- input. A frame is (barring user-defined logic) a simple container; a button is a container with a callback registered for mouse-clicks, as well as logic for producing a "pressed" effect, etc...
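That "container plus input logic" definition can be sketched directly. The following is an illustrative toy model, not any real toolkit's API; all names are invented here:

```python
# Sketch of the "container + input logic" model: a Frame is just a
# container; a PushButton is a container with a click callback and
# a "pressed" effect. Names are illustrative, not a real API.

class UIElement:
    def __init__(self):
        self.children = []
        self.on_click = None   # optional callback for mouse clicks

    def handle_click(self):
        if self.on_click:
            self.on_click(self)

class Frame(UIElement):
    """A plain container: no behavior unless the user adds some."""
    pass

class PushButton(UIElement):
    """A container with a registered click callback."""
    def __init__(self, callback):
        super().__init__()
        self.was_pressed = False
        self.on_click = callback

    def handle_click(self):
        self.was_pressed = True   # stand-in for the visual "pressed" effect
        super().handle_click()
```

Under this model, every widget is just `UIElement` plus whatever input callbacks and custom logic distinguish it, which is also what would make user-defined element types easy to support.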

If one were to create such an editor as you suggest, it might be a good idea to provide a method of defining new types of UI-element, so that users who want some unusual--or simply unforeseen--element can implement it themselves.

MWAHAHAHAHAHAHA!!!

My Twitter Account: @EbornIan

I totally forgot about keyboard input. That makes more sense with desktop apps too. I also forgot to mention custom widgets. Really, all these widgets/controls do is manipulate code interactively. So we have strings, integers, floats, booleans, tuples, lists, dictionaries, etc. to work with.

I don't know if there is a standard widget for working with dictionaries yet. How would one visualize a dictionary? A wall of doors? If it is an associative array, then perhaps it looks something like a list, but how would you visualize the keys? Time for some mockups eh?

I know tuples can be displayed as an ordered pair (a text field with two sub-textfields).
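One way to think about this mapping from data types to widgets is a simple dispatch function. This is pure speculation to match the ideas above (a dict rendered as list-like rows of key label plus value editor, a tuple as one sub-field per element); the widget names are hypothetical:

```python
# Hypothetical mapping from Python value types to widget kinds,
# including the dictionary case discussed above: render a dict as
# a list of (key label, value editor) rows.

def widget_for(value):
    if isinstance(value, bool):          # must check bool before int:
        return "Switch"                  # bool is a subclass of int
    if isinstance(value, (int, float)):
        return "Slider"
    if isinstance(value, str):
        return "TextField"
    if isinstance(value, tuple):
        # ordered pair/tuple: one sub-text-field per element
        return ["TextField" for _ in value]
    if isinstance(value, dict):
        # list-like rows: a label for the key, an editor for the value
        return [("Label", widget_for(v)) for v in value.values()]
    return "CustomView"                  # fall back to a custom widget
```

So a dictionary could look like a two-column table view: keys down one side as labels, with an appropriate editor widget next to each value.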

Perhaps a REPL that can display UI elements, something like Playgrounds, where, inside the REPL, you have the widgets and can drag and drop them? I don't know yet.

But throughout, it seems that whether a programming language is interpreted or compiled, there could be some universal framework for displaying that code in a UI-like environment, no matter the language or API. Like you say, an API for export scripts.

Community driven is ideal also.

They call me the Tutorial Doctor.

What would you say are the core components of a user interface?

Well, in its most general form, a user interface requires the following things:
- A way for a person to give inputs to a system
- A way for the system to receive those inputs and map them to some action to take
- Some sort of feedback about the system state

Those are the core components of a user interface.

The rest is just a question of "how?".

If you think about it, a "user interface" is very generalized. A light switch is a UI. By moving a small lever up and down, you toggle the state of lights. A car steering wheel is another UI. Rotating a wheel device causes the forward wheels of a car to rotate around the Y-axis to a limit. With computers, you've got keyboards, mice, toggles and switches, and a bunch of other hardware IO devices which act as one layer of UI. Then you've got the software UI, which can be anything from controls to gesture recognition. etc, etc.
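The light-switch example boils down to exactly those three core components. Here is a toy model of it (not any real framework), where the input, the input-to-action mapping, and the feedback are each one piece of code:

```python
# The three core UI components above, reduced to code: an input
# source, a mapping from inputs to actions, and feedback about the
# system state. A toy light switch, purely for illustration.

class LightSwitch:
    def __init__(self):
        self.on = False                    # the system state

    def handle_input(self, event):
        # component 2: receive input, map it to an action
        if event == "flip":
            self.on = not self.on
        return self.feedback()

    def feedback(self):
        # component 3: feedback about the system state
        return "light is ON" if self.on else "light is OFF"
```

Whether the input event comes from a physical lever, a mouse click on a checkbox, or a Kinect gesture is just the "how" layered on top.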

Why point this out?

Because if you only think of UI as a bunch of controls in software, you limit your options without realizing it. I mean, hell, you've got games which use the Kinect to figure out your body position so you can use dance moves to interact with a game. How amazing is that UI tech?!

That's a good point, actually; I took the original question to be asking about GUIs, but the term "user interface" covers rather more than that.

MWAHAHAHAHAHAHA!!!

My Twitter Account: @EbornIan

Great info slayeman. I always love broader views, so that I can draw analogies from them to understand how things work. This site always gives varied and complete responses.

Thanks. I wonder how this could be made modular. Of course you have event handlers and listeners...
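The event-handler/listener wiring mentioned here is usually what makes a UI modular: widgets emit named events, and any number of listeners subscribe to them without the widgets knowing who is listening. A minimal sketch of such a registry (all names invented for illustration):

```python
# Minimal event-handler/listener registry: widgets emit named
# events; listeners subscribe by name. Illustrative only.

class EventBus:
    def __init__(self):
        self.listeners = {}   # event name -> list of callbacks

    def subscribe(self, event, callback):
        self.listeners.setdefault(event, []).append(callback)

    def emit(self, event, *args):
        # notify every listener registered for this event name
        for callback in self.listeners.get(event, []):
            callback(*args)
```

A drag-and-drop editor could then be little more than a tool for wiring widget events to user functions through a bus like this.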

I might have to dig back into my archive for this old program I used to use. Some type of ActionScript editor that you didn't have to write code in.

They call me the Tutorial Doctor.

