Widget toolkit

From Wikipedia, the free encyclopedia

{{Short description|Framework or toolkit a program uses to display the graphical user interface}}
{{About||desktop applets that give access to frequently used functions|Widget engine}}
A '''widget toolkit''', '''widget library''', '''GUI toolkit''', or '''UX library''' is a [[library (computing)|library]] or a collection of libraries containing a set of [[graphical control element]]s (called ''widgets'') used to construct the [[graphical user interface]] (GUI) of programs.


Most widget toolkits additionally include their own [[Rendering (computer graphics)|rendering engine]]. This engine can be specific to a certain [[operating system]] or [[windowing system]], or it can contain back-ends to interface with several of them as well as with rendering APIs such as [[OpenGL]], [[OpenVG]], or [[EGL (API)|EGL]].
The [[look and feel]] of the graphical control elements can be hard-coded into the toolkit or decoupled from it, allowing the graphical control elements to be [[Theme (computing)|themed]]/[[Skin (computing)|skinned]].


==Overview==
[[File:SWT-on-mac.png|thumb|A window using the [[Standard Widget Toolkit]]]]
Some toolkits may be used from other languages by employing [[language binding]]s. [[Graphical user interface builder]]s such as [[Glade Interface Designer]] facilitate the authoring of GUIs in a [[WYSIWYG]] manner by employing a [[user interface markup language]], in this case [[GtkBuilder]].
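
For illustration, the following minimal sketch (using GTK 3 and C purely as an example toolkit; the object ids and the embedded UI string are hypothetical, not taken from any particular project) loads such a user interface description with GtkBuilder instead of creating each widget in code:

<syntaxhighlight lang="c">
/* Minimal sketch (GTK 3, C): constructing a GUI from a GtkBuilder
 * UI description rather than creating each widget programmatically.
 * Build with: gcc demo.c `pkg-config --cflags --libs gtk+-3.0` */
#include <gtk/gtk.h>

/* The kind of markup a tool such as Glade would normally generate. */
static const gchar *ui =
    "<interface>"
    "  <object class='GtkWindow' id='main_window'>"
    "    <property name='title'>Builder demo</property>"
    "    <child>"
    "      <object class='GtkButton' id='quit_button'>"
    "        <property name='label'>Quit</property>"
    "      </object>"
    "    </child>"
    "  </object>"
    "</interface>";

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);

    /* Parse the markup, then look up the widgets it describes. */
    GtkBuilder *builder = gtk_builder_new_from_string(ui, -1);
    GtkWidget *window = GTK_WIDGET(gtk_builder_get_object(builder, "main_window"));
    GtkWidget *button = GTK_WIDGET(gtk_builder_get_object(builder, "quit_button"));

    g_signal_connect(button, "clicked", G_CALLBACK(gtk_main_quit), NULL);
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}
</syntaxhighlight>

In practice a builder tool would save this description to a file, which the program would then load with <code>gtk_builder_new_from_file()</code>.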


The GUI of a program is commonly constructed in a cascading manner, with graphical control elements being added directly on top of one another.
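
A minimal sketch of this cascading construction, again using GTK 3 in C as one example toolkit (widget variables and labels are arbitrary):

<syntaxhighlight lang="c">
/* Minimal sketch (GTK 3, C): widgets nested inside container widgets,
 * which are in turn placed inside a top-level window. */
#include <gtk/gtk.h>

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);

    GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *box    = gtk_box_new(GTK_ORIENTATION_VERTICAL, 6);
    GtkWidget *label  = gtk_label_new("A label widget");
    GtkWidget *button = gtk_button_new_with_label("A button widget");

    /* Each widget is added to the widget "beneath" it in the hierarchy. */
    gtk_container_add(GTK_CONTAINER(window), box);
    gtk_container_add(GTK_CONTAINER(box), label);
    gtk_container_add(GTK_CONTAINER(box), button);

    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}
</syntaxhighlight>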


Most widget toolkits use [[event-driven programming]] as a model for interaction.<ref>[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.45.9491 Past, Present and Future of User Interface Software Tools]. Brad Myers, Scott E. Hudson, Randy Pausch, Y Pausch. ACM Transactions on Computer-Human Interaction, 2000. [https://www.cs.cmu.edu/~amulet/papers/futureofhciACM.pdf]</ref> The toolkit handles [[Event handler|user events]], for example when the user clicks on a [[Button (computing)|button]]. When an event is detected, it is passed on to the application where it is dealt with. The design of those toolkits has been criticized for promoting an oversimplified model of event-action, leading programmers to create error-prone, difficult to extend and excessively complex [[application code]].<ref name=Samek03b>{{cite web
| title = Who Moved My State?
| author = Samek, Miro
| url = http://www.ddj.com/cpp/184401643
| publisher = C/C++ Users Journal, The Embedded Angle column
| date = April 2003}}</ref> [[Finite state machine]]s and [[UML state machine|hierarchical state machines]] have been proposed as high-level models to represent the interactive state changes for reactive programs.
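
As an illustration of the event-driven model, the following minimal sketch (GTK 3, C; the handler name is arbitrary) registers a callback that the toolkit's main loop invokes whenever the button's "clicked" event is detected:

<syntaxhighlight lang="c">
/* Minimal sketch (GTK 3, C): event-driven interaction. The toolkit's
 * main loop detects the "clicked" event and invokes the registered
 * callback; the application only supplies the handler. */
#include <gtk/gtk.h>

static void on_button_clicked(GtkWidget *widget, gpointer user_data)
{
    g_print("Button was clicked\n");
}

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);

    GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    GtkWidget *button = gtk_button_new_with_label("Click me");
    gtk_container_add(GTK_CONTAINER(window), button);

    /* Register handlers; they run only when the matching event occurs. */
    g_signal_connect(button, "clicked", G_CALLBACK(on_button_clicked), NULL);
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);

    gtk_widget_show_all(window);
    gtk_main();   /* hand control to the toolkit's event loop */
    return 0;
}
</syntaxhighlight>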


== Windowing systems ==
A [[window (computing)|window]] is considered to be a graphical control element. In some windowing systems, windows are added directly to the [[Canvas (GUI)|scene graph (canvas)]] by the [[window manager]], and can be stacked and layered on top of each other through various means. Each window is associated with a particular application, which controls the widgets added to the window's canvas and can watch and modify them.


==See also==
* [[List of widget toolkits]]
* [[WIMP (computing)]]
* [[Graphical user interface builder]]
* [[Layout manager]]


== References ==
{{Reflist}}


{{Widget toolkits}}
{{Graphical control elements}}
{{X desktop environments and window managers}}


[[Category:Widget toolkits| ]]
