User Interface Design - Science topic
Explore the latest questions and answers in User Interface Design, and find User Interface Design experts.
Questions related to User Interface Design
What is the best EBSD system available on the market right now, in terms of post-processing software capabilities, stability, performance, user interface, etc.?
ToxTrac is a free Windows program optimized for tracking animals. It uses a robust, very fast computer-vision tracking algorithm that can handle one or several animals in one or several environments. The program provides useful statistics as output. ToxTrac can be used for fish, insects, rodents, etc.
ToxTrac is currently used in dozens of institutions and is one of the best tracking programs available for animal studies.
The project is currently being developed by only one person, but there is a large amount of work to be done, so a call for collaboration is open.
What I need is people with C++ knowledge and expertise in some of the following areas:
• Computer Vision and programming in OpenCV
• Machine Learning (with knowledge of TensorFlow)
• User interface design with Qt
Authorship in all related scientific contributions will be shared.
Thank you for your support and patience.
Contact: o_siyeza@hotmail.com
ToxTrac website: https://toxtrac.sourceforge.io
Instruction video: https://youtu.be/RaVTsQ1JwfM
ToxTrac Guestbook: http://pub47.bravenet.com/guestbook/3993433667
Citations:
• Rodriguez, A., Zhang, H., Klaminder, J., Brodin, T., Andersson, P. L. and Andersson, M. (2018). ToxTrac: a fast and robust software for tracking organisms. Methods in Ecology and Evolution. 9(3):460–464.
• Rodriguez, A., Zhang, H., Klaminder, J., Brodin, T., and Andersson, M. (2017). ToxId: an algorithm to track the identity of multiple animals. Scientific Reports. 7(1):14774.
ToxTrac is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Hello everybody,
I am trying to find literature/studies on the qualitative examination of TAM factors. So far I've found a ton of quantitative surveys, but my approach focuses instead on qualitative in-depth interviews. Specifically, I want to find out whether different cultural backgrounds influence the perception of the same mobile application in different ways.
It seems as if qualitative studies in this area are quite rare, so my second question is whether there is a valid way to convert the quantitative questions into qualitative ones.
Kindly,
Nicole
Design problem: if we have three UI concepts, A, B and C, and we would like to quantify which one is better out of the three, can heuristic evaluation be used? Constraints: only low-fidelity prototypes are available, and the context depends on participants' power of envisioning. Also, how large should the sample be to avoid inter-rater bias?
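One way to gauge whether the evaluator pool is large enough is to measure agreement between raters directly, for example with Fleiss' kappa over their categorical judgments (e.g., severity ratings per usability issue). A minimal sketch in Python; the function name and data layout are illustrative assumptions, not a prescribed method:

```python
from collections import Counter

def fleiss_kappa(ratings):
    """Fleiss' kappa for inter-rater agreement.

    ratings: one list per rated item, each containing the category label
    assigned by every rater (all items rated by the same number of raters).
    Returns 1.0 for perfect agreement, ~0 for chance-level agreement.
    """
    n_items = len(ratings)
    n_raters = len(ratings[0])
    counts = [Counter(item) for item in ratings]  # n_ij per item
    # Per-item observed agreement
    p_i = [(sum(c ** 2 for c in cnt.values()) - n_raters)
           / (n_raters * (n_raters - 1)) for cnt in counts]
    p_bar = sum(p_i) / n_items
    # Chance agreement from overall category proportions
    categories = sorted({c for item in ratings for c in item})
    p_j = [sum(cnt[cat] for cnt in counts) / (n_items * n_raters)
           for cat in categories]
    p_e = sum(p ** 2 for p in p_j)
    return (p_bar - p_e) / (1 - p_e)
```

If kappa stays low after adding evaluators, the heuristics themselves (or the low-fidelity prototypes) may be too ambiguous for a between-concept comparison.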
Are existing modelling languages like ArchiMate and e3value good enough to describe the concept of value co-creation in service-dominant logic?
For example, participants selected user-interface design preferences for a particular system, with the conditions of the experiment kept the same for all participants.
I'm building a prototype.
How could I justify the number of components in my mobile user interface compared with others?
Although I am quite active in the research domain of context-oriented programming technology, I am currently trying to expand my knowledge to cover the research field of context-aware or dynamically adaptive user interface technology. What are the key research papers in this domain that I absolutely should not miss? (For example, there is obviously the work of Joëlle Coutaz et al. on "plasticity of user interfaces". What else?)
I'm trying to get an answer to the question: what is the effect of culture-centric UI in HCI?
I would like to know: if someone designs the UI icons for a particular demographic based on their culture, can we expect any improvement in interaction performance? If so, how much? Could you suggest some papers on this topic?
It is an interface that reflects the user's mental image.
So far I have found the equation below to measure concentration level, but the reference is not solid and I cannot rely on it. Another equation, or a solid reference supporting this one, would be highly appreciated.
concentration level = (SMR + Beta) / Theta
where SMR stands for sensorimotor rhythm.
The question is related to my research and answers can prove helpful.
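For concreteness, the ratio can be computed from EEG band powers. A minimal sketch in Python, assuming the common band definitions (Theta 4–8 Hz, SMR 12–15 Hz, Beta 15–30 Hz) and a plain FFT periodogram; both the band limits and the function names are my assumptions, not from the cited sources:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Average power of `signal` in the [f_lo, f_hi) Hz band (FFT periodogram)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].mean()

def concentration_index(eeg, fs):
    """concentration level = (SMR + Beta) / Theta, per the equation above."""
    theta = band_power(eeg, fs, 4.0, 8.0)    # Theta: 4-8 Hz (assumed)
    smr = band_power(eeg, fs, 12.0, 15.0)    # SMR: 12-15 Hz (assumed)
    beta = band_power(eeg, fs, 15.0, 30.0)   # Beta: 15-30 Hz (assumed)
    return (smr + beta) / theta
```

In practice a windowed estimator such as Welch's method is preferable to a raw periodogram, but the ratio itself is unchanged.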
I'm planning to look into Augmented Reality in Higher Education; could you experts share some tried-and-tested examples? I know there is a lot of free and paid material online, but it would be nice to hear from experience. And by the way, I would like to experiment at the HE level, not at school level. Aurasma is a good tool; is there anything better than Aurasma in terms of interactivity?
Game matrices and trees can be hard to interpret and reason about. This is not only true for people who may be new to game theory (like young experiment participants), but even for those familiar with it. It normally isn't a problem because most lab experiments focus on one or two familiar games and can explain them with arbitrarily great care. But it could be a problem for experiments that use unfamiliar games or that expose participants to many different games in a session.
One way around this is to use practice rounds to give participants experience with the various outcomes, but, additionally, there must be a general intuitive approach to the materials that can present simple games in a way that highlights their different contingencies and incentives.
At present, for a 2x2 game, I imagine I'd use the classic game matrix supplemented with some equivalent textual description: "If you select strategy Left, the other player will either select Top (earning each of you 1 and 4, respectively) or Bottom (earning each of you 2 and 2, respectively). If you select strategy Right ...." But even that isn't so clear, and it wouldn't scale well to larger games; I'm sure there's a better way.
Is anyone familiar with research that tests, or at least uses, some unconventional intuitive format for visually communicating the different outcomes of a range of economic games? Without resorting to "cover story" narratives?
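As an illustration of the textual-description idea, the contingencies of a small game can be generated mechanically from the payoff matrix, which at least keeps the wording consistent across many games. A sketch in Python; the move labels and payoff values are hypothetical:

```python
def describe_game(payoffs, my_moves=("Left", "Right"), their_moves=("Top", "Bottom")):
    """Render each contingency of a 2x2 game as a plain-English sentence.

    payoffs[r][c] = (your_payoff, their_payoff) when you play my_moves[r]
    and the other player plays their_moves[c].
    """
    lines = []
    for r, my_move in enumerate(my_moves):
        for c, their_move in enumerate(their_moves):
            mine, theirs = payoffs[r][c]
            lines.append(f"If you select {my_move} and the other player selects "
                         f"{their_move}, you earn {mine} and they earn {theirs}.")
    return "\n".join(lines)
```

This scales linearly with the number of outcomes rather than requiring hand-written prose per game, though for large games a visual format would still likely beat any text.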
Depending on the project, user experience designers get a strategic definition of a brand, branding goals, value propositions of their "products" or "services", customer focus, and similar material.
But I see an enormous challenge in transferring this into processable insights for the discipline of user experience design. Therefore, I just wonder whether there are any experiences or concrete documents out there that address this type of inquiry.
I am trying to understand the differences between frameworks and tools in the field of usability engineering.
I am particularly interested in methods and tools characterized by the use of some kind of model(s) incorporating the related context of the user and the environment. In the HCI domain, these kinds of methods are known as "user-centred" or "inclusive design" methods.
I will use the simulation to test the effect of different user interface features, such as different visualizations of prognosis information, on the operators’ situation awareness. Which software packages can I use to efficiently and effectively build a simulation environment which represents the control room? The interface will contain a geographic information system (GIS) with information layers, including vessel position tracks and different information windows with detailed information about vessels, traffic management measures, hydro and meteorological information, etc.
We're applying for H2020 funding for assistive technologies so an SME or academic partner would be best.
My intention is to run focus groups, observe people interacting with technology or apps, observe people interacting with people to achieve a goal in order to discover the range of personality constructs that help and hinder users trying to achieve a goal. These constructs can then be used to design a multimodal HMI that supports the interactions between the system and the users. Is anyone interested?
• You'll need an understanding of human factors techniques.
• You'll need an understanding of our end-user population, in this case the elderly.
• You'll need to have an open mind about designing software with personality!
• You'll need an awareness of cross-cultural differences in behaviour; what is acceptable behaviour in France may not be in England! We want our software to be inclusive.
• Experience in developing software to support elderly or disabled users would be great.
I know it is unusual, but I am wondering if there are any collaborative Virtual reality projects you know of in Edmonton, Canada. My main area of interest is spatial awareness and human material interaction. I appreciate your help.
Can you suggest any GUI tool that operates along the same lines as NS2 to simulate mobile ad hoc networks?
Most Tangible User Interfaces (TUIs) are quite complex for common end users to build.
I was wondering if anyone has examples of TUIs that are meant to be built at home or placed "in the wild", thus requiring minimal support?
What would be the most appropriate statistical method to examine the influence of visual complexity on user-experience analysis results for websites?
I am working on adaptive visualization and want to cover as many factors as possible that influence usability, perception, and user performance in user-interaction design.
I particularly liked and studied:
Theory: Deterding et al., Csikszentmihalyi on flow, Koster on fun, some McGonigal; I have heard of Mary-Jo Kim, Werbach, and Lazzaro.
Actual studies: none that I am truly impressed with, meaning studies that deal with what I call complex gamification beyond PBL (points, badges, leaderboards). I am currently following Yu-kai Chou, a self-described pioneer. He has good info.
Any thoughts, ideas, or things about gamification you have found meaningful particularly relating to education?
Several methodologies exist for evaluating the usability of a graphical interface, but which is the most suitable for evaluating a haptic interface?
Such as the technology that the Eclipse Modeling Framework and the Graphical Modeling Framework provide for working with a defined meta-model.
What are the main metrics used to evaluate gesture-based (on-device or in-air) interactions and interfaces? Any papers, examples, books, or other references on this topic would be much appreciated.
Let's say we would like to develop a new centralized agricultural information and application system for farmers from rural communities in developing countries, who are known to be non-IT-savvy. What is the best interface design pattern to follow? Has any study on this been done before? What is your opinion on this suggestion?
Interaction with abstract 3D models of architectural buildings, as interface design matters a lot for this.
Human-Computer Interaction has evolved a lot in recent years, and modern computer systems notably enjoy the benefits of modern HCI tools and technologies. However, when it comes to designing for especially skilled persons (e.g., air force personnel, emergency response teams), I could trace HCI tools, technologies and experiments being heavily utilized for air force personnel but for no one else. I am looking for solid reference(s) where the power of HCI was/is used in designing user interfaces (interactive systems) for mariners.
I want to see at least a starting point that reveals:
1. What factors should designers consider when designing for mariners?
2. What observations should be made on the vessels (any clue how?), on the personnel, etc.?
Any thoughtful and open idea is welcome...
Pervasive interfaces/ubiquitous computing -> context awareness -> emotion-aware systems -> social phenomena -> etc.; research on the latest advancements should find its way into a new, meaningful topic.