Identifying User Experience Aspects for Voice User Interfaces with Intensive Users

Kristina Kölln (1), Jana Deutschländer (2), Andreas M. Klein (3), Maria Rauschenberger (1) and Dominique Winter (4)

(1) Faculty of Technology, University of Applied Sciences Emden/Leer, Emden, Germany
(2) Berliner Hochschule für Technik, Berlin, Germany
(3) Department of Computer Languages and Systems, University of Seville, Seville, Spain
(4) University of Siegen, Siegen, Germany

ORCID: Kölln 0000-0002-8625-4903; Deutschländer 0000-0003-3851-4384; Klein 0000-0003-3161-1202; Rauschenberger 0000-0001-5722-576X; Winter 0000-0003-2697-7437

Keywords: Voice User Interface, VUI, User Experience, UX, Voice Assistant, HCI, User-centered.

Abstract: Voice User Interfaces (VUIs) are becoming increasingly available, while users raise concerns about, e.g., privacy issues. User Experience (UX) helps in the design and evaluation of VUIs with a focus on the user. Knowledge of the relevant UX aspects for VUIs is needed to understand the user's point of view when developing such systems. Known UX aspects are derived, e.g., from graphical user interfaces or expert-driven research. The user's opinion on UX aspects for VUIs, however, has thus far been missing. Hence, we conducted a qualitative and quantitative user study to determine which aspects users take into account when evaluating VUIs. We generated a list of 32 UX aspects that intensive users consider for VUIs. These overlap with, but are not limited to, aspects from established literature. For example, while Efficiency and Effectivity are already well known, Simplicity and Politeness are inherent to known VUI UX aspects but are not necessarily focused on. Furthermore, Independency and Context-sensitivity are new UX aspects for VUIs.

DOI: 10.5220/0011383300003318. In Proceedings of the 18th International Conference on Web Information Systems and Technologies (WEBIST 2022), pages 385-393. ISBN: 978-989-758-613-2; ISSN: 2184-3252. Copyright (c) 2022 by SCITEPRESS - Science and Technology Publications, Lda. All rights reserved.

1 INTRODUCTION

A Voice User Interface (VUI) is any combination of software and device controlled by the user's spoken input. VUIs have become increasingly popular in recent years, and their use is predicted to rise even more in the future (Strategy Analytics, 2021). However, although many people own a VUI (e.g., in their smartphone), they do not necessarily use it. Possible reasons for non-use are diverse, e.g., fear of data misuse and monitoring. Yet, on the other end of the spectrum is a group of intensive users (Klein et al., 2021). These intensive users show an appreciation for the use of VUIs that goes beyond the pure functionality, i.e., user experience aspects of VUIs.

To develop a positive User Experience (UX), the Human-Centered Design (HCD) framework has become widely accepted. HCD is a holistic approach for designing a UX that fits the target group by focusing on the user (ISO 9241-210, 2019). We should know which UX aspects users take into account when evaluating the quality of VUIs, since different UX aspects are important for different users or products (Meiners et al., 2021). For example, some users are concerned about which data is collected and how, while others mention the need for higher accuracy of commands (Rauschenberger, 2021; Klein et al., 2021).

Recent research includes several attempts to define important UX aspects of VUIs using an expert-driven process (Hone and Graham, 2000; Kocaballi et al., 2019; Klein et al., 2020a). To the best of our knowledge, however, there is no user-driven identification of relevant UX aspects for VUIs that is based on up-to-date user data.

In this article, we present the identified UX aspects using a user-centered mixed-methods approach (McKim, 2017; ISO 9241-210, 2019). We chose to concentrate on intensive users because they engage with VUIs in depth and can offer profound insights
into extensive usage scenarios.

This article is structured as follows: Section 2 introduces UX, recent research about the UX of VUIs, and mixed methods. Section 3 explains our methodology by describing the interview and survey process, the participants, and the qualitative content analysis. Section 4 presents and discusses our results. Section 5 finishes with a conclusion and future work.

2 BACKGROUND & RELATED WORK

Current challenges when using VUIs are, e.g., speech intelligibility, correct command execution, data security, and privacy (Klein et al., 2021; Tas et al., 2019; Rauschenberger, 2021). UX assessment that considers specific UX aspects for VUIs is an essential evaluation method for overcoming barriers and skepticism as well as meeting users' needs. In the following, we briefly introduce UX, how to identify UX aspects for VUIs, and VUI assessment approaches and methods.

UX is a holistic concept that considers emotion, cognition, and physical action before, during, and after using a product (ISO 9241-210, 2019). UX has a set of distinct quality criteria: pragmatic, i.e., classical usability criteria such as efficiency, and hedonic, i.e., non-goal criteria such as stimulation (Preece et al., 2002). These UX quality criteria, also called UX aspects, can be identified and evaluated, e.g., by conducting empirical studies. Focusing on relevant UX aspects enables efficient product development and evaluation, e.g., by using the most suitable questionnaires (Winter et al., 2017). Still, there is no consensus on UX measurement specifically for VUIs (Seaborn and Urakami, 2021).

Various methods are available for VUI evaluation, but they do not necessarily focus on UX. One study analyzed six questionnaires that are commonly applied for VUI evaluation and assessed their suitability regarding various UX dimensions (Kocaballi et al., 2019). Its authors recommend either combining questionnaires to cover UX more comprehensively or measuring a distinct UX dimension in detail. Another VUI evaluation method is the application of heuristics, which are guidelines for design and evaluation. They mostly focus on usability and overlook certain UX aspects (Wei and Landay, 2018; Langevin et al., 2021).

Another option to measure different UX aspects for VUIs is the modular questionnaire concept UEQ+ (Schrepp and Thomaschewski, 2019b). Because of its flexible approach, researchers could, for example, utilize the three voice quality scales mixed with, say, three of the 17 other UEQ+ scales. Thereby, researchers create a questionnaire related to their research question for product-specific UX aspect evaluation (Klein et al., 2020b). Examples of other UEQ+ scales are Attractiveness, Novelty, and Efficiency. The voice quality scales were constructed with consideration of human-computer interaction (HCI) and the VUI design process (Klein et al., 2020a). User, system, and context all influence HCI significantly (Hassenzahl and Tractinsky, 2006). Improving the VUI design process requires a deep understanding of context, user, and application to define relevant evaluation criteria (Cohen et al., 2004), but that definition targeted only usability instead of the holistic UX concept.

In recent studies, mixed-methods approaches have become more popular (McKim, 2017), as they provide certain advantages. For example, mixed methods can be applied in single-questionnaire experiments if the questionnaire combines standardized and open questions (Biermann et al., 2019). Another example is a comprehensive study design (Iniesto et al., 2021), where the combination of standardized questionnaires and semi-structured interviews allows researchers to cover broader aspects and gain in-depth information at the same time. Our mixed-methods approach aims to identify the missing UX aspects that users take into account when evaluating VUIs.

3 METHODOLOGY

Our target group comprises intensive VUI users who use VUIs regularly, i.e., from daily to several times a week, in a private or professional environment (Klein et al., 2021). They have at least one year of usage experience and use VUIs in various scenarios. Hence, intensive users have already dealt with VUIs more deeply and can provide comprehensive insights into their use. We aim to identify the target group's UX aspects when using VUIs. For this purpose, we formulated the following research questions (RQ):

RQ1: What is intensive users' VUI frequency of use?
RQ2: What are intensive users' reasons for VUI use?
RQ3: What are intensive users' UX aspects for VUIs?

First, we explore the frequency of use (RQ1) of intensive users by considering shorter time intervals than in previous literature. Next, we ask users about their reasons for use (RQ2) to reveal the intensive users' usage patterns and scenarios. We then ask the intensive users to share their positive and negative VUI
Table 1: Participants' durations, devices, and applications.

Participant | Duration of use | Devices | Application
P1 | 3 years | Alexa | Accessibility (visual), smart home control
P2 | >5 years | Alexa, Siri | Accessibility (visual), librarianship
P3 | >10 years | Alexa*, Siri | Accessibility (visual), smartphone control
P4 | 3 years | Alexa, Siri | Accessibility (visual), search queries
P5 | >10 years | Dragon, Siri | Accessibility (visual), working tool, smartphone control
P6 | >10 years | Alexa, Dragon, Siri | Accessibility (motor), working tool, smartphone control
P7 | >5 years | Alexa, Siri | VUI development
P8 | >5 years | Alexa, in-car entertainment | Smart home control
P9 | 1 year | Google Assistant | Timer, search queries
P10 | >5 years | Alexa, smartphone** | Radio substitute, (fun) search queries
(* stopped using Alexa, ** unknown smartphone brand)

experiences as well as suggestions for VUI improvements in order to determine noteworthy UX aspects for the target group (RQ3).

To answer the RQs, we follow a mixed-methods approach: we conduct a qualitative study with semi-structured interviews followed by a quantitative study with an online questionnaire. The questionnaire is designed to verify the results of the interviews and to compare them with a broader sample of participants.

3.1 Qualitative User Study with Interviews

We conducted ten semi-structured interviews with a heterogeneous group of intensive users in the qualitative study. We then analyzed the collected data with a qualitative content analysis (Mayring, 1994).

3.1.1 Procedure

From April to May 2021, we conducted ten interviews applying the semi-structured expert-interview methodology (Bogner et al., 2014). In order to answer our three RQs, we constructed the interview guidelines to consist of questions about the participants' positive and negative expectations and experiences regarding VUIs as well as their contexts of use. We ran two pretests to ensure that the guidelines were useful and guiding. Afterwards, we translated the interview guidelines into English to include international participants and made an additional version to interview a user whose children also use the VUI. The interview guidelines are available in the original German and in English translation in the research protocol (Kölln et al., 2022).

We conducted the interviews during online video sessions using Microsoft Teams or, in one case, a phone call. The interviews were recorded, subsequently transcribed with a simple scientific transcript (Dresing and Pehl, 2018), and anonymized. Two interviews had to be documented with a memory log because the recording failed. All transcriptions and memory logs are available in the research protocol in their original language (Kölln et al., 2022). Afterwards, the collected data was analyzed with the qualitative content analysis (Mayring, 1994).

3.1.2 Interview Participants

All interview participants (see Table 1) meet the requirements for our target group: they all use VUIs daily in a private or professional context and have at least one year of usage experience (see Table 1, column Duration of use). We included P7, who works with VUIs daily in VUI software development but does not actually consider themself a regular user. P7 responded to a social media call on the platform LinkedIn. The other participants, who were already known to use VUIs more intensively, were acquired from the personal networks of the authors.

The interviewees are heterogeneous, e.g., in their contexts of use and characteristics. Four out of ten use VUIs in a professional environment, and the other six in private environments. Six out of ten participants have an impairment, while the other four have none. The participants are 27 to 69 years old. P10 also uses the VUI with their children, who are four and six years old. Two participants are female, eight are male.

Most participants use Alexa (8/10), followed by Siri (6/10). The speech recognition software Dragon is used as a working tool (2/10). Least used are an in-car entertainment system, Google Assistant, and an unspecified smartphone VUI (each 1/10) (see Table 1, column Devices).

The main usage scenarios of the participants are: making their daily lives easier as users with impairments (6/10), smartphone control and search queries (each 3/10), and smart home control (2/10). In addition, there are some specific main applications, such as librarianship, timer, radio substitute, or VUI development (each 1/10) (see Table 1, column Application).
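The per-device counts above (8/10 Alexa, 6/10 Siri, 2/10 Dragon) follow directly from tallying the Devices column of Table 1. A minimal sketch of that tally, for illustration only (the authors' actual analysis was done with MAXQDA and Excel, not code):

```python
from collections import Counter

# Devices per participant, transcribed from Table 1. P3's Alexa is
# included although P3 stopped using it, matching the 8/10 in the text.
devices = {
    "P1": ["Alexa"],
    "P2": ["Alexa", "Siri"],
    "P3": ["Alexa", "Siri"],
    "P4": ["Alexa", "Siri"],
    "P5": ["Dragon", "Siri"],
    "P6": ["Alexa", "Dragon", "Siri"],
    "P7": ["Alexa", "Siri"],
    "P8": ["Alexa", "in-car entertainment"],
    "P9": ["Google Assistant"],
    "P10": ["Alexa", "smartphone"],
}

# Flatten the per-participant lists and count each device mention.
counts = Counter(d for ds in devices.values() for d in ds)
print(counts.most_common())  # Alexa 8, Siri 6, Dragon 2, the rest 1 each
```

Because each participant can appear with several devices, the counts sum to more than ten; the fractions reported in the text are therefore per-participant, not per-mention.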
3.1.3 Qualitative Content Analysis

We applied the summarizing content analysis, one form of qualitative content analysis that is well known and most often used in German-speaking countries (Mayring, 1994).

In the summarizing content analysis process (see Fig. 1), we analyzed the transcripts using a technique called "coding": first, we defined the transcripts as the units of analysis. We then highlighted the information-bearing parts and summarized their key messages on an abstraction level that fit our purpose. These key messages are called "codes." We then removed all selections that were not related to our research questions as a first reduction, followed by clustering codes with similar key messages. The remaining codes were then clustered into a category system that is the basis for our list of UX aspects. We finally rechecked all interviews with the developed code system in a second round of coding to ensure all interviews were coded with the same procedure. Because only minor changes were made to the code system in the second round, an additional control round was unnecessary.

Figure 1: Process of conducting a summarizing content analysis, based on (Mayring, 1994).

Following this method, two authors alternately coded using the software tool MAXQDA Standard 2020 (Release 20.4.1) in the following way: author A codes P1, then author B codes P2, and so on until P10 is reached. In the second round, the authors swapped the participants' transcriptions, so author A coded P2, then author B coded P1, and so on.

3.2 Quantitative User Study with a Survey

We conducted an online survey with intensive users to obtain more comprehensive results. We obtained an additional amount of qualitative data from the questionnaires, which was also analyzed with an adjusted qualitative content analysis.

3.2.1 Procedure

We conducted a survey with German-, English-, and Spanish-speaking participants using Google Forms from April to June 2021.

We developed our questionnaire as follows (see Fig. 2): first, we developed the content based on the research questions and the findings of the interviews. The questionnaire combines quantitative and qualitative questions. In our first pilot test, we presented our survey draft to four UX experts. We made changes, e.g., to the informative texts or the order of questions. Then we did three pretests consecutively, each with the reworked version from the preceding pretest. After each pretest, we mostly just made changes to the wording in order to help the participants better understand the questions. Our final questionnaire contains 19 questions about the experiences and expectations of the VUI users regarding VUIs. The complete questionnaire can be found in the research protocol in English, German, and Spanish versions (Kölln et al., 2022).

Figure 2: The development process of the questionnaire.

The survey was then shared on the social media platforms LinkedIn, Facebook, and Twitter as well as through the personal networks of the authors. Contacts then also shared the survey with their contacts and on their social media channels. We repeated the call to participate a few times to gain additional participants.

For the qualitative content analysis, we partly adjusted the summarizing content analysis to our research needs: we did not build a new code system but used the code system that we had developed for the
interviews. Hence, we were able to match the results of the survey with our findings from the interviews.

3.2.2 Survey Participants

We collected 76 participant responses and excluded 24 for the following reasons: one was a duplicate, five records had fewer than three questions answered, and 18 participants did not meet the target group requirement of a high frequency of use. For the analysis, we took 52 participants into account. We found that 69% (36/52) of the survey participants were male, 29% (15/52) were female, and 2% (1/52) did not answer the question. The average age of the survey participants is 43 (SD 13).

The participants use diverse software and devices (see Fig. 3). Of all named devices, Alexa is the most commonly used (37/52), followed by Siri (29/52). The third most used device is Google Assistant (20/52). This keyword combines all mentions of Google VUIs because the participants were not always clear about which Google device they used (some wrote "Google," "Google Voice," or even "Google Home, or Google Assistant?"). Voice-controlled navigation or entertainment systems of cars, disregarding the manufacturer of the car, were summarized as in-car entertainment (7/52). Least frequently named were the speech recognition software Dragon (3/52) as well as Cortana (3/52), followed by a few other VUIs that were each named by at most two participants (13/52).

Figure 3: Devices used by the target group (N=52).

While we identified Alexa and Siri as the most commonly used VUIs among our participants, a representative German study (N=3184) found that Google Assistant (12%) and Alexa (9%) are the most commonly used (Tas et al., 2019). This may differ from our participants, but since we did not look for brand-specific evaluations, we do not expect significant discrepancies in our results.

4 RESULTS & DISCUSSION

We processed our collected data according to the description in the previous sections with the content analysis (Mayring, 1994). Hence, the qualitative data from both studies were analyzed using the data analysis software MAXQDA Standard 2020 (Release 20.4.1), both on the operating system Microsoft Windows 10. Quantitative data were analyzed in Microsoft Excel (Version 2204).

We consider 62 participants across the qualitative and quantitative studies: 71% male (n=44), 27% female (n=17), and 2% (n=1) who did not answer. Although our study is not representative, its distribution is in line with current literature (77% (Pyae and Joelsson, 2018), 72% (Sciuto et al., 2018), and 79% (Klein et al., 2021) male participants). As in other studies, the gender distribution among our participants is biased towards male VUI users.

We first report our interview results (n=10) and then the survey results (n=52) in each section. Quotations are translated from the original language German, English, or Spanish, and the original quotes are available in the research protocol (Kölln et al., 2022).

4.1 What Is Intensive Users' VUI Frequency of Use?

Even though all interview participants (n=10) meet our definition of intensive users, considerable differences in the frequency of use were reported by the participants. Therefore, we asked the survey participants (n=52) to be more specific about their frequency of use. That is why we subdivided the answer options for daily use into three options: less than an hour a day, a few hours a day, and more than five hours a day. Of the survey participants who use their VUI daily, most use it less than an hour a day (29/52). This is followed by a few hours a day (9/52) and more than five hours a day (4/52). Fewer survey participants use their VUI a few times a week (8/52), and some VUI developers use it not regularly (2/52) (see Fig. 4).

Another study, which also examined the frequency of use, found that 76.6% of the identified intensive users used VUIs on a daily basis (Klein et al., 2021), which is in line with our results. However, those intensive users were only distinguished between approximately once a day and several times a day. A population-representative study conducted in Germany in 2019 revealed 11% daily VUI users and 19% using VUIs several times a week, resulting in 30% intensive users (SPLENDID RESEARCH GmbH, 2019). Our findings are in line with these results and
show the survey participants' distribution of VUI frequency of use in more detail.

Figure 4: Frequency of use of the survey participants (N=52).

The frequency-of-use distribution could have its origin in the different usage scenarios of the VUIs. One typical usage scenario of Alexa is, e.g., the timer function, which only takes a few seconds to execute. However, if a user uses Dragon to dictate their emails, they sometimes use the device the whole day as a tool at work or even in their private time. Therefore, the frequency of use is connected not only to the VUI system but also to the context of use. For a more precise analysis, the context needs to be considered in the design of VUIs.

4.2 What Are Intensive Users' Reasons for VUI Use?

The interview participants named various reasons for their VUI usage. P2 describes how, as a blind person, using VUIs gives them the opportunity to do things for which they have no easy alternative solution. For example, they use Alexa to order audiobooks from the library since the library website does not have an accessible checkout. Due to a motor impairment, P6 uses Dragon as a tool at work to write most of their email correspondence. P8 uses a VUI out of convenience for the navigation system in their car and to control their smart home. P10 describes how their children use Alexa for fun.

Since we identified multiple possible reasons for use, we formed reasons-for-use categories to which the survey participants could assign themselves (see Fig. 5). It was possible to select multiple categories and to give custom answers.

Most of the survey participants use VUIs for comfort (48/52). This was followed by fun (23/52) and as a tool at work (12/52). A few use VUIs because they do scientific research on VUIs (3/52) or because of some kind of impairment, e.g., motor (3/52) or visual (1/52). This is mostly in line with the answers of the interview participants. Almost all of them mention various scenarios in which the VUI grants them comfort. Participants with impairments especially mention several positive experiences with their VUIs: P1 explains that not having to stand up from the sofa to turn the lights on or off is especially comfortable for them, since they have low vision. P2 says, "Of course, I don't need it in my life, but as a blind person, I can benefit greatly from smartphones and such assistance systems." Similarly, P3 says that "as a blind person, typing on an iPhone is just awkward." Fun is also mentioned several times by the interview participants: the children of P10 mostly use Alexa to ask fun questions, and P2 likes to tease Alexa with cheeky questions and listen to her answers. However, we have more interview than survey participants who use VUIs because of some kind of impairment. In addition, we have no interview participants who do scientific research on VUIs, unlike some of the survey participants.

Figure 5: Reasons for using VUIs of the target group (N=52).

Existing literature has investigated the context of use, but not the motivation for the usage scenarios (Klein et al., 2021). However, the motivation for use also has a major influence on what is important to the users. Hassenzahl (2008) calls these the do- and be-goals of the users. The do-goals describe what the user wants to do, while the be-goals describe how the user wants to feel. The results of our study provide insights into these do- and be-goals of the target group; e.g., the participants want to use VUIs for smart home control (a do-goal), or to stay comfortably on the sofa (a be-goal).

We have found that users with disabilities perceive great potential for VUIs to assist them in their daily lives. Currently, participants explore VUI features with a focus on comfort and fun. A few interview participants stated that they would like to use VUIs for even more practical applications once VUIs or linked technologies offer a more versatile skill set; e.g., using VUIs in autonomous driving to give directions would be ideal for P2.
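The counting rule used for Table 2 (Section 4.3: each UX aspect is counted at most once per participant, no matter how often that participant mentions it) can be sketched as follows. The coded mentions here are hypothetical examples, since the real coding was done in MAXQDA:

```python
from collections import Counter

# Hypothetical coded mentions as (participant, aspect) pairs;
# a pair may occur several times if a participant repeats an aspect.
mentions = [
    ("P1", "Comprehension"), ("P1", "Comprehension"), ("P1", "Efficiency"),
    ("P2", "Comprehension"), ("P2", "Privacy"),
    ("P3", "Privacy"), ("P3", "Privacy"),
]

# Deduplicate so each aspect counts at most once per participant,
# then count how many distinct participants named each aspect.
unique = {(p, a) for p, a in mentions}
counts = Counter(a for _, a in unique)
print(counts)  # Comprehension: 2, Privacy: 2, Efficiency: 1
```

This mirrors the design choice stated in Section 4.3: repeated mentions by one participant are not treated as a measure of importance, so only the number of distinct participants per aspect enters the table.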
Table 2: The identified aspects the target group named for evaluating the UX of VUIs.

Index | Aspect | Interpretation of the participants | Int. (N=10)* | Sur. (N=52)*
1 | Comprehension | The VUI understands the user correctly, even if they do not speak very clearly. | 10 | 37
2 | Error-free | Both the result and the operation do not give errors, wrong answers, or misunderstandings. | 6 | 34
3 | Aesthetic | The hardware of the VUI is supposed to be minimalistic. Visual feedback about the status (listening, processing, disabled, etc.) is positively received as long as it is discreet. | 3 | 35
4 | Range of functions | The VUI has as many functions and application possibilities as possible. | 4 | 29
5 | Simplicity | The operation is easy to perform and contains as few steps as possible. | 8 | 23
6 | Effectivity | The user reaches their goal. | 9 | 17
7 | Support of the User | The VUI helps users to achieve their goals. | 3 | 22
8 | Humanity | The user has the feeling of talking to a human being. They can conduct a normal dialogue, the VUI persona responds with humor and empathy, and the voice sounds natural. | 4 | 21
9 | Personal fulfillment | The VUI allows the user to live out their personality. They can speak in their dialect and do not have to disguise themselves in order to be better understood. | 3 | 21
10 | Context-sensitivity | The VUI knows its user, understands the current situation, and can remember the context of the conversation. | 4 | 20
11 | Efficiency | The user reaches their goal without detours. | 7 | 17
12 | Privacy | The VUI should not permanently listen in, interrupt, or even record private conversations. | 6 | 16
13 | Data Security | If personal information must be provided, it can be trusted that it will not be shared and will be handled ethically. | 7 | 15
14 | Time-saving | The user does not need to fetch a device or press a button and can immediately start the usage process. They receive the results immediately after the request. | 6 | 14
15 | Politeness | The VUI does not insult the user; it allows them to finish their sentences and does not activate without being asked. | 0 | 19
16 | Linking with third-party products | Many third-party products should be compatible. The VUI can easily be connected with them, and there are no errors in communication. | 8 | 11
17 | Safety | The VUI gives the user physical and privacy security. For example, security is given by enabling operation in the car without removing the hands from the steering wheel, or by protecting the data from external access. | 2 | 16
18 | Capability to learn | The VUI can learn new commands, learn the personality of its user, and exercise appropriate reactions. Incorrectly learned commands can be deleted. | 2 | 15
19 | Intuitiveness | The user does not need to learn a complicated vocabulary, but can immediately communicate with the VUI using their everyday language. Setting up and learning how to use the VUI is possible without additional help. | 5 | 12
20 | Practicality | The VUI helps the user with everyday challenges. | 7 | 10
21 | Reliability | The VUI responds only when it is addressed, without false activation. The results are correct and verified. The quality of the interaction should be consistently high. | 0 | 15
22 | Help with errors | If an error occurs, a way to fix it is shown. There is a help function. | 3 | 11
23 | Convenience | The user can use the VUI from any situation without having to make an effort. For example, they can use it from the sofa, bed, or desk. | 7 | 7
24 | Fun | The VUI is fun to use and its humor is appropriate. | 3 | 8
25 | Customizability | The persona of the VUI can be set by the user according to their preferences (gender, language, humor, voice, etc.). | 0 | 8
26 | Flexibility | The VUI can adapt to different users and situations. | 4 | 4
27 | Voice | The voice of the VUI is pleasant and clearly understandable. | 0 | 7
28 | Responsiveness | The VUI responds as soon as it is addressed, but only when it is addressed. | 0 | 6
29 | Independency | The user does not need any assistance in using the VUI. It allows additional independence for users who would have problems operating a GUI (for example, those with visual or motor impairment, dyslexics, and children). | 4 | 2
30 | Innovation | The VUI has new, modern, and unique features. | 2 | 1
31 | Ad-Free | Advertising is not played or can be turned off. | 0 | 2
32 | Longevity | The VUI can be used for a long time, does not break quickly, and does not need to be repeatedly replaced with the latest model. | 0 | 2

The aspects are sorted by the total number of mentions. * Mentioned by participants from either the interviews or the surveys.

4.3 What Are Intensive Users' UX Aspects for VUIs?

In total, we identified 32 UX aspects of intensive users for VUIs (see Table 2). We specify them through the statements of the interview participants, which are available in the research protocol (Kölln et al., 2022). This way, they can be compared with existing scales and terms based on the core statements of the users, without requiring additional associations from preceding research.

In the results, we consider only a single mention of a UX aspect per participant. The number of mentions by a single participant is not necessarily a measure of importance. Emphasis, context, and wording also provide information on priority, but are hardly measurable. For this reason, the evaluation was based on the number of interview (see Table 2, column Int.) and survey participants (see Table 2, column Sur.) who mentioned a UX aspect. Due to the small number of participants, we decided not to set a minimum number of participants who must have named a UX aspect. It is possible that the distribution of numbers could be different with a larger group of participants.

A few of our identified UX aspects had already been defined throughout established literature, e.g., Efficiency and Effectivity (ISO 9241-210, 2019) or Aesthetic (Schrepp and Thomaschewski, 2019a), but not necessarily for VUIs. Other UX aspects are part of other known UX factors, e.g., Simplicity and Politeness, which may be part of the UX factor Likeability
of the Subjective Assessment of Speech System Interfaces (SASSI) (Hone and Graham, 2000), but are not explicitly considered. Additionally, we did identify several new UX aspects for VUIs, e.g., Independency or Context-sensitivity.

Instead of following a theoretical approach to what a UX aspect should mean, we followed a more user-centered approach to identify UX aspects for VUIs. These UX aspects represent what intensive users think about when evaluating the UX of VUIs and can be used, e.g., to choose the best UEQ+ scales to combine with the VUI scales (Klein et al., 2020b).

4.4 Limitations

Our study is limited to a smaller group of participants, which is put into perspective by the mixed-methods approach. Despite being limited to ten interviews, this study provides meaningful results in the qualitative section, confirmed by the quantitative part of the survey. Below a sample size of twenty interviews, data saturation usually appears (Francis et al., 2010).

We also have to consider that our study has a gender bias towards male participants. The uneven gender distribution of the participants arises because, e.g., females use VUIs less than males (SPLENDID RESEARCH GmbH, 2019; Tas et al., 2019). The participants are heterogeneous concerning VUI usage in professional and private locations as well as concerning impairments, so we do not expect relevant selection bias.

Although this study was internationally conducted, most of our participants are from the Western European region, specifically Germany and Spain. The results of the study may be different with more participants from, e.g., the Asian or African regions.

5 CONCLUSION & FUTURE WORK

We explored the usage behavior as well as the expectations and experiences of intensive users of VUIs. This allowed us to make statements about the frequency and reasons of use for the target group according to our user-centered, mixed-methods approach. Additionally, we were able to determine which UX aspects the target group applies to evaluate VUIs.

We have found that many intensive users not only use their VUI almost daily, but often even for several hours a day. Most of the target group use VUIs for comfort and to make their daily lives easier, e.g., in their smart home. Particularly noteworthy is the potential of VUIs in supporting users with disabilities. We created a list of 32 UX aspects for VUIs of the target group. Although some of the terms are already known, we explain the UX aspects from the user's point of view and what they expect from a VUI regarding each UX aspect. Additionally, we were able to identify several new UX aspects for VUIs.

Prioritization of these UX aspects should still be performed in the future. We will use the UX aspects for VUIs in future work to determine which UX measurement method considers which UX aspect. Thereby, VUI designers can choose a UX measurement method that fits their users' needs. For example, a comparison with the UEQ+ scales could show the total set of scales needed to assess the UX of a VUI. This would allow researchers to better adapt their methodology to the UX aspect they wish to evaluate.

REFERENCES

Biermann, M., Schweiger, E., and Jentsch, M. (2019). Talking to Stupid?!? Improving Voice User Interfaces. In Fischer, H. and Hess, S., editors, Mensch und Computer 2019 - Usability Professionals, pages 1-4, Bonn. Gesellschaft für Informatik e.V. und German UPA e.V.

Bogner, A., Littig, B., and Menz, W. (2014). Interviews mit Experten: Eine praxisorientierte Einführung. Springer Fachmedien Wiesbaden, Wiesbaden.

Cohen, M. H., Giangola, J. P., and Balogh, J. (2004). Voice User Interface Design. Addison-Wesley, Boston.

Dresing, T. and Pehl, T. (2018). Praxisbuch Interview, Transkription & Analyse, Audiotranskription. Dr. Dresing und Pehl, Marburg.

Francis, J. J., Johnston, M., Robertson, C., Glidewell, L., Entwistle, V., Eccles, M. P., and Grimshaw, J. M. (2010). What is an adequate sample size? Operationalising data saturation for theory-based interview studies. Psychology & Health, 25(10):1229-1245.

Hassenzahl, M. (2008). User experience (UX): Towards an experiential perspective on product quality. IHM, September 2008:11-15.

Hassenzahl, M. and Tractinsky, N. (2006). User experience - a research agenda. Behaviour & Information Technology, 25(2):91-97.

Hone, K. S. and Graham, R. (2000). Towards a tool for the subjective assessment of speech system interfaces (SASSI). Natural Language Engineering, 6(3&4):287-303.

Iniesto, F., Coughlan, T., and Lister, K. (2021). Implementing an accessible conversational user interface. In Vazquez, S. R., Drake, T., Ahmetovic, D., and Yaneva, V., editors, Proceedings of the 18th International Web for All Conference, pages 1-5, New York, NY, USA. ACM.

ISO 9241-210 (2019). Ergonomics of human-system interaction - Part 210: Human-centred design for interactive systems. Technical report, https://www.iso.org/committee/53372.html.

Klein, A. M., Hinderks, A., Schrepp, M., and Thomaschewski, J. (2020a). Construction of UEQ+ Scales for Voice Quality. In Proceedings of the Conference on Mensch und Computer, MuC '20, pages 1-5, New York, NY, USA. Association for Computing Machinery.

Klein, A. M., Hinderks, A., Schrepp, M., and Thomaschewski, J. (2020b). Measuring User Experience Quality of Voice Assistants. In 2020 15th Iberian Conference on Information Systems and Technologies (CISTI), pages 1-4, Seville, Spain. IEEE.

Klein, A. M., Rauschenberger, M., Thomaschewski, J., and Escalona, M. J. (2021). Comparing Voice Assistant Risks and Potential with Technology-Based Users: A Study from Germany and Spain. Journal of Web Engineering, 7(16):1991-2016.

Kocaballi, A. B., Laranjo, L., and Coiera, E. (2019). Understanding and measuring user experience in conversational interfaces. Interacting with Computers, 31(2):192-207.

Kölln, K., Deutschländer, J., Klein, A. M., Rauschenberger, M., and Winter, D. (2022). Protocol for identifying user experience aspects for voice user interfaces with intensive users.

Langevin, R., Lordon, R. J., Avrahami, T., Cowan, B. R., Hirsch, T., and Hsieh, G. (2021). Heuristic evaluation of conversational agents. In Kitamura, Y., Quigley, A., Isbister, K., Igarashi, T., Bjørn, P., and Drucker, S., editors, Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pages 1-15, New York, NY, USA. ACM.

Mayring, P. (1994). Qualitative content analysis, volume 1994. UVK Univ.-Verl. Konstanz.

McKim, C. A. (2017). The value of mixed methods research: A mixed methods study. Journal of Mixed Methods Research, 11(2):202-222.

Meiners, A.-L., Kollmorgen, J., Schrepp, M., and Thomaschewski, J. (2021). Which UX aspects are important for a software product? Importance ratings of UX aspects for software products for measurement with the UEQ+. In Mensch und Computer 2021, MuC '21, pages 136-139, New York, NY, USA. Association for Computing Machinery.

Preece, J., Rogers, Y., and Sharp, H. (2002). Interaction design: beyond human-computer interaction. J. Wiley & Sons, New York.

Pyae, A. and Joelsson, T. N. (2018). Investigating the usability and user experiences of voice user interface. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, pages 127-131, New York, NY, USA. ACM.

Rauschenberger, M. (2021). Acceptance by Design: Voice Assistants. In 1st AI-DEbate Workshop: workshop establishing An InterDisciplinary pErspective on speech-BAsed TEchnology, 27.09.2021, Magdeburg, Germany. OvGU.

Schrepp, M. and Thomaschewski, J. (2019a). Construction and first validation of extension scales for the User Experience Questionnaire (UEQ).

Schrepp, M. and Thomaschewski, J. (2019b). Design and Validation of a Framework for the Creation of User Experience Questionnaires. International Journal of Interactive Multimedia and Artificial Intelligence, 5(7):88-95.

Sciuto, A., Saini, A., Forlizzi, J., and Hong, J. I. (2018). "Hey Alexa, what's up?": A mixed-methods study of in-home conversational agent usage. In Proceedings of the 2018 Designing Interactive Systems Conference, DIS '18, pages 857-868, New York, NY, USA. Association for Computing Machinery.

Seaborn, K. and Urakami, J. (2021). Measuring voice UX quantitatively. In Kitamura, Y., Quigley, A., Isbister, K., and Igarashi, T., editors, Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, pages 1-8, New York, NY, USA. ACM.

SPLENDID RESEARCH GmbH (2019). Digitale Sprachassistenten.

Strategy Analytics (2021). Absatz von intelligenten Lautsprechern weltweit vom 3. Quartal 2016 bis zum 3. Quartal 2021.

Tas, S., Hildebrandt, C., and Arnold, R. (2019). Voice Assistants in Germany. WIK Wissenschaftliches Institut für Infrastruktur und Kommunikationsdienste GmbH, Bad Honnef, Germany. Nr. 441.

Wei, Z. and Landay, J. A. (2018). Evaluating speech-based smart devices using new usability heuristics. IEEE Pervasive Computing, 17(2):84-96.

Winter, D., Hinderks, A., Schrepp, M., and Thomaschewski, J. (2017). Welche UX Faktoren sind für mein Produkt wichtig? In Hess, S. and Fischer, H., editors, Mensch und Computer - MuC 2017. Gesellschaft für Informatik e.V. und die German UPA e.V.