

Project 3: UC AR-Verse

Anna Chambers and Keerthi Sekar

Description

The goal of this project was to design an Augmented Reality experience for visitors to the University of Cincinnati's campus. This AR experience should present digital content that highlights research, learning and student activities taking place within UC buildings or on UC's campus.

Demo Video

C-Goals

Research

To understand how UC faculty and students would create content for this AR experience, we sampled the real world to see how this information is shared today, interviewed current UC students, and researched what content would be added to the AR-verse experience (current student organizations and research labs).

Real World - UC Fliers and Bulletin Boards

[Photos of fliers and bulletin boards around campus: DAAP, Langsam, Nippert, MainStreet, and Lindner]

Real World - UC Websites

[Screenshots of UC organization and lab websites: Solar Car, CSA, AMA, Motorsports, CubeCats, Real Estate, Chemistry, IDSA, and Economics]

Analysis of Real-World Examples

Themes:

  1. Upcoming events are very prominent

  2. Mission statement/about us on the home page

  3. Categories of different types of information

  4. Heavy use of QR codes

  5. News updates

 

Codes:

  1. Events – Blue border

  2. About – Green border

  3. QR Code – Red border

  4. Events and QR Code – Orange border

  5. Membership – Yellow border

  6. News – Purple border

  7. News and About – Black border

Summarized Results:

  1. There are multiple student clubs/research groups that coexist in the same space/building

    • Prioritize what to show when

  2. Each group has a lot of information they need to share

    • Need to show information specific to first-time viewers.

  3. Lots of fliers with quick information/QR codes.

    • Need a way to show a small amount of information at a glance and more when the user indicates that they are interested

  4. Repetition of fliers

    • Need to limit how many fliers an organization can post at a time and where (see the sketch after this list).
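The last two findings translate fairly directly into a content model. Below is a minimal TypeScript sketch, with all names and limits assumed rather than taken from our prototype, of a virtual flier that shows a small amount of information at a glance with detail on request, plus a check that caps how many fliers an organization can post at one location.

```typescript
// Hypothetical sketch: a virtual flier with "glance" and "detail" content,
// plus a per-organization posting cap per location. Names and the cap value
// are assumptions, not part of the actual prototype.

interface VirtualFlier {
  id: string;
  orgId: string;
  location: string;       // e.g. "Baldwin Hall" or "DAAP"
  glanceText: string;     // small amount of information shown at a glance
  detailText: string;     // revealed only when the user indicates interest
  qrCodeUrl?: string;     // optional link, mirroring QR codes on paper fliers
}

const MAX_FLIERS_PER_ORG_PER_LOCATION = 2; // assumed limit

function canPostFlier(existing: VirtualFlier[], candidate: VirtualFlier): boolean {
  const postedHere = existing.filter(
    (f) => f.orgId === candidate.orgId && f.location === candidate.location
  ).length;
  return postedHere < MAX_FLIERS_PER_ORG_PER_LOCATION;
}
```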

Interviews

Questions:

  1. Before you joined UC, what did you know about how to get involved on campus?

  2. When you took a tour of UC, what resources were you given about research and student clubs?

    1. Did you use these resources when you joined UC?

  3. Did you gain any information from the student org fair? If so, did you join any clubs after attending the fair?

  4. What is your favorite aspect of student life?

  5. When you were a prospective student, what was one resource you wished you had when researching UC?

Summarized Results:

  1. Before joining UC, interviewees were unaware of activities on campus (they relied on freshman orientation)

  2. Tours focused mainly on buildings and academic programs; students had to ask for specific information about the work done at UC

  3. Org Fair was helpful to see what types of clubs to get involved in - CampusLink was better for following up on where to go and when clubs meet

  4. Campus Vibes/student environment, Football games, student culture

  5. Learn more about student jobs on campus, professors doing research (open student positions)

Content Creator Portal

Purpose: Student organizations and professors with research labs can create AR experiences to deploy across campus for others to see.

Techniques used:

  • Narrative Sketching

  • Storyboarding

  • Wireframing with Figma

The accompanying demo shows the UI prototyped in Figma.


Storyboard of Creator Portal

The storyboard depicted above shows how a user can deploy their organization's AR experiences to the UC AR-verse.


Sketch of Digital Flier Creation

Within the AR View Editor panel, organizations can create virtual fliers and place them throughout the UC AR-verse. Our research into existing fliers and bulletin boards motivated this functionality.
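To make the deploy step concrete, here is a small TypeScript sketch, under assumed names, of the kind of record the AR View Editor might produce and what publishing it to the AR-verse could look like.

```typescript
// Hypothetical sketch of the editor's output: content authored by an
// organization, anchored to a campus building, and flipped from draft to
// published when the creator deploys it. All names are assumptions.

interface Deployment {
  orgName: string;
  contentTitle: string;
  building: string;                 // where the content is anchored, e.g. "Baldwin Hall"
  status: "draft" | "published";
}

function deploy(d: Deployment): Deployment {
  // Publishing makes the content visible to anyone viewing the UC AR-verse
  // at that location; posting limits would be enforced here.
  return { ...d, status: "published" };
}

// Example: an organization pushes a draft flier live in front of its building.
const liveFlier = deploy({
  orgName: "UC Solar Car",
  contentTitle: "Meet the team",
  building: "Baldwin Hall",
  status: "draft",
});
```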

B-Goals

Mobile Device

We assume the user has access to an AR-compatible mobile phone and will interact primarily through touch. When the user holds the phone up with the AR application open, it displays our content on top of the world captured by the built-in camera and onboard sensors.
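Given that assumption, one simple way to decide which content the phone should render is to compare the device's GPS fix against each piece of published content. The TypeScript sketch below is only an illustration of that idea; the data model and the 75 m range are assumptions.

```typescript
// Minimal sketch: filter published AR placements down to those near the phone,
// using the device's GPS position. Data model and range are assumptions.

interface GeoPoint { lat: number; lon: number; }

interface Placement {
  id: string;
  title: string;
  position: GeoPoint;   // where the content is anchored on campus
}

// Approximate ground distance in meters between two GPS points
// (equirectangular approximation; accurate enough at campus scale).
function distanceMeters(a: GeoPoint, b: GeoPoint): number {
  const R = 6371000; // Earth radius in meters
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const x = toRad(b.lon - a.lon) * Math.cos(toRad((a.lat + b.lat) / 2));
  const y = toRad(b.lat - a.lat);
  return Math.sqrt(x * x + y * y) * R;
}

// Only content within range (assumed ~75 m) is handed to the AR renderer.
function visiblePlacements(device: GeoPoint, all: Placement[], rangeM = 75): Placement[] {
  return all.filter((p) => distanceMeters(device, p.position) <= rangeM);
}
```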

Finding AR Content

10 min x 5 min Design Challenge

[Design challenge sketches for finding AR content on the mobile device]

Narrative Sketches

[Narrative sketches of finding AR content on the mobile device]

Displaying AR Content


To show how we designed the display of content for the user, we used three main examples, demonstrated through hybrid sketches (a sketch of how these content types might be modeled follows the three example sets).

 

Example Set 1: Solar Car

  • 3D model of car

  • Text descriptions with arrows pointing to parts of the car

  • Location: In front of Baldwin Hall

 

Example Set 2: DAAP Fashion Show

  • Flip book

  • Images: Sketches of designs

  • Visualization: Computer mockup of designs

  • Video: Fashion show

  • Location: fashion classroom in DAAP

 

Example Set 3: Chemistry Research

  • Interactive game

  • Beakers of chemicals that users combine to see chemical reactions

  • Text descriptions of chemical compounds

  • QR Code linking to research website
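One way to think about these three examples is as variants of a single content type that the viewer app can render. The TypeScript sketch below shows that idea as a tagged union; the field names are assumptions, not part of the prototype.

```typescript
// Hedged sketch: the three example sets represented as a tagged union so one
// AR viewer can handle any of them. Field names are assumptions.

type ARContent =
  | { kind: "annotatedModel";                       // Example Set 1: Solar Car
      modelUrl: string;
      annotations: { label: string; part: string }[] }
  | { kind: "flipBook";                             // Example Set 2: DAAP Fashion Show
      pages: { imageUrl: string; caption?: string }[];
      videoUrl?: string }
  | { kind: "interactiveGame";                      // Example Set 3: Chemistry Research
      gameId: string;
      description: string;
      qrLinkUrl?: string };

function describe(content: ARContent): string {
  switch (content.kind) {
    case "annotatedModel":
      return `3D model with ${content.annotations.length} labeled parts`;
    case "flipBook":
      return `Flip book with ${content.pages.length} pages`;
    case "interactiveGame":
      return `Interactive game: ${content.description}`;
  }
}
```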

Solar Car Example Set

[Hybrid sketches of the Solar Car AR content, including the tap interaction sequence]

DAAP Fashion Show Example Set

[Hybrid sketches and content mockups for the DAAP Fashion Show AR experience]

Chemistry Research Example Set

[Hybrid sketches of the Chemistry Research AR experience]

Interacting with AR Content

[Sketch: Zooming interaction]

[Sketch: Tap interaction]

Narrative Sketch

1. User holds the mobile device up in front of Baldwin Hall.

2. User uses a zoom gesture to enlarge the solar car.

3. The solar car stays at the increased size.

4. User taps one of the stars at the wheel.

5. Information about the wheels appears.

6. User walks to the right of Baldwin Hall and sees a different angle of the solar car.
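The interaction state behind this narrative is small: a pinch rescales the model and the scale persists, and tapping a star reveals information about that part. The TypeScript sketch below illustrates that state, with hypothetical names and assumed scale limits.

```typescript
// Sketch of the interaction state in the narrative above (names and limits assumed):
// pinch rescales the model, the new scale persists, and tapping a star-style
// hotspot reveals its description.

interface Hotspot { id: string; label: string; description: string; }

class ModelInteraction {
  private scale = 1.0;
  private openHotspot: Hotspot | null = null;

  constructor(private hotspots: Hotspot[]) {}

  // Called while the user pinches; factor > 1 zooms in, factor < 1 zooms out.
  onPinch(factor: number): void {
    this.scale = Math.min(4, Math.max(0.5, this.scale * factor)); // clamp 0.5x–4x
  }

  // Called when the user taps a star anchored to a part of the model.
  onTapHotspot(id: string): string | undefined {
    this.openHotspot = this.hotspots.find((h) => h.id === id) ?? null;
    return this.openHotspot?.description; // shown in the info panel
  }

  currentScale(): number {
    return this.scale; // persists between gestures (step 3 above)
  }
}
```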

A-Goals

ARvision-2024

The user views the same content through the latest-and-greatest AR glasses invented in 2024, the ARvision-2024. The ARvision-2024 is assumed to be able to display graphics within the frame of its lenses, overlaid on top of real-world objects.

Finding AR Content

10 min x 5 min Design Challenge

[Design challenge sketches for finding AR content with the ARvision-2024]

Narrative Sketch

1. User waves within the view of the glasses to bring up recommended locations.

2. A stack of cards containing information about content around campus appears. Color indicates distance: the brightest card is highlighted, and cards get darker toward the back.

3. User swipes left to dismiss the card and see another option.

4. A new location is highlighted, and a new card is added to the back of the stack.

5. User swipes right to pick the location.

6. Arrows appear on the ground, leading the user to the building with the selected content.
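The card stack behaves like a simple queue: swiping left cycles to the next recommendation, swiping right selects the front card and triggers the guidance arrows. A minimal TypeScript sketch of that behavior, with assumed names, is below.

```typescript
// Sketch of the card-stack behavior in the narrative above (names assumed).

interface LocationCard { building: string; summary: string; }

class RecommendationStack {
  constructor(private cards: LocationCard[]) {}

  // The front-most (brightest) card currently highlighted.
  current(): LocationCard | undefined {
    return this.cards[0];
  }

  // Swipe left: send the front card to the back and surface the next option.
  swipeLeft(): LocationCard | undefined {
    const front = this.cards.shift();
    if (front) this.cards.push(front);
    return this.current();
  }

  // Swipe right: pick the front card; the app would then draw arrows to it.
  swipeRight(): LocationCard | undefined {
    return this.cards.shift();
  }
}
```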

Displaying AR Content

Displaying Content for ARvision-2024

  • Similar display of content to the mobile device sketches and design challenges discussed above

  • The interactions would be more gesture-based (a sketch mapping touch actions to their gesture equivalents follows this list)

    • Activate content by waving

    • Gestures replace on-screen buttons

  • Allows for more "hands-free" interaction

    • Zoom with two hands
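Below is a small TypeScript sketch of that equivalence: the same set of actions, with a touch input on the phone and an assumed gesture counterpart on the ARvision-2024. The specific gestures listed are illustrative assumptions, not final design decisions.

```typescript
// Illustrative mapping of actions to phone touch input vs. ARvision-2024
// gestures. The specific gestures are assumptions drawn from our sketches.

type Action = "activateContent" | "selectItem" | "zoom" | "flipPage";

const touchInput: Record<Action, string> = {
  activateContent: "open the app and point the camera",
  selectItem: "tap",
  zoom: "pinch with two fingers",
  flipPage: "swipe",
};

const glassesGesture: Record<Action, string> = {
  activateContent: "wave a hand in view of the glasses",
  selectItem: "point and tap in the air",
  zoom: "pull two hands apart or push them together",
  flipPage: "flick with one hand",
};
```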

Interacting with AR Content

Two Handed Gestures: Zooming in & out

[Sketches of the two-handed zoom-in and zoom-out gestures]

Flexible Flipping

Note: Flipping is done with one hand so that the other hand can move the carousel within the available space (similar to a drag).

[Sketches of the one-handed flipping gesture]

Narrative Sketch:

[Narrative sketch of interacting with AR content through the ARvision-2024]