Experiential Design Task 1

18/5/2025 - 30/5/2025 / Week 4 - Week 6

Jie Xuan/ 0356515

Experiential Design/ Bachelor of Creative Media/Taylor's University

Experiential Design Task 1: Experience Design Project Ideation

TABLE OF CONTENTS

1. Instructions

2. Lecture

3. Task 1: Experience Design Project Ideation

4. Feedback

5. Reflection

INSTRUCTIONS

Task 1 Instructions: 
  • Explore the current, popular trends in the market to gain a better understanding of the technologies and the knowledge required to create content for those technologies. 
  • Conduct research and experiments to find out the features and limitations, which will later allow us to decide which technologies to proceed with in the final project. 
  • Complete all exercises to demonstrate understanding of the development platform fundamentals. 
  • Submit the link to your blog post and make sure all the exercises are updated on your blog.
LECTURE
Week 1/ Introduction to Module
In the first week, Mr. Razif introduced the subject outline and outlined his expectations for the course. We were shown examples of previous student projects, which helped us understand the concept of augmented reality (AR) and gain some ideas for the following tasks. He also explained the various types of AR experiences and the technologies used to develop them, guiding us through the basics of designing an AR concept. Through this session, we developed a solid understanding of the distinctions between AR, VR, and MR, learned to identify AR and MR applications, created our own AR experience, and even built a simple AR app during the lecture.
  • Augmented Reality (AR): Enhances the real, physical world by overlaying digital content.
  • Mixed Reality (MR): Combines the physical and digital worlds, allowing interaction between real and virtual elements.
  • Virtual Reality (VR): A fully immersive digital environment, separate from the physical world.
  • Extended Reality (XR): A broad term encompassing AR, MR, and VR technologies.
Week 2/ Designing Experiences using User Journey Maps
During class, Mr. Razif divided us into four groups, each with approximately seven members. Our task was to select a location and create a user journey map that highlighted the user experience, identified pain points, and proposed solutions, and then to present our findings before the class ended.
My group chose a theme park as our scenario. We decided to use Figma as our design tool and found a well-structured template that helped us effectively organize and present our ideas.

In our journey map, we visualized the user experience within the theme park through the following stages:

Week 3/Introduction to Unity 
Below is the task given for the group activity: 
Imagine a scenario in either of the two places. What would the AR experiences be and what visualization can be useful? What do you want the user to feel?  


This week, we installed Unity on our laptops as the main platform for developing our AR applications. Before class, we were required to complete the MyTimes tutorial, which introduces the fundamental steps for creating a basic AR experience.

The tutorial involved scanning an image marker that would trigger two AR elements: a 3D cube and a video plane. For my project, I used a selfie as the image marker and linked it to an online video. By carefully following each step of the tutorial, I was able to ensure that the AR elements appeared correctly during testing and functioned as expected.

Using the Vuforia Engine:
  1. Registered a Vuforia account.
  2. Selected version 10.14 and downloaded the Vuforia Engine.
  3. Added Vuforia to my Unity project (or upgraded to the latest version if it was already installed).
  4. After the download was complete, I double-clicked the package file and imported it into Unity.

This hands-on exercise helped solidify my understanding of how to integrate Vuforia with Unity and build a simple yet functional AR experience from scratch.
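
To make this setup easier to revisit later, here is a minimal C# sketch of how the triggered content could be toggled from a script. The class and field names are placeholders of my own (not from the tutorial), and it assumes the cube and video plane are assigned in the Inspector, with ShowContent() and HideContent() wired to the image target's "On Target Found" / "On Target Lost" events exposed by Vuforia's default observer event handler.

    using UnityEngine;

    // Hypothetical helper: toggles the AR content driven by the image marker.
    // Wire ShowContent() to the target's On Target Found event and
    // HideContent() to On Target Lost in the Inspector (assumed setup).
    public class MarkerContentToggle : MonoBehaviour
    {
        [SerializeField] private GameObject arCube;      // the 3D cube from the tutorial
        [SerializeField] private GameObject videoPlane;  // the video plane from the tutorial

        public void ShowContent()
        {
            arCube.SetActive(true);
            videoPlane.SetActive(true);
        }

        public void HideContent()
        {
            arCube.SetActive(false);
            videoPlane.SetActive(false);
        }
    }
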
Week 4/ Markerless AR Experience

Exercise 1: User Control & UI
In this session, we advanced our AR development by focusing on screen navigation, building on the previous tutorial. The main goal was to incorporate interactive buttons that let users switch between screens and control the display of AR content. We continued to use the Spiderwoman image target to maintain consistency throughout the project.

1. Scene Initialization:
We began by creating a fresh scene in Unity, opting for the Basic (Built-in) template to ensure a clean and straightforward user interface.

2. UI and Canvas Configuration:
A Canvas was added to the scene to contain all user interface elements. Within the Canvas, we inserted a Panel to serve as the background layer. We then renamed the first button to BTN_HIDE and changed its label to “HIDE”, and did the same for the second button, renaming it to BTN_SHOW with the label “SHOW”. This was done by expanding each button object and editing the child Text (TMP) component from the Inspector panel.

3. Button Interactivity:
We used Unity’s UI Button component to define the functionality of each button. Pressing the Hide button concealed the AR object, while the Show button revealed it again. This allowed users to interactively control what appears on the screen, simulating a simple navigation system.

4. Scripting the Behavior:
We attached a script that uses the SetActive method to control the cube’s visibility. Each button was connected to a specific function within the script, allowing users to toggle the panel and cube based on their input. We assigned the GameObject containing the AR Cube to the button’s On Click() field and then used Unity’s built-in GameObject.SetActive(bool) method. For the “HIDE” button, we called SetActive(false) to make the AR Cube disappear; for the “SHOW” button, we called SetActive(true) to make it reappear.
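
As a reference for this step, below is a minimal sketch of how such a visibility script could look; the class and field names are placeholders of my own, and it assumes the AR Cube is assigned in the Inspector with Hide() and Show() bound to the corresponding buttons' On Click() events.

    using UnityEngine;

    // Hypothetical visibility toggle for the AR cube.
    // Bind Hide() to BTN_HIDE's On Click() and Show() to BTN_SHOW's On Click().
    public class ARVisibilityController : MonoBehaviour
    {
        [SerializeField] private GameObject arCube;  // assigned in the Inspector

        public void Hide()
        {
            arCube.SetActive(false);  // conceals the AR object
        }

        public void Show()
        {
            arCube.SetActive(true);   // reveals the AR object again
        }
    }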

5. Testing and Troubleshooting:
We tested the scene to confirm that the buttons worked correctly and that the AR content continued to respond appropriately to the Roseanne Park image target.

This session offered valuable insight into how user interface elements can enhance interactivity within AR applications. Setting up and linking the buttons to control AR visibility gave me practical experience in integrating UI with AR features in Unity. Initially, aligning the buttons with the C# script posed some challenges, particularly in ensuring responsive behavior. Through testing, I also learned how small adjustments in Canvas properties could influence usability, highlighting the importance of accurate alignment and scaling. Overall, this exercise deepened my understanding of screen navigation in AR, showing how even simple interactions can improve user engagement and control.

Exercise 2: Animation of 3D Cube & Controlling Movement

1. Maintaining the Scene:
We used the same AR setup with the Spiderwoman image target and retained the existing cube as the animated object.

2. Creating an Animation Clip:
Using Unity’s Animation tab, we created a new clip named CubeAnimation, incorporating basic movements such as rotation, scaling, and position changes to give the cube more visual appeal. Keyframes were inserted to define specific animation sequences.

3. Animator Controller Setup:
An Animator Controller was created and assigned to the cube. This allowed us to manage the animation’s behavior and transitions within Unity’s Animator window.

When the "PLAY" button is clicked, we enabled the Animator component to be ticked and the cube’s animation to play. 
When the "STOP" button is clicked, we disabled the Animator component to be unticked, which stopped the animation playback.
When the "HIDE" button is clicked, we SetActive(true) to make the AR Cube reappear.
4. Triggering the Animation:

The animation was configured to play when the AR image target was detected. Additional UI buttons were added to allow users to play and stop the animation manually, further integrating user input.

5. Testing and Debugging:
We tested the functionality to ensure that the animation played smoothly upon detection of the image target and responded accurately to user commands.


TASK 1

First Ideation
During the first week, Chai Wei Yi and I worked together as a pair for this task. We both did some brainstorming and came up with a total of two ideas to suggest to our lecturer. 

Chai Wei Yi IDEAS:
  • AR IKEA Furniture Assembly App 
Jie Xuan IDEAS: (Me)
  • AR Storybook App - The Boy Who Cried Wolf 

For every idea, we wrote down the title, the problem it solves, the solution we suggest, the target audience, the steps or process, the advantages (pros), the disadvantages (cons), and the reasoning behind the idea. Below are our ideas documented in a Google Doc. 
Final Ideation
After our first consultation with Mr. Razif, his feedback was that we should go back and come up with more ideas. He said that having only two ideas is too limited. Moreover, our first idea — an AR IKEA furniture assembly app — would be difficult to execute. The second idea — an AR storybook — is not bad, but he wants to see more ideas so he can give us better advice in the next consultation. He also told us that if we don’t have a foundation in 3D, we shouldn’t do it in 3D — just stick to 2D instead. This is because he doesn’t want us to take on too much and end up not having enough time to complete everything within one semester.

The ideation stage is where we apply the design thinking process to generate and refine potential app ideas. This phase includes brainstorming, evaluating feasibility, and iterating based on feedback. We also came up with new ideas, resulting in more complete versions of the six AR app ideas below. 

Chai Wei Yi IDEAS:
  • IKEA AR Step-by-Step Assembly Helper
  • AR Room Decor Preview  
  • AR Restaurant Menu – 2D Dish Previews and Info Cards 
Jie Xuan IDEAS:
  • Library AR Indoor Navigation 
  • AR Storybook App for Kids 
  • Uniqlo AR Try-On for Accessories 
For every idea, we wrote down the title, the problem it solves, the solution we suggest, the target audience, the steps or process, the advantages (pros), the disadvantages (cons), and the reasoning behind the idea. Below are our ideas documented in a Google Doc. 
Google Document Link: Final Ideation

Mr. Razif also mentioned that we could try out his custom GPT, which gives feedback on proposed ideas, so here is the feedback from the TDS XD Assistant. 

Evaluation Summary
🟡 IKEA AR 2D Step-by-Step Assembly Helper
  • ✅ Clear problem-solution match (manual confusion → AR overlay).

  • User pain is real and relatable, especially with IKEA-style flat-pack items.

  • Feasible with marker-based tracking using Vuforia Image Targets (e.g. scanning the instruction sheet).

  • Lightweight 2D approach avoids complex 3D modeling.

🔧 Refinement Suggestions:

  • Define your "marker" – Use the front page of the manual or box as the image target.

  • Break steps into scenes – Each scene shows just one step in the process. Users tap to go next.

  • Include a “replay step” button and audio toggle for accessibility.

  • Consider fallback for when the marker is lost (e.g., semi-transparent overlay).

🎯 MVP Scope Suggestion:

  • 3 steps from one simple object (e.g., assemble a stool or lamp).

  • One marker target, 3 overlay scenes, 1 optional audio instruction


🟡 AR Room Decor Preview
  • ✅ Easy to prototype with Ground Plane in Vuforia.

  • ❗Requires basic spatial anchoring for realism—Unity supports this, but aligning 2D images properly is tricky.

  • ❌ May overlap with too many existing AR furniture apps (e.g., IKEA Place, Wayfair).

  • 🔧 Suggest narrowing to niche decor like anime posters or K-pop merch to stand out.


🟡 AR Restaurant Menu
  • ✅ Very trendy in post-pandemic dining.

  • ✅ Marker-based tracking works well (menu card or QR).

  • ❗You’d need strong visual mockups to "sell" the immersive menu idea in the proposal.

  • 🔧 Consider choosing a theme restaurant (e.g., Thai street food or vegan café) for narrative richness.


🟢 Library AR Indoor Navigation
  • ✅ Clever idea with high usefulness.

  • ❗Difficult to prototype indoors without spatial anchors.

  • ❌ Vuforia does not support full indoor navigation (you’d need Area Targets or GPS+SLAM combo, which is advanced).

  • 🔧 Instead, you could simulate a single-floor directional AR sign system as proof of concept.


🟢 AR Storybook For Kids
  • ✅ Lovely for educational UX.

  • ✅ Feasible with Image Targets (each book page = marker).

  • ✅ Good for a “wow” presentation demo.

  • 🔧 Narrow scope: only 2–3 scenes for MVP, with narration and simple tap animations.

  • 📚 Consider pitching to publishers or libraries as potential users.


🟡 Uniqlo AR Try-On
  • ✅ Market-relevant, very shareable.

  • ❗Needs facial tracking – Unity with Vuforia doesn’t natively support that well.

  • 🔧 You’d need to simulate using a front-facing camera and overlay (more Snap Lens Studio than Unity).

  • ✅ Still useful as a concept pitch (future scope idea).


🔚 Final Evaluation Summary:
Idea Feasibility (Unity + Vuforia) Innovation MVP Readiness Presentation Potential
IKEA AR Guide ✅✅✅ ✅✅ ✅✅✅ ✅✅✅
AR Room Decor ✅✅ ✅✅
AR Menu ✅✅ ✅✅ ✅✅
Library Navigation ✅✅ ✅✅ ✅✅
Storybook ✅✅✅ ✅✅ ✅✅✅ ✅✅
AR clothes Try-On ✅✅✅ ✅✅

Mr. Razif advised us to work on the AR Storybook idea. The concept of this app is to make interacting with storybooks more fun. Users can scan the book to trigger a 2D pop-up interface. 

Final Idea : 

The Boy Who Cried Wolf AR Storybook for Kids – 2D Animated Reading Companion

Concept Overview:

An AR-enhanced physical storybook that brings 2D characters and animations to life when children scan the book pages using a smartphone or tablet.

How It Works:

  • Kids scan a book page through the app.

  • 2D characters pop up on top of the page and perform short, animated actions.

  • Audio narration plays, with highlighted text for word tracking.

  • Kids can tap on characters for simple interactions.

  • A game quiz that lets kids learn the moral of the story. 

Benefits for Users:

  • Makes reading more fun and engaging.

  • Supports early childhood literacy and word recognition.

  • Encourages screen time that complements physical reading.

Why This Idea Matters:

Kids today are surrounded by digital content. This idea uses technology to enhance—not replace—physical books in a playful and educational way.

Target Audience:

  • Children aged 7-12

  • Parents

  • Early educators


FEEDBACK
Week 1
-

Week 2
-

Week 3
After consultation with Mr. Razif, his feedback was that we should go back and come up with more ideas. He said that having only two ideas is too limited. Moreover, our first idea — an AR IKEA furniture assembly app — would be difficult to execute. The second idea — an AR storybook — is not bad, but he wants to see more ideas so he can give us better advice in the next consultation. Mr. Razif also told us that if we don’t have a foundation in 3D, we shouldn’t do it in 3D — just stick to 2D instead. This is because he doesn’t want us to take on too much and end up not having enough time to complete everything within one semester.

Week 4
During the second consultation, Mr. Razif advised us to work on the AR Storybook idea. The concept of this app is to make interacting with storybooks more fun. Users can scan the book to trigger a 2D pop-up interface. 


REFLECTION
Experience
The Experiential Design module has been both challenging and rewarding. It allowed me to go through the full brainstorming process, from identifying a problem to creating a solution. Through brainstorming and ideation, I learned that many apps are actually well suited to incorporating AR. 

Working as a team also taught me a lot about communication and time management. We had to plan tasks, share ideas, and stay on the same page. Meeting deadlines and keeping everything on track was an important part of the process.  

Observation
While exploring different AR experiences, I noticed that many apps focus on cool visuals or location-based features, but often lack emotional connection or lasting engagement. Even though AR technology has improved, many apps still offer limited storytelling or interaction, making the experience feel short and shallow. I also saw that users today want more meaningful digital experiences—ones that feel personal and emotionally engaging, not just entertaining. This showed me there’s a chance to create an AR experience that combines immersive technology with emotional value and long-term use.

Findings
During the idea development process, I found that mixing magical fantasy with real-world interaction helps keep users interested for a longer time. Adding physical elements—like scanning a real potion book—made the experience feel more real and fun. In the end, the idea grew into something both magical and meaningful: a digital companion that brings happiness, emotional connection, and care through AR. 

Comments