
Fusion 360

Usability evaluation of Fusion 360, professional 3D modeling software developed by Autodesk.


 

Client & Product

Autodesk, Fusion 360

Time

Three months

Role

UX researcher

Team Members

RIT HCI

Chia-Hua Li, Lingfei Li, Tanvi Dattatraya Kulkarni, Peter Yeung

Method

Heuristic evaluation

Usability testing

 

 

Process

 
 
 

Heuristic Evaluation

 

Heuristic Evaluation Terms

A heuristic evaluation is a method in which a group of experts reviews a product's or website's interface and compares it against accepted usability principles. The analysis results in a list of potential usability issues (see the Heuristic Evaluation report).

After completing the heuristic evaluation of Autodesk Fusion 360, we identified 11 usability problems that violate the usability principles we assessed against. We prioritized the issues to help identify which ones we believe should be addressed first. Each usability problem is numbered and described in the Findings section.

 
Heuristic evaluation findings
 
 
 

Usability Testing

 

Research Questions:

  • Between Autodesk Fusion 360 and SolidWorks, which has the easier interface for completing the given task?

  • What are the difficulties faced by the participants while using Autodesk Fusion 360?

 Procedure:

  • Introduction to the study and purpose

  • Test Session

The tasks were provided to each participant according to their participant number. The participant number mattered because we used it to counterbalance the learning effect: every other participant performed the task in SolidWorks first and then performed it in Autodesk Fusion 360. During the entire test session, the participant was encouraged to think aloud while the observers recorded details based on their observations. After the participant finished the tasks, the test moderator administered a post-task questionnaire.

  • Post-test debriefing

Once both tasks were completed:

  1. The participant was given the post-test questionnaire.

  2. A short interview was conducted with the participant about their experience.

  3. Any problems that came up for the participant were followed up on.

Data Collection and Evaluation Measures:

  • Count of incorrect menu choices

  • Any positive comments that users made during the think-aloud procedure

  • Any confusion or misunderstanding about Autodesk Fusion 360 while performing each task

  • System Usability Scale (SUS) questions as the measurement of the results
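
As a concrete illustration of the SUS measurement listed above, the standard published scoring procedure can be sketched in a few lines of Python. This is a generic sketch of the SUS formula, not the team's actual analysis script, and the example responses are made up:

```python
# Standard SUS scoring: ten 1-5 Likert items, odd items positively worded,
# even items negatively worded; the weighted sum is scaled to 0-100.
def sus_score(responses):
    """Convert ten 1-5 SUS item ratings (in order) into a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, rating in enumerate(responses):
        if i % 2 == 0:            # items 1, 3, 5, 7, 9: contribution = rating - 1
            total += rating - 1
        else:                     # items 2, 4, 6, 8, 10: contribution = 5 - rating
            total += 5 - rating
    return total * 2.5

# A fully neutral respondent (all 3s) lands exactly at the midpoint.
print(sus_score([3] * 10))  # 50.0
```

Per-participant scores computed this way are then averaged per group, which is how group-level scores like those reported in the Findings are typically produced.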

 Findings:

  • Usability Problems Identified

 
 
  • System Usability Scale

  1. The SUS score fell in the OK/Fair-to-Good range, but it had a relatively wide confidence interval, meaning we cannot place strong confidence in the estimate.

  2. The results showed that the expert group had the highest score (81.250), the beginner group had the middle score (81.250), and the general group had the lowest score (51.250).

  • Satisfaction Comparison

Satisfaction ratings were collected from the post-task questionnaire, which participants filled out after completing the same task in both applications. The results showed that satisfaction with Fusion 360 and SolidWorks might be roughly the same. Notably, most users had more experience with SolidWorks than with Fusion 360; if participants had comparable experience with both, Fusion 360 might achieve better satisfaction.
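
The wide confidence interval noted in the SUS findings can be illustrated with a small sketch: with only a handful of participants per group, a t-based interval around the mean SUS score becomes very wide. The scores and critical value below are illustrative placeholders, not the study's actual data:

```python
# Sketch of a t-based 95% confidence interval for a mean SUS score.
# The t critical value is supplied by hand to avoid a scipy dependency.
import statistics
from math import sqrt

def sus_confidence_interval(scores, t_crit):
    """Return (low, high) bounds for the mean, given t critical value for n-1 df."""
    n = len(scores)
    mean = statistics.mean(scores)
    sem = statistics.stdev(scores) / sqrt(n)  # standard error of the mean
    return mean - t_crit * sem, mean + t_crit * sem

# Three hypothetical participant scores; t(0.975, df=2) = 4.303.
low, high = sus_confidence_interval([81.25, 81.25, 51.25], t_crit=4.303)
```

With only three scores, the resulting interval spans roughly 85 points, which is why small-sample SUS means, like the group scores above, should be read with caution.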

 

Recommendations:

Based on the study conducted and the results gathered, we would like to make the following suggestions and recommendations:

  • Make the model tree more interactive and give it more levels of functionality.

  • Change the rotation interaction so that it is fully accessible to everyone.

  • Account for users' experience with other software: provide tailored reminders based on a user's previous software experience, especially the first time they use Fusion 360.