Computer Science - Student Works
Browsing Computer Science - Student Works by Author "Cobzas, Dana"
Now showing 1 - 6 of 6
Item
Capstone: AR.t (2020)
Driedger, Andre; Ansorger, Anneliese; Galay, Chance; Lafitte, Chanelle; Cobzas, Dana
Original artwork is often very expensive; being able to see how a painting will look on a wall before you buy it is advantageous. As a collaborative project between the MacEwan Computer Science and Design departments, we set out to develop an AR app that consumers can use to shop for art on the walls of their homes and offices. Existing mobile AR applications cannot identify vertical surfaces such as walls. Our solution is a target image that can be posted on a vertical surface and detected by our app. We developed an OpenCV prototype to test this method of using object detection to set a starting point for subsequent tracking. The prototype was successful in rendering 3D objects, true to scale, onto walls. Next, we developed an Android version using Google's ARCore toolkit, which also delivered good results. Ultimately, we were successful in showcasing art on walls using smartphones in real time.

Item
An analysis of electroencephalogram (EEG) with machine learning (2024)
Emery, Jesse; Phan, Nhi; Jime, Isra; Cobzas, Dana; Hassall, Cameron
Our capstone project was done in collaboration with Dr. Cameron Hassall from the Psychology department at MacEwan University. Our data was based on one of Dr. Hassall's papers, "Task-level value affects trial-level reward processing" (Hassall, 2022), in which he wanted to determine whether the anterior cingulate cortex is involved in decision making. To determine this, a task sequence was carried out 427 times by 12 participants over a 52-minute period. While the participants completed these tasks, brain activity was measured using an electroencephalogram (EEG). For our project, the goal was to train a machine learning model to accurately classify an EEG event after training on past events.
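A minimal sketch of such a train-on-past-events classification setup, using synthetic data in place of the recorded EEG (the epoch shape, signal offset, and nearest-class-mean classifier are illustrative assumptions, not the project's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the EEG data: the study recorded 427 trials;
# here each trial is an epoch of 100 samples with a small class-dependent
# offset standing in for the left/right button-press signal.
n_trials, n_samples = 427, 100
labels = rng.integers(0, 2, size=n_trials)   # 0 = left button, 1 = right button
epochs = rng.normal(size=(n_trials, n_samples))
epochs[labels == 1] += 0.5                   # make the classes separable

# Train on past events, then classify new ones by nearest class mean.
train, test = slice(0, 300), slice(300, n_trials)
means = [epochs[train][labels[train] == c].mean(axis=0) for c in (0, 1)]

def classify(epoch):
    # Assign the label of the closer class-mean template.
    dists = [np.linalg.norm(epoch - m) for m in means]
    return int(np.argmin(dists))

preds = np.array([classify(e) for e in epochs[test]])
accuracy = (preds == labels[test]).mean()
```

On this synthetic data the held-out accuracy is high because the class offset is large relative to the noise; real EEG epochs are far noisier and would call for a stronger model.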
In greater detail, we focus on the brain signal recorded when the participant hit the left or right button in response to the stimulus, which is a colored shape.

Item
An analysis of electroencephalogram (EEG) with machine learning (2024)
Jime, Isra; Emery, Jesse; Phan, Nhi; Cobzas, Dana
Our capstone project was done in collaboration with Dr. Cameron Hassall from the Psychology department at MacEwan University. Our data was based on one of Dr. Hassall's papers, "Task-level value affects trial-level reward processing" (Hassall, 2022), in which he wanted to determine whether the anterior cingulate cortex is involved in decision making. To determine this, a task sequence was carried out 427 times by 12 participants over a 52-minute period. While the participants completed these tasks, brain activity was measured using an electroencephalogram (EEG). For our project, the goal was to train a machine learning model to accurately classify an EEG event after training on past events. In greater detail, we focus on the brain signal recorded when the participant hit the left or right button in response to the stimulus, which is a colored shape.

Item
Android app demo (2020)
Driedger, Andre; Ansorger, Anneliese; Galay, Chance; Lafitte, Chanelle; Cobzas, Dana
For the app, we developed an Android version using Google's ARCore toolkit. The Design students prototyped screens for user profiles, buying art, and filtering and browsing functionality. This functionality has not yet been implemented; we chose instead to focus on the AR screens. The user can browse through and preview different paintings and frames.

Item
AR.T deliver report (2020)
Lafitte, Chanelle; Galay, Chance; Cobzas, Dana
Original artwork is often very expensive; being able to see how a painting will look on a wall before you buy it is advantageous.
As a collaborative project between the MacEwan Computer Science and Design departments, we set out to develop an AR app that consumers can use to shop for art on the walls of their homes and offices. Existing mobile AR applications cannot identify vertical surfaces such as walls. Our solution is a target image that can be posted on a vertical surface and detected by our app. We developed an OpenCV prototype to test this method of using object detection to set a starting point for subsequent tracking. The prototype was successful in rendering 3D objects, true to scale, onto walls. Next, we developed an Android version using Google's ARCore toolkit, which also delivered good results. Ultimately, we were successful in showcasing art on walls using smartphones in real time.

Item
OpenCV laptop demo (2020)
Driedger, Andre; Ansorger, Anneliese; Galay, Chance; Lafitte, Chanelle; Cobzas, Dana
When we started the project, we decided to make a program that would use feature matching to recognize a specific image (e.g. a poster or sticker), find its orientation, and then display useful AR artifacts in the 3D space of the recognized image. We implemented this in OpenCV to show that we have an in-depth understanding of how AR works.
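The feature-matching approach described above comes down to estimating a homography between the target image and the camera frame, which is what lets AR content be anchored true to scale on the wall. OpenCV's cv2.findHomography computes this robustly from many matches; the sketch below shows the underlying direct linear transform (DLT) from four correspondences in plain NumPy, with all point coordinates invented for illustration:

```python
import numpy as np

def find_homography(src, dst):
    # Build the 2n x 9 DLT system A h = 0 from point pairs and take the
    # null-space direction (last right-singular vector) as the homography.
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows))
    h = vt[-1]
    return (h / h[-1]).reshape(3, 3)   # normalize so H[2, 2] == 1

# Corners of a 20 cm square target image, and where feature matching
# (hypothetically) located them in the camera frame, in pixels.
src = [(0, 0), (20, 0), (20, 20), (0, 20)]
dst = [(105, 210), (310, 195), (330, 400), (95, 420)]
H = find_homography(src, dst)

def project(pt):
    # Map a target-image point into the frame; AR overlays are drawn here.
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

With exactly four correspondences the estimate is exact; in the real prototype many noisy feature matches would be fed to a robust estimator (e.g. RANSAC) instead.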