Thursday, December 17, 2015

Progress Report

Progress:

  • Downloaded and have been using the TestBench software
  • Collected data for the OpenViBE group and for my own education
  • Experimented with the connection command in the sample code created by Emotiv
  • Finished my presentation and posted it on the blog
  • Learned an easier way to put on the headset
Problems:
  • I've forgotten a lot of Objective-C since I studied it near the beginning of the year
  • Strange errors coming from the code
  • Scheduling conflicts between my other classes and my project work
Plan:
  • Get my app to connect to the headset by looking more closely through the code (I have been doing this and think I should be able to finish soon; see the sketch after this list)
  • Collect more headset data of myself doing various tasks, or whatever else the group needs from me (this will benefit their research and also give me as much time with the headset as possible)
  • Use Lynda.com to review Objective-C
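
Since connecting is the main technical hurdle right now, here is a minimal sketch of the connect-and-wait step in Objective-C, assuming the Emotiv SDK's C engine API looks the way I remember it from the sample code (IEE_EngineConnect and friends). The prefixes and header names vary between SDK versions, so every SDK identifier below is an assumption to check against the real headers.

    // HeadsetConnector.m -- a rough sketch of the connect-and-wait step.
    // Every IEE_* identifier and header name is an assumption taken from my
    // reading of the Emotiv sample code; verify against the actual SDK.
    #import <Foundation/Foundation.h>
    #import "Iedk.h"            // assumed SDK header (engine + event functions)
    #import "IedkErrorCode.h"   // assumed; defines EDK_OK

    @interface HeadsetConnector : NSObject
    - (BOOL)waitForHeadsetWithTimeout:(NSTimeInterval)timeout;
    @end

    @implementation HeadsetConnector

    - (BOOL)waitForHeadsetWithTimeout:(NSTimeInterval)timeout {
        // Start the EmoEngine -- this is the "connecting command" from the
        // sample code. Some SDK versions take a device-ID string here.
        if (IEE_EngineConnect() != EDK_OK) {
            NSLog(@"EmoEngine failed to start.");
            return NO;
        }

        EmoEngineEventHandle event = IEE_EmoEngineEventCreate();
        NSDate *deadline = [NSDate dateWithTimeIntervalSinceNow:timeout];
        BOOL found = NO;

        // Poll the event queue until the Insight shows up (a UserAdded event)
        // or the timeout passes.
        while (!found && [deadline timeIntervalSinceNow] > 0) {
            if (IEE_EngineGetNextEvent(event) == EDK_OK &&
                IEE_EmoEngineEventGetType(event) == IEE_UserAdded) {
                found = YES;
            }
            [NSThread sleepForTimeInterval:0.1];   // don't spin the CPU
        }

        IEE_EmoEngineEventFree(event);
        return found;
    }

    @end

If those names hold up, running this once on a background thread at launch should tell me whether the problem is in my code or in the headset pairing itself.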
Puns! 
  • What kind of fish performs brain surgery? A Neuro-sturgeon!
  • What do you call an empty skull? A no brainer!

Monday, December 14, 2015

December Seminar Presentation

Here is the link to my seminar presentation. I tried to simplify the terms so everyone could understand, and I may update it soon to improve it further.


https://docs.google.com/presentation/d/11Ef6HuEKUuWeCziFFKuvMfBO7NiTaNh_Ak-tnhkVrFw/edit?usp=sharing

Sunday, December 13, 2015

New Progress Report Will Be Made Tomorrow

I need the headset, which is still at school, to put together this new report, so my Sunday update will be posted on Monday instead. Thank you for your patience.

Sunday, December 6, 2015

Progress Report: Insight Testing



This week, with the arrival of the headset, I have been experimenting with the capabilities of the already-created app. I have found that positioning the headset on your head is the initial challenge, as it requires very direct sensor-to-scalp contact. That makes sense given the data it is collecting, but it can be annoying if you are not used to it. This is how the headset looks once placed correctly, and a white light shows on the side when it is turned on. The connection was a little spotty, but it could be fixed by rehydrating the sensors or parting my hair further, which can be very difficult with curly hair.


There are various tests and challenges you can complete within the app. First, it requires you to record a baseline, where it asks you to do certain tasks, such as closing your eyes and relaxing for ten seconds, or behaving normally with a normal amount of blinking. The Insight can record artifacts such as blinks, so it is important to keep that in mind. Every task has a somewhat long loading time, and when I look more closely at the code I might be able to tell why. Here are some screenshots from the app: green means a sensor is fully connected, and yellow means it is only somewhat connected.
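
As a side note for my own app (assuming an iOS app, so UIKit), here is a tiny sketch of how those green/yellow indicators might be driven in Objective-C. The SensorQuality enum is my own placeholder, not the SDK's actual contact-quality type, and the colors for the lower levels are guesses, since the app only showed me green and yellow.

    // Mapping per-sensor contact quality to an indicator color, the way the
    // Emotiv app's dots appear to work. SensorQuality is a placeholder enum I
    // made up; the real SDK type and query call still need to be confirmed.
    #import <UIKit/UIKit.h>

    typedef NS_ENUM(NSInteger, SensorQuality) {
        SensorQualityNoSignal,
        SensorQualityPoor,
        SensorQualityFair,   // "somewhat connected" -- shown as yellow in the app
        SensorQualityGood    // "fully connected" -- shown as green in the app
    };

    static UIColor *IndicatorColorForQuality(SensorQuality quality) {
        switch (quality) {
            case SensorQualityGood:     return [UIColor greenColor];
            case SensorQualityFair:     return [UIColor yellowColor];
            case SensorQualityPoor:     return [UIColor orangeColor]; // guess
            case SensorQualityNoSignal: return [UIColor redColor];    // guess
        }
    }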



The emotions that Emotiv classifies the waves into are attention, focus, engagement, interest, excitement, affinity, relaxation, and stress. From the few tests that I tried, it seems that the attention, focus, interest, relaxation, and stress levels appear the most. It could be that only those classifications were needed for the tests I took; I will be looking into this further. Just as a personal note, I might need to work on lowering stress. I could post my individual results more often, but I find that unnecessary. The organization of the GUI is attractive and easy to read, and it gives me very important information at a glance.
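
Since my own app will probably need to show the same at-a-glance numbers, here is a small hedged sketch of a container for the eight metrics listed above. The property names just mirror the labels in Emotiv's app; how the SDK actually exposes these values is something I still have to dig out of the code.

    // MetricReading -- a placeholder model for one reading of the eight
    // metrics Emotiv displays. The values would eventually come from the SDK;
    // this class only stores them for display.
    #import <Foundation/Foundation.h>

    @interface MetricReading : NSObject
    @property (nonatomic, strong) NSDate *timestamp;
    @property (nonatomic) double attention;
    @property (nonatomic) double focus;
    @property (nonatomic) double engagement;
    @property (nonatomic) double interest;
    @property (nonatomic) double excitement;
    @property (nonatomic) double affinity;
    @property (nonatomic) double relaxation;
    @property (nonatomic) double stress;
    @end

    @implementation MetricReading
    @end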



The main problems this week were connection issues. My plan now is to test this out even more, finish my seminar presentation, think about what I am presenting during Hour of Code, and consider incorporating some of the ideas Emotiv used into my own app design.


RE: Patent Project

Even though many companies already hold patents in this field, as an app developer you can still keep an eye out for possible innovation opportunities, especially when you feel the existing technology is not good enough or not convenient. Any need is an indication of potential innovation.