Talk:ListenLog

Users Might Object

Some individuals might perceive the ListenLog as yet another mechanism for aggregating personal attention data that will (or at least could) be used, or used indiscriminately, by a third party. There are four aspects to this concern, depending on how the LL concept is presented and implemented:

  1. Knee-jerk negative reaction to the concept of personal/attention data being monitored at all. This reaction comes before any consideration of why, or for whom, the data might be used.
  2. Assumption that whenever personal data is captured, it is captured ultimately on behalf of a vendor, even if users are told otherwise. The default assumption is that data is stored and used by the vendor who controls the current user experience, e.g. whoever distributes the software application or hosts the website. The worry might be that the data is used by an unwanted vendor, or that even where explicit permission is given, the data might be used in an unwarranted fashion (e.g. when you provide your phone number to a vendor who later uses it to solicit you).
  3. Concern that even if the data is owned or controlled by users, it might eventually be compromised, e.g. by a malicious third party, oppressive government, legal body, etc. who can tie this data to me. Since we intend to store the data remotely, that may aggravate this concern.
  4. As a user, why would I choose to do this? Since LL will likely let individuals opt in or opt out, why would someone choose to collect their own data? What's in it for them? There might be built-in resistance without a compelling use case.

What might we do to address these concerns? Here are some suggestions:

  • Clear, careful presentation of the concept. This is unlike anything people will be familiar with, and they will likely resist it, misunderstand it, or compare it to historic privacy violations. A primary emphasis on new, previously unavailable user functionality might be a good approach.
  • Store the data in a neutral location, apart from the application or vendor servers (I recommend Berkman servers)
  • Transmit data securely
  • Automatically and by default, store all data fully encrypted (i.e. "locked"). As an alternative to opt-in, users would need to "unlock their data" (decrypting it wholesale) in order to get access to functionality or to share the data externally. Users could lock, unlock, and delete all of their data at will. Users could also opt out of capturing data in the first place. (A sketch of this lock/unlock model follows this list.)
    • Perhaps locking and unlocking could apply to different subsets of the data, e.g. location data
  • Don't build any ability to share data into the first version
  • Focus functional development on a single piece of previously unavailable end-user functionality, e.g. "search across everything I've ever listened to," to help demonstrate our user-driven intentions (a toy example follows this list)
  • Encourage additional audio applications/sites/devices to publicly commit to supporting the ListenLog standard. LL would be significantly more compelling to users if it spanned devices and platforms.
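
To make the "locked by default" suggestion concrete, here is a minimal Python sketch. It assumes a symmetric-key scheme in which only the user holds the key; the ListenLogStore name and its methods are hypothetical illustrations, not a committed design.

  import json
  from cryptography.fernet import Fernet

  class ListenLogStore:
      """Hypothetical store: entries are encrypted ("locked") at capture
      time, and only the user-held key can unlock them."""

      def __init__(self, user_key: bytes):
          self._fernet = Fernet(user_key)  # the key stays with the user
          self._entries = []               # ciphertext only; the host never sees plaintext

      def capture(self, entry: dict) -> None:
          # Encrypt before anything is stored remotely.
          self._entries.append(self._fernet.encrypt(json.dumps(entry).encode()))

      def unlock(self) -> list:
          # Wholesale decryption, performed only at the user's request.
          return [json.loads(self._fernet.decrypt(token)) for token in self._entries]

      def delete_all(self) -> None:
          # Deletion at will; without the key, any remaining copies stay unreadable.
          self._entries.clear()

  # The user generates and keeps the key themselves.
  key = Fernet.generate_key()
  store = ListenLogStore(key)
  store.capture({"show": "Radio Berkman", "time": "2009-02-02T12:32Z"})
  print(store.unlock())

Per-subset locking, as suggested above, would then amount to keeping a separate key per category of data (e.g. one for location data).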
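
And a toy sketch of the "search across everything I've ever listened to" idea, assuming the log has already been unlocked; the field names (show, episode) are illustrative, not an actual LL schema.

  def search_listens(entries, query):
      # Return every logged listen whose fields mention the query.
      q = query.lower()
      return [entry for entry in entries
              if any(q in str(value).lower() for value in entry.values())]

  listens = [
      {"show": "Radio Berkman", "episode": "VRM and ListenLog", "time": "2009-01-28"},
      {"show": "Example Podcast", "episode": "Episode 12", "time": "2009-02-01"},
  ]
  print(search_listens(listens, "vrm"))  # -> the Radio Berkman entry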

Khopper 15:31, 2 February 2009 (UTC)