Internet Monitor
Internet Monitor is a research project based at Harvard University's Berkman Klein Center for Internet & Society. Internet Monitor's aim is to evaluate, describe, and summarize the means, mechanisms, and extent of Internet content controls and Internet activity around the world. The project helps researchers, advocates, policymakers, and user communities understand trends in Internet health and activity through research, analysis, and data visualization.
Internet Monitor has several technical components:
Platform
Contact: Ryan Morrison-Westphal (rwestphal@cyber.law.harvard.edu)
The Internet Monitor platform gives policymakers, digital activists, researchers, and user communities an authoritative, independent, and multi-faceted set of quantitative data on the state of the global Internet. The platform brings together 20+ indicators on Internet access and infrastructure to create a single aggregated API, which offers a starting point for further analysis of Internet access conditions in 92 countries and a data backend for the Dashboard project.
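The API surface itself isn't documented on this page, so as a rough sketch of how a client might pull country-level indicator data from such an aggregated API, the Ruby snippet below assumes a hypothetical endpoint path and response fields:

# Minimal client sketch for a country/indicator API.
# The base URL path and JSON field names below are hypothetical
# placeholders, not the documented Internet Monitor API.
require "net/http"
require "uri"
require "json"

BASE_URL = "https://thenetmonitor.org/api/v1" # assumed path

def fetch_country(country_code)
  uri = URI("#{BASE_URL}/countries/#{country_code}")
  response = Net::HTTP.get_response(uri)
  raise "Request failed: #{response.code}" unless response.is_a?(Net::HTTPSuccess)
  JSON.parse(response.body)
end

# Compare one (assumed) indicator across a few of the 92 covered countries.
%w[US BR IR].each do |code|
  data = fetch_country(code)
  puts "#{code}: #{data.dig('indicators', 'broadband_penetration').inspect}"
end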
July 22, 2015: A Berkman Center summer intern explores Internet Monitor
Tech:
- Ruby on Rails
- JavaScript
Project ideas:
- Add a notes feature to the data API so Dashboard widgets can show more details on hover
- Create pipelines to pull fresh data into Internet Monitor from additional indicators (see the ingestion sketch after this list)
- Incorporate non-standard data sources into the platform, such as Access Now’s KeepItOn campaign
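As a minimal illustration of the ingestion pipeline idea above, the following Ruby sketch reads a CSV export from an external source and normalizes each row into an indicator-style record; the file name, column names, and indicator key are assumptions for illustration, not an actual KeepItOn export format:

# Sketch of one pipeline step: normalize rows from an external CSV export
# into records the platform could store. All column and field names below
# are hypothetical.
require "csv"
require "json"

def normalize(row)
  {
    country: row["country_code"],       # assumed column
    indicator: "shutdown_events",       # hypothetical indicator key
    value: Integer(row["event_count"]), # assumed column
    observed_on: row["date"]            # assumed column
  }
end

records = CSV.foreach("keepiton_export.csv", headers: true).map { |row| normalize(row) }
puts JSON.pretty_generate(records.first(3))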
Dashboard
The Internet Monitor dashboard (launched in September 2015) offers users the opportunity to customize a collection of data visualization “widgets” according to their interest. The dashboard compiles and curates data from multiple sources, including primary data collected by the Berkman Klein Center and our partners, as well as relevant secondary data. Users can create individual boards that provide a real-time view of the state of the Internet across a variety of dimensions, enable easy comparisons across countries and data sources, and are easy to configure, edit, and share.
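To make the widget model concrete, a shared board can be thought of as a small document listing widget definitions, each tied to an indicator and a set of countries. The field names in this Ruby sketch are illustrative guesses, not the dashboard's actual schema:

# Illustrative shape of a shareable board as a list of widget definitions.
# Widget types, indicator keys, and field names are hypothetical.
require "json"

board = {
  title: "Access comparison",
  widgets: [
    { type: "choropleth", indicator: "internet_users_pct", countries: "all" },
    { type: "bar_chart",  indicator: "mobile_subscriptions", countries: %w[US BR IR] }
  ]
}

puts JSON.pretty_generate(board)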
September 28, 2015: Announcing the Internet Monitor data dashboard!
Tech:
- Meteor
- Bootstrap 3
Project ideas:
- More data-agnostic visualization widgets (e.g., a bubble chart)
- Incorporate the new “notes” feature into existing widgets (choropleth, bar chart, etc.)
Website availability testing
Contact: Justin Clark (jclark@cyber.law.harvard.edu)
Internet Monitor is developing a suite of tools to enable the automated collection of data on website availability in a number of countries worldwide. This effort has several components: data collection tools tailored specifically for assessing website availability and gathering evidence of deliberate blocking; global and country-specific lists of URLs to test; and additional sources of data. We are developing an analysis platform that will ingest data from the various sources to more accurately determine the availability of different sites in different locations. We are building a real-time testing UI, initially available only to invited participants, to enable on-demand testing in any of the supported locations worldwide.
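As a minimal sketch of the collection step (not the project's actual tooling), the Ruby snippet below probes a list of URLs with short timeouts and records the outcome of each request as evidence for later analysis:

# Probe each URL with short timeouts and record status or error details.
# This illustrates the general approach only; it is not the project's code.
require "net/http"
require "uri"
require "json"

def probe(url)
  uri = URI(url)
  http = Net::HTTP.new(uri.host, uri.port)
  http.use_ssl = (uri.scheme == "https")
  http.open_timeout = 5
  http.read_timeout = 10
  response = http.request(Net::HTTP::Get.new(uri))
  { url: url, status: response.code.to_i, bytes: response.body.to_s.bytesize }
rescue StandardError => e
  { url: url, error: e.class.name, message: e.message }
end

urls = ["https://example.com/", "https://thenetmonitor.org/"]
puts JSON.pretty_generate(urls.map { |u| probe(u) })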
Project ideas:
- PlanetLab, Measurement Lab, and RIPE Atlas integrations
- Mobile component
- System-on-a-Chip (SoC) collectors (Arduino, CHiP, Raspberry Pi, ESP8266)
- Tor integration
- Automated URL list generation
- Machine-learned block page classification (a simple keyword baseline is sketched after this list)
- Automated packet capture forensics
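As a baseline for the block page classification idea above, here is a simple keyword heuristic in Ruby; the marker patterns are illustrative placeholders rather than a curated signature set, and a machine-learned classifier would replace this logic:

# Keyword baseline for flagging likely block pages. The patterns below
# are illustrative examples, not real block-page signatures.
BLOCK_MARKERS = [
  /access to this (web)?site has been blocked/i,
  /this site has been filtered/i,
  /prohibited by (law|court order)/i
].freeze

def likely_block_page?(html)
  BLOCK_MARKERS.any? { |pattern| html.match?(pattern) }
end

puts likely_block_page?("Access to this website has been blocked by court order.") # => true
puts likely_block_page?("Welcome to Example News")                                  # => false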