The robo-journalistic feat itself is not that interesting. The LA Times reporter simply wrote a script, dubbed 'Quakebot', that extracts reports of earthquakes larger than magnitude 3.0 from the USGS feed and fills out an article template with the information. The initial article is not especially good, including extraneous information from the USGS such as the distance between Los Angeles and Sacramento. It includes a disclaimer at the bottom that the article was written by a computer program created by the journalist. It hit the LA Times website within 3 minutes. (The delay was due to a required human approval step, put in place because of occasional USGS feed glitches.) Having been first with the journalistic scoop, the LA Times' human staff of editors and journalists then vastly improved the article over time into a very professional piece. The LA Times has other bots that auto-generate scoops from similar government databases, such as city homicide records.
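The Quakebot approach described above can be sketched in a few lines. This is a minimal illustration, not the LA Times' actual code: it assumes a USGS-style GeoJSON feature (the hypothetical `draft_quake_article` function and the sample event values are ours), filters by the magnitude threshold, and fills a fixed article template.

```python
# Minimal sketch of the Quakebot idea (illustrative only, not the
# LA Times' actual script): filter USGS-style quake events by
# magnitude and fill an article template with the details.

def draft_quake_article(event, threshold=3.0):
    """Return a templated article string for one USGS-style GeoJSON
    feature, or None if the quake is below the magnitude threshold."""
    props = event["properties"]
    mag = props["mag"]
    if mag < threshold:
        return None
    return (
        f"A magnitude {mag:.1f} earthquake struck {props['place']}, "
        f"according to the U.S. Geological Survey. "
        f"This post was created by an algorithm written by the author."
    )

# Sample event shaped like a USGS GeoJSON feature (hypothetical values).
sample = {"properties": {"mag": 4.4, "place": "5km NNW of Westwood, CA"}}
article = draft_quake_article(sample)
print(article)
```

In production, the events would come from a live USGS GeoJSON feed rather than a hardcoded sample, and a human approval step (as the LA Times requires) would sit between the draft and publication.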
The intent, supposedly, is to eliminate the journalistic drudge work associated with these types of articles. The computer takes care of getting that first article out to the public very quickly, 'under deadline.' With the deadline met and the first article out, human journalists are then free to do the real, deep-digging work.
However, we will point out that 'automated journalism' is nothing new. Years ago, Google replaced the human editor, or at least the human news aggregator/content selector, with Google News. Some of our readers may remember the joke Google used to put at the bottom of its Google News page: 'No humans were harmed in the preparation of this page.' Google later replaced this with a sentence stating that the news articles and their placement on the page were generated automatically by a computer algorithm, which is what it meant by 'no humans were harmed.' Some editors apparently felt that some humans had been harmed by Google News, which they saw as a major and very successful competitor to their front pages. (A few organizations sued; AFP was one of them, if memory serves. Google reportedly argued that it was sending AFP and other organizations a lot of traffic in exchange for snippets. If memory serves, there was eventually a settlement, which may have included Google agreeing not to use some content.)
You can almost detect a certain pride in the Slate piece. Major newspapers like the LA Times, not traditionally known for their technological prowess, had finally stepped up to the plate and taken a page from Google's book, using computers to generate content.
Search API will now always return "real" Twitter user IDs. The with_twitter_user_id parameter is no longer necessary. An era has ended. ^TS
— Twitter API (@twitterapi) November 7, 2011