Tagged: Web Traffic

  • Gerrit Eicker 08:14 on 4. November 2011 Permalink
    Tags: Business Practices, Debiasing, Economic Incentives, Replication, Search Bias, Search Engine Bias, Web Traffic

    Search Engine Bias 

    Does Google favour its own sites in search results? New study: Google less biased than Bing; http://eicker.at/SearchEngineBias

     
    • Gerrit Eicker 08:14 on 4. November 2011 Permalink | Reply

      SEL: “Does Google favor its own sites in search results, as many critics have claimed? Not necessarily. New research suggests that claims that Google is ‘biased’ are overblown, and that Google’s primary competitor, Microsoft’s Bing, may actually be serving Microsoft-related results ‘far more’ often than Google links to its own services in search results. – In an analysis of a large, random sample of search queries, the study from Josh Wright, Professor of Law and Economics at George Mason University, found that Bing generally favors Microsoft content more frequently, and far more prominently, than Google favors its own content. According to the findings, Google references its own content in its first results position in just 6.7% of queries, while Bing provides search result links to Microsoft content more than twice as often (14.3%). … The findings of the new study are in stark contrast with a study on search engine ‘bias’ released earlier this year. That study, conducted by Harvard professor Ben Edelman concluded that ‘by comparing results across multiple search engines, we provide prima facie evidence of bias; especially in light of the anomalous click-through rates we describe above, we can only conclude that Google intentionally places its results first.’ … So, what conclusions to draw? Wright says that ‘analysis finds that own-content bias is a relatively infrequent phenomenon’ – meaning that although Microsoft appears to favor its own sites more often than Google, it’s not really a major issue, at least in terms of ‘bias’ or ‘fairness’ of search results that the engines present. Reasonable conclusion: Google [and Bing, though less so] really are trying to deliver the best results possible, regardless of whether they come from their own services [local search, product search, etc] or not. 
… But just because a company has grown into a dominant position doesn’t mean they’re doing wrong, or that governments should intervene and force changes that may or may not be “beneficial” to users or customers.”

      Edelman/Lockwood: “By comparing results between leading search engines, we identify patterns in their algorithmic search listings. We find that each search engine favors its own services in that each search engine links to its own services more often than other search engines do so. But some search engines promote their own services significantly more than others. We examine patterns in these differences, and we flag keywords where the problem is particularly widespread. Even excluding ‘rich results’ (whereby search engines feature their own images, videos, maps, etc.), we find that Google’s algorithmic search results link to Google’s own services more than three times as often as other search engines link to Google’s services. For selected keywords, biased results advance search engines’ interests at users’ expense: We demonstrate that lower-ranked listings for other sites sometimes manage to obtain more clicks than Google and Yahoo’s own-site listings, even when Google and Yahoo put their own links first. … Google typically claims that its results are ‘algorithmically-generated’, ‘objective’, and ‘never manipulated.’ Google asks the public to believe that algorithms rule, and that no bias results from its partnerships, growth aspirations, or related services. We are skeptical. For one, the economic incentives for bias are overpowering: Search engines can use biased results to expand into new sectors, to grant instant free traffic to their own new services, and to block competitors and would-be competitors. The incentive for bias is all the stronger because the lack of obvious benchmarks makes most bias difficult to uncover. That said, by comparing results across multiple search engines, we provide prima facie evidence of bias; especially in light of the anomalous click-through rates we describe above, we can only conclude that Google intentionally places its results first.”

      ICLE: “A new report released [PDF] by the International Center for Law and Economics and authored by Joshua Wright, Professor of Law and Economics at George Mason University, critiques, replicates, and extends the study, finding Edelman and Lockwood’s claim of Google’s unique bias inaccurate and misleading. Although frequently cited for it, the Edelman and Lockwood study fails to support any claim of consumer harm – or call for antitrust action – arising from Google’s practices. Prof. Wright’s analysis finds own-content bias is actually an infrequent phenomenon, and Google references its own content more favorably than other search engines far less frequently than does Bing: In the replication of Edelman and Lockwood, Google refers to its own content in its first page of results when its rivals do not for only 7.9% of the queries, whereas Bing does so nearly twice as often (13.2%). – Again using Edelman and Lockwood’s own data, neither Bing nor Google demonstrates much bias when considering Microsoft or Google content, respectively, referred to on the first page of search results. – In our more robust analysis of a large, random sample of search queries we find that Bing generally favors Microsoft content more frequently, and far more prominently, than Google favors its own content. – Google references own content in its first results position when no other engine does in just 6.7% of queries; Bing does so over twice as often (14.3%). – The results suggest that this so-called bias is an efficient business practice, as economists have long understood, and consistent with competition rather than the foreclosure of competition. One necessary condition of the anticompetitive theories of own-content bias raised by Google’s rivals is that the bias must be sufficient in magnitude to exclude rival search engines from achieving efficient scale. A corollary of this condition is that the bias must actually be directed toward Google’s rivals. That Google displays less own-content bias than its closest rival, and that such bias is nonetheless relatively infrequent, demonstrates that this condition is not met, suggesting that intervention aimed at ‘debiasing’ would likely harm, rather than help, consumers.”

  • Gerrit Eicker 09:15 on 29. October 2011 Permalink
    Tags: API Charge, Charge, Google APIs Console, Google Maps API, Google Maps API FAQ, Google Maps API Premier, Google Maps API Premier License, Google Maps API TOS, Google Maps API Usage, Google Maps Charge, Google Maps Usage, Web Traffic

    Google Maps Charge 

    Google adds a limit on free Google Maps API: over 25,000 daily and you’re charged; http://eicker.at/GoogleMapsCharge

     
    • Gerrit Eicker 09:15 on 29. October 2011 Permalink | Reply

      Google: “When the Maps API Terms of Service were updated in April of this year we announced that usage limits would be introduced to the Maps API starting on October 1st. With October upon us, I’d like to provide an update on how these limits are being introduced, and the impact it will have on your Maps API sites. The usage limits that now apply to Maps API sites are documented in the Maps API FAQ. However no site exceeding these limits will stop working immediately. We understand that developers need time to evaluate their usage, determine if they are affected, and respond if necessary. There are three options available for sites that are exceeding the limits: Reduce your usage to below the limits, Opt-in to paying for your excess usage at the rates given in the FAQ, Purchase a Maps API Premier license – To assist in evaluating whether your site is exceeding the usage limits we will shortly be adding the Maps API to the Google APIs Console. Once available you will be able to track your usage in the APIs Console by providing an APIs Console key when you load the Maps API. … We understand that the introduction of these limits may be concerning. However with the continued growth in adoption of the Maps API we need to secure its long term future by ensuring that even when used by the highest volume for-profit sites, the service remains viable. By introducing these limits we are ensuring that Google can continue to offer the Maps API for free to the vast majority of developers for many years to come.”

      Google: “What usage limits apply to the Maps API? Web sites and applications using each of the Maps API may at no cost generate: up to 25,000 map loads per day for each API, up to 2,500 map loads per day that have been modified using the Styled Maps feature…”

      Google: “How much will excess map loads purchased online cost? Applications generating map load volumes below the usage limits can use the Maps API at no cost providing the application meets the requirements of the Google Maps API Terms of Service. Excess map loads over the usage limits are priced as follows [for 1,000 excess map loads]: JS Maps API v3: $4, JS Maps API v3 styled maps: $4/$8, Static Maps API: $4, Static Maps API styled maps: $4/$8, Street View Image API: $4, JS Maps API v2: $10 – Excess map loads will not be offered online for the Maps API for Flash. Sites using the Maps API for Flash and exceeding the usage limits should migrate to the JS Maps API v3, or purchase a Maps API Premier license.”

      Guardian: “Nothing free lasts forever; and it’s damn hard to make money putting ads on maps. That seems to be the conclusion to draw from Google’s decision to put limits on its Google Maps API. … 25,000 isn’t that many calls. – Although Google won’t immediately be cutting off those whose applications exceed the call rate, it’s clear that the easy days are over. And of course it also raises the question of whether Google has found that it’s too hard to monetise maps, or that the API calls are bypassing the best ways it has of monetising them. … Obviously, Google, as a business, is free to charge as and how it wants. But it will be interesting to see if this changes how developers approach the use of the maps APIs.”

      Wired: “Bad news, map hackers; the Google Maps free ride may be coming to an end. … The bad news is that once your app or website exceeds those limits you’ll be forking out $4 for every 1,000 people that hit your site (or view a map in your mobile app). Alternately, developers can cough up $10,000+ for a Google Maps API Premier licence, which, in addition to the unlimited access offers more advanced geocoding tools, tech support, and control over any advertising shown. … In other words, Google appears to be interested mainly in collecting fees from sites with consistently heavy traffic rather than experiments that see a one-time traffic spike. It doesn’t protect against every potentially expensive use case, but it should make map mashup fans breathe a little easier. – Developers worried about the potential costs of the Google Maps API can always use OpenStreetMap, which is free and, in many parts of the world, much more detailed than Google Maps. Of course, OpenStreetMap lacks some Google Maps features, most notably an equivalent to Street View.”

      AT: “Google’s approach to enforcement will likely not be very aggressive. According to the FAQ, sites that hit the rate limit and aren’t configured to pay overage fees will not immediately be cut off. This suggests that sites with an occasional traffic spike aren’t the intended target; Google is mainly looking to collect cash from sites with a consistently heavy load.”

      PW: “Unfortunately, the price for styled maps could impact many more developers. Perhaps Google is charging for what it knows is a unique feature amongst its competitors. The feature is also likely extremely computation-intensive, which means it costs Google quite a bit more to provide that service.”

    • Gerrit Eicker 14:28 on 22. November 2011 Permalink | Reply

      Google: “Understanding how the Maps API usage limits affect your sites – We recognise that sites may occasionally experience spikes in traffic that cause them to exceed the daily usage limits for a short period of time. For example, a media site that uses a map to illustrate a breaking news story, or a map-based data visualization that goes viral across social networks, may start to generate higher traffic volumes. In order to accommodate such bursts in popularity, we will only enforce the usage limits on sites that exceed them for 90 consecutive days. Once that criteria is met, the limits will be enforced on the site from that point onwards, and all subsequent excess usage will cause the site to incur charges. – Please be aware that Maps API applications developed by non-profit organisations, applications deemed by Google to be in the public interest, and applications based in countries where we do not support Google Checkout transactions or offer Maps API Premier are exempt from these usage limits. We will publish a process by which sites can apply for an exemption on the basis of the above criteria prior to enforcement of the limits commencing. Non-profit organizations are also encouraged to apply for a Google Earth Outreach grant, which provides all the additional benefits of a full Maps API Premier license. … To help you measure your site’s Maps API usage, we have now added the Maps API to the Google APIs Console.”
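      The 90-consecutive-day rule described above can be made concrete. A minimal sketch (the function and its day-by-day accounting are illustrative assumptions, not Google's actual implementation) of when enforcement would begin for a given traffic history:

```python
# Sketch of the quoted enforcement policy: usage limits are only enforced
# once a site has exceeded them for 90 consecutive days; after that point,
# all subsequent excess usage incurs charges. Names are illustrative.

CONSECUTIVE_DAYS_TO_ENFORCE = 90

def enforcement_day(daily_loads, limit=25_000):
    """Return the index of the first day on which charges apply, or None.

    Enforcement starts on the day a site completes its 90th consecutive
    day over the limit; a streak broken by even one under-limit day resets.
    """
    streak = 0
    for day, loads in enumerate(daily_loads):
        streak = streak + 1 if loads > limit else 0
        if streak >= CONSECUTIVE_DAYS_TO_ENFORCE:
            return day
    return None

# A one-off viral spike (10 days over the limit) never triggers enforcement:
print(enforcement_day([30_000] * 10 + [5_000] * 100))  # None
# 90 straight days over the limit does, on day index 89:
print(enforcement_day([30_000] * 120))                 # 89
```

This is why, as the commentary above notes, a breaking-news map or a viral visualization is safe: only sustained heavy traffic, not a burst, crosses the enforcement threshold.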

  • Gerrit Eicker 07:41 on 27. October 2011 Permalink
    Tags: Content Takedown, Electronic Communications Privacy Act, Google Government Requests, Google Transparency Report, Government Requests, Private Information, Web Traffic

    Google Transparency Report 

    How do governments affect access to information? Google’s Transparency Report 2011; http://eicker.at/GoogleTransparencyReport

     
    • Gerrit Eicker 07:42 on 27. October 2011 Permalink | Reply

      Google: “How do governments affect access to information on the Internet? To help shed some light on that very question, last year we launched an online, interactive Transparency Report. All too often, policy that affects how information flows on the Internet is created in the absence of empirical data. But by showing traffic patterns and disruptions to our services, and by sharing how many government requests for content removal and user data we receive from around the world, we hope to offer up some metrics to contribute to a public conversation about the laws that influence how people communicate online. – Today we’re updating the Government Requests tool with numbers for requests that we received from January to June 2011. For the first time, we’re not only disclosing the number of requests for user data, but we’re showing the number of users or accounts that are specified in those requests too. … We believe that providing this level of detail highlights the need to modernize laws like the Electronic Communications Privacy Act, which regulates government access to user information and was written 25 years ago – long before the average person had ever heard of email.”

      Google: “Transparency is a core value at Google. As a company we feel it is our responsibility to ensure that we maximize transparency around the flow of information related to our tools and services. We believe that more information means more choice, more freedom and ultimately more power for the individual. – We’ve created Government Requests to show the number of government inquiries for information about users and requests to remove content from our services. We hope this step toward greater transparency will help in ongoing discussions about the appropriate scope and authority of government requests. – Our interactive Traffic graphs provide information about traffic to Google services around the world. Each graph shows historic traffic patterns for a geographic region and service. By illustrating outages, this tool visualizes disruptions in the free flow of information, whether it’s a government blocking information or a cable being cut. We hope this raw data will help facilitate studies about service outages and disruptions.”

      GigaOM: “Any lingering fantasies of the web as a no-man’s land where content is free from the restraints of geographical boundaries probably should be put to rest. Google Tuesday morning released a treasure trove of data relating to content-takedown requests, and the numbers speak for themselves: requests are up worldwide and Google complies with the majority of them. … When it comes to requests for user data, all that Google and companies of its ilk really can do is ensure that requests are within the bounds of the law and notify users of requests for their data. But in the United States, at least, the laws regarding web-user data are still fairly lax and don’t require a search warrant in many instances. It’s yet another example of the web and the law not being anywhere near on the same page. – It’s easy to poke them for being too willing to bend to the wills of government officials and authorities, but web companies can’t flaunt the laws of the countries in which they want to operate, either. Otherwise, as separate Google data illustrates, the lights might go out on their services in those countries.”

      RWW: “Google has updated its Government Requests tool with data from the first half of this year. For the first time, the report discloses the number of users or accounts specified, not just the number of requests. Google also made the raw data behind government requests available to the public. … Electronic communications have changed a bit since 1986. They form a ubiquitous, always-on fabric of our lives now. Fortunately, Google isn’t any happier with the status quo than privacy-aware users are. It’s among a number of major Web companies pushing for better laws. And Google and other data-mining companies take their roles in public policy seriously. Both Google and Facebook’s lobbying efforts broke records this year.”

      TC: “Google Declines To Remove Police Brutality Videos, Still Complies With 63% Of Gov’t Takedown Requests – US Government requests for user data jumped, however: 5950 versus 4287 during the same period in 2010, asking for information on 11,057 users. 93% of these were complied with, ‘fully or partially.’ So while they’re making something of a stand on removing data, they don’t seem to have any trouble giving it out.”

      Guardian: “Figures revealed for the first time show that the US demanded private information about more than 11,000 Google users between January and June this year, almost equal to the number of requests made by 25 other developed countries, including the UK and Russia. – Governments around the world requested private data about 25,440 people in the first half of this year, with 11,057 of those people in the US. – It is the first time Google has released details about how many of its users are targeted by authorities, as opposed to the number of requests made by countries.”

      VB: “Notably, in the United States, Google refused to remove YouTube clips showing police brutality. In these cases in particular, we are seeing how relatively neutral platforms such as YouTube can have great social impact depending on the intentions of the person posting the content and the integrity of the content host in keeping that content online.”

  • Gerrit Eicker 10:28 on 22. October 2011 Permalink
    Tags: Mobile Media, Mobiler, Social Media Mobile, Social Networking Mobile, Web Traffic

    Social Media Mobile 

    Social networking on-the-go: U.S. mobile social media audience grows 37% in the past year; http://eicker.at/SocialMediaMobile

     
    • Gerrit Eicker 10:28 on 22. October 2011 Permalink | Reply

      ComScore: “[R]eleased results of a study on mobile social media usage based on data from its comScore MobiLens service, which showed that 72.2 million Americans accessed social networking sites or blogs on their mobile device in August 2011, an increase of 37 percent in the past year. The study also provided new insights into how mobile users interact with social media, finding that more than half read a post from an organization, brand or event while on their mobile device. – ‘Social media is one of the most popular and fastest growing mobile activities, reaching nearly one third of all U.S. mobile users,’ said Mark Donovan, comScore senior vice president for mobile. ‘This behavior is even more prevalent among smartphone owners with three in five accessing social media each month, highlighting the importance of apps and the enhanced functionality of smartphones to social media usage on mobile devices.’ … In August 2011, more than 72.2 million people accessed social networking sites or blogs on their mobile device, an increase of 37 percent from the previous year. Nearly 40 million U.S. mobile users, more than half of the mobile social media audience, access these sites almost every day, demonstrating the importance of this activity to people’s daily routines. … 70 Percent of Mobile Social Networkers Posted a Status Update While on Their Mobile Device”

      RWW: “While the mobile browser accounted for more visits, research shows that the social networking app audience has grown five times faster in the past year. While the mobile browsing social networking audience has grown 24% to 42.3 million users, the mobile social networking app audience shot up 126% to 42.3 million users in the past year. … People are increasingly checking social networks more from their mobile devices. More than half (52.9%) read posts from organizations/brands/events. One of three mobile social networkers snagged a coupon/offer/deal, and twenty-seven percent clicked on an ad while visiting a social networking site.”

      SEL: “In the US roughly 40 million mobile users access social networks (broadly defined to include blogs) on their handsets on a daily basis, according to comScore. The large number of mobile-social users comes as no surprise. Facebook previously announced it had 350 million active mobile users globally. – Google also sees mobile as a strategic front for social networking growth. The new version of Android (‘Ice Cream Sandwich’) prominently features Google+.”

      AF: “The consultancy found that 70 percent of those using Facebook on mobile devices – including smartphones and tablets – posted a status update from the gizmo on the go. – Facebook earlier this year disclosed that total mobile users worldwide exceeds 350 million. The U.S. portion of this at the end of August surpassed 57.3 million, according to comScore MobiLens.”

      ZDNet: “So far, there’s already some solid footing for mobile advertisers to get involved here. Mobile users accessing social networks were found to be more likely to interact with brands on those sites than not, and 52.9 percent said they read posts from organizations/brands/events. Additionally, one in three in this group said they received some kind of coupon/offer/deal, with one in four clicking on an ad while conducting mobile social networking.”

  • Gerrit Eicker 09:05 on 20. October 2011 Permalink
    Tags: Funnels, Goal Flow, Google Analytics Flow Visualization, Mobile Reports, Multi-channel Funnels, Non-linear, Path Analysis, Plot Rows, Site Speed, Site Speed Report, Traffic Visualisation, Visitors Flow, Web Traffic

    Google Analytics: Flow Visualization 

    Google introduces Flow Visualization for Google Analytics: visitors flow and goal flow; http://eicker.at/GAFlowVisualization

     
    • Gerrit Eicker 09:06 on 20. October 2011 Permalink | Reply

      Google: “[A]t Web 2.0 Summit [we] unveiled the release of ‘Flow Visualization’ in Google Analytics, a tool that allows you to analyze site insights graphically, and instantly understand how visitors flow across pages on your site. Starting this week, ‘Visitors Flow’ and ‘Goal Flow’ will be rolling out to all accounts. Other types of visualizers will be coming to Google Analytics in the coming few months, but in the meantime, here’s what you can expect from this initial release. … The Visitors Flow view provides a graphical representation of visitors’ flow through the site by traffic source (or any other dimensions) so you can see their journey, as well as where they dropped off. … Goal Flow provides a graphical representation for how visitors flow through your goal steps and where they dropped off. Because the goal steps are defined by the site owner, they should reflect the important steps and page groups of interest to the site. In this first iteration, we’re supporting only URL goals, but we’ll soon be adding events and possibly other goal types. … These two views are our first step in tackling flow visualization for visitors through a site, and we look forward to hearing your feedback as all users begin experiencing it in the coming weeks. We’re excited to bring useful and beautiful tools like these to help you understand your site, so stay tuned for more!”

      SEL: “Path analysis has historically been a feature that provided little insights on user behavior, mainly because visitors behave in such non-linear ways that it is hard to learn something from their paths, even when looking at aggregated data. The best option to path analysis has been to analyze micro conversions, i.e. looking at each page and trying to learn if the page has fulfilled its objective. However, the visualizations below bring some interesting approaches that will be very helpful for web analysts. … As some might recognize, the visualization used on this feature is very similar to the one created by Charles Joseph Minard shown below. This image, created in 1869 to describe Napoleon’s disastrous Russian campaign of 1812, displays several variables in a single two-dimensional image…”

      LM: “I need Red Bull. Seriously, I can’t keep up with all the new features and announcements coming from Google Analytics lately. In the last few months, they’ve released a new interface, real-time data, multi-channel funnels, Google Analytics Premium, Google Webmaster Tools integration, plot rows, site speed report, new mobile reports, social media tracking, and now Flow Visualization. You can read their official announcement, but ours is much more informative [and we have video!]. … Navigation Flow: provides a graphical representation of your start/end nodes, and the paths to or from your site that your visitors follow. When you create a navigation flow, you have the option to identify a single page by URL, or to create a node that represents a group of pages whose URLs match a condition, for example, all pages whose URL contains a particular product identifier like shirts or jackets. … Sometimes, things are best explained with video. This is one of those times, so sit back, relax, and enjoy this brief tour through this new feature.”

  • Gerrit Eicker 08:25 on 11. October 2011 Permalink
    Tags: Web Traffic

    Google Plus: Failure to Launch? 

    Chitika: Google Plus growth spurt short-lived after it went public. What’s its USP? http://eicker.at/GooglePlusLaunch


     
    • Gerrit Eicker 08:25 on 11. October 2011 Permalink | Reply

      Chitika: “Mid-morning September 20th, Google+ officially entered public beta, drumming up the level of interest of the site far and wide across the web. Although able to boast 25 million unique visitors after only four weeks of operation, Google’s newest attempt at a social network saw its user base dwindle as shown by a recent article from Chitika Insights. … Reportedly, Google+ saw a surge in traffic of over 1200% due to the additional publicity, but the increased user base was only temporary, as was projected in an earlier insights post. – The data shows that, on the day of its public debut, Google+ traffic skyrocketed to peak levels. But, soon after, traffic fell by over 60% as it returned to its normal, underwhelming state. It would appear that although high levels of publicity were able to draw new traffic to Google+, few of them saw reason to stay. … The supply of users for social media sites is limited. To survive you must stand out and provide a service that others do not. – Features unique to your site must be just that – unique and difficult to duplicate – if they are not, the competitive advantage quickly disappears.”

      RWW: “We at RWW can informally corroborate Chitika’s findings that interest in Google Plus is on the wane. Our monthly referrals from there are down 38% since their peak, while Facebook referrals are up 67% and Twitter referrals up 51% over the same period. – As we reported last week, the +1 button isn’t gaining much traction, either. Despite all the new features and responsiveness to user feedback, Google Plus just doesn’t seem to be catching on. There’s only so much time in a day for social networking, and this newcomer isn’t converting many users.”

      Inquirer: “Google’s problem is not getting users in the first place, it seems, but rather keeping them after they have arrived. For now it appears that a lot of users are merely curious about Google+, but return to the tried and tested format of Facebook when the lustre fades. … While the jury is still out on which firm will win this battle, there’s no denying that the intense competition could make both social networks considerably better than they were before.”

      RWW: “Many people say they don’t find [Google Plus] compelling though. We asked on Twitter and on Facebook and most people said that the value proposition was too unclear, that it wasn’t valuable enough to warrant the investment of time relative to the already heavy burden of Twitter and Facebook engagement. Google knows it needs to make changes to the service to increase its user retention. But you know who else has always struggled with new user retention? Twitter!”

      UG: “While this is interesting, Chitika doesn’t provide much information about its data-gathering technique. Because it is an ad-network, one may suspect that it can see the referrer (Google+) to sites using its ad code. If that’s the case (and I’m not saying that it is), the method is not very accurate but one could argue that they should be able to pick up a (very) gross trend snapshot. The bottom-line is that Google+ saw a traffic spike during its public opening and that it subsequently faded, and I can believe that. This sounds quite ‘normal’ to me, though. Secondly, second-hand data sampling on a 10-day period is hardly enough to tell if Google+ is a ‘failure to launch’ as Chitika puts it, so I think that there’s a bit of over-dramatization here. – It will take months (or years) and many evolutions before we realize how well (or not) Google+ does/did. In the meantime, and as long as we don’t know how this data was measured, I would advise taking this with a grain of salt.”
