Pinteresting
Shareaholic: Pinterest drives more referral traffic than Google Plus, YouTube, LinkedIn combined; http://eicker.at/Pinteresting
SocialFresh: When is the best time to publish online? To achieve social shares? Traffic? http://eicker.at/PublishingTiming
RWW: “When’s the Best Time to Blog and Share? – Anyone who spends their day on the Internet inevitably wonders about this question. Should I start publishing later in the day, to hit the after-work traffic? Should I publish earlier in the morning, to catch commuters while they’re on the way to work? Or is everything completely random, driven by the off-chance that a post will end up on StumbleUpon and enjoy a slightly longer tail? Social sharing widget Shareaholic looked at its 2011 data, breaking it down to the top 100 days and times for sharing. See the results in Eastern Standard Time. … As most blogs know, the best time of day for social shares is between 8am and 12pm ET. Shareaholic’s data confirms this, showing that the most shares occur at 9am ET, moments before East coasters step into their offices to start the workday. Traffic declines throughout the day, spiking back up again around 9pm, and then slowly tapering off. Evidently, the best time of day to blog for pageviews is also 9am ET.”
SF: “Great content gets shared. Right? – But does the time and day that you publish that great content affect how much it gets shared or how many times it gets viewed? … We have some awesome data from Shareaholic on top days and times for getting your content seen and shared online. … If everyone is viewing content the most at 9am EST (they actually are), make sure your content is published and ready to be viewed shortly before then. … We wanted to take a look at two main metrics, social shares and traffic. If you want to mainly grow your social presence, getting more shares might be your goal. And for many, traffic is their biggest driving force. … Thursdays win out for the day with the most sharing. Social sharing in general is somewhat unpredictable, pattern-wise. But Thursday wins 10% more shares than all other days. In fact, 31% of the top 100 social share days in 2011 fell on Thursday. … In general, content later in the week looks to do better with sharing. … Pageviews also progress in a predictable order. Monday was top, then Tuesday is second, then Wednesday, then Thursday. We simply view more content on Monday and less as the week goes on. … 27% of all content shares occur between 8am and 12pm EST. – There is a spike at 9am and 10am and then a decline the rest of the day. There are also smaller but significant spikes in sharing at 2pm and 9pm EST. … The best time to blog for pageviews and social shares shares a lot of common ground, unlike the best day of the week. – Blog posts get the most views between 7am and 1pm EST on weekdays. The drop off of traffic is significant after that segment. … Also, use your analytics software to see what time zones read your content when.”
SF: “How to Increase EdgeRank and Add Fans in the Facebook Timeline – Historically, social marketers have widely accepted that once-per-day posting on Facebook was the right frequency of distribution to use to engage their Facebook fans. – A recent study conducted by bit.ly makes us think twice about this assumption, finding that the average shelf life of an update on Facebook is 3.2 hours before it disappears into the timeline and is no longer visible to users. … Using two different Facebook tools, we were able to determine the peak times when the fan volume for our page was high. – The combination of EdgeRank Checker and PageLever provides us insights that tell us the peak times that allow us to get our updates in front of as many fans as possible at the times they were logging on to check Facebook. – The best fan engagement time slots found were a spike between 6am and 8am and 6pm to 11pm.”
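The timing claims above are easy to sanity-check against your own data: bucket your share or engagement timestamps by hour in the time zone you care about and look for the peaks. Below is a minimal sketch of that bucketing, assuming a hypothetical event shape (the data, field names, and sample timestamps are illustrative only and not tied to Shareaholic's, bit.ly's, or PageLever's actual exports):

```typescript
// Bucket share/engagement events by hour of day (Eastern Time) to find peak hours.
// The event shape and sample data are assumptions for illustration only.
interface ShareEvent {
  sharedAt: Date; // timestamp of the share, click, or pageview
}

function peakHours(events: ShareEvent[], timeZone = "America/New_York"): [number, number][] {
  const counts: number[] = new Array(24).fill(0);
  const fmt = new Intl.DateTimeFormat("en-US", { hour: "numeric", hour12: false, timeZone });
  for (const e of events) {
    const hour = Number(fmt.format(e.sharedAt)) % 24; // % 24 guards against "24" for midnight
    counts[hour] += 1;
  }
  // Return [hour, count] pairs, busiest hours first.
  return counts
    .map((count, hour) => [hour, count] as [number, number])
    .sort((a, b) => b[1] - a[1]);
}

// Example: two shares around 9am ET, one at 10am ET, one at 9pm ET.
const sample: ShareEvent[] = [
  { sharedAt: new Date("2011-10-06T13:05:00Z") }, // 9:05am ET
  { sharedAt: new Date("2011-10-06T13:40:00Z") }, // 9:40am ET
  { sharedAt: new Date("2011-10-06T14:10:00Z") }, // 10:10am ET
  { sharedAt: new Date("2011-10-07T01:15:00Z") }, // 9:15pm ET
];
console.log(peakHours(sample).slice(0, 3));
```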
Does Google favour its own sites in search results? New study: Google less biased than Bing; http://eicker.at/SearchEngineBias
SEL: “Does Google favor its own sites in search results, as many critics have claimed? Not necessarily. New research suggests that claims that Google is ‘biased’ are overblown, and that Google’s primary competitor, Microsoft’s Bing, may actually be serving Microsoft-related results ‘far more’ often than Google links to its own services in search results. – In an analysis of a large, random sample of search queries, the study from Josh Wright, Professor of Law and Economics at George Mason University, found that Bing generally favors Microsoft content more frequently, and far more prominently, than Google favors its own content. According to the findings, Google references its own content in its first results position in just 6.7% of queries, while Bing provides search result links to Microsoft content more than twice as often (14.3%). … The findings of the new study are in stark contrast with a study on search engine ‘bias’ released earlier this year. That study, conducted by Harvard professor Ben Edelman concluded that ‘by comparing results across multiple search engines, we provide prima facie evidence of bias; especially in light of the anomalous click-through rates we describe above, we can only conclude that Google intentionally places its results first.’ … So, what conclusions to draw? Wright says that ‘analysis finds that own-content bias is a relatively infrequent phenomenon’ – meaning that although Microsoft appears to favor its own sites more often than Google, it’s not really a major issue, at least in terms of ‘bias’ or ‘fairness’ of search results that the engines present. Reasonable conclusion: Google [and Bing, though less so] really are trying to deliver the best results possible, regardless of whether they come from their own services [local search, product search, etc] or not. … But just because a company has grown into a dominant position doesn’t mean they’re doing wrong, or that governments should intervene and force changes that may or may not be “beneficial” to users or customers.”
Edelman/Lockwood: “By comparing results between leading search engines, we identify patterns in their algorithmic search listings. We find that each search engine favors its own services in that each search engine links to its own services more often than other search engines do so. But some search engines promote their own services significantly more than others. We examine patterns in these differences, and we flag keywords where the problem is particularly widespread. Even excluding ‘rich results’ (whereby search engines feature their own images, videos, maps, etc.), we find that Google’s algorithmic search results link to Google’s own services more than three times as often as other search engines link to Google’s services. For selected keywords, biased results advance search engines’ interests at users’ expense: We demonstrate that lower-ranked listings for other sites sometimes manage to obtain more clicks than Google and Yahoo’s own-site listings, even when Google and Yahoo put their own links first. … Google typically claims that its results are ‘algorithmically-generated’, ‘objective’, and ‘never manipulated.’ Google asks the public to believe that algorithms rule, and that no bias results from its partnerships, growth aspirations, or related services. We are skeptical. For one, the economic incentives for bias are overpowering: Search engines can use biased results to expand into new sectors, to grant instant free traffic to their own new services, and to block competitors and would-be competitors. The incentive for bias is all the stronger because the lack of obvious benchmarks means most bias would be difficult to uncover. That said, by comparing results across multiple search engines, we provide prima facie evidence of bias; especially in light of the anomalous click-through rates we describe above, we can only conclude that Google intentionally places its results first.”
ICLE: “A new report released [PDF] by the International Center for Law and Economics and authored by Joshua Wright, Professor of Law and Economics at George Mason University, critiques, replicates, and extends the study, finding Edelman and Lockwood’s claim of Google’s unique bias inaccurate and misleading. Although frequently cited for it, the Edelman and Lockwood study fails to support any claim of consumer harm – or call for antitrust action – arising from Google’s practices. – Prof. Wright’s analysis finds own-content bias is actually an infrequent phenomenon, and Google references its own content more favorably than other search engines far less frequently than does Bing: In the replication of Edelman and Lockwood, Google refers to its own content in its first page of results when its rivals do not for only 7.9% of the queries, whereas Bing does so nearly twice as often (13.2%). – Again using Edelman and Lockwood’s own data, neither Bing nor Google demonstrates much bias when considering Microsoft or Google content, respectively, referred to on the first page of search results. – In our more robust analysis of a large, random sample of search queries we find that Bing generally favors Microsoft content more frequently, and far more prominently, than Google favors its own content. – Google references its own content in its first results position when no other engine does in just 6.7% of queries; Bing does so over twice as often (14.3%). – The results suggest that this so-called bias is an efficient business practice, as economists have long understood, and consistent with competition rather than the foreclosure of competition. One necessary condition of the anticompetitive theories of own-content bias raised by Google’s rivals is that the bias must be sufficient in magnitude to exclude rival search engines from achieving efficient scale. A corollary of this condition is that the bias must actually be directed toward Google’s rivals. That Google displays less own-content bias than its closest rival, and that such bias is nonetheless relatively infrequent, demonstrates that this condition is not met, suggesting that intervention aimed at ‘debiasing’ would likely harm, rather than help, consumers.”
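The metric both studies argue about is simple to state: for a given query, does an engine put its own property in the top position while its rival shows none of that content? The sketch below illustrates that comparison under stated assumptions; the property lists, result data, and query sample are invented for illustration and have nothing to do with either study's actual crawl data or methodology details:

```typescript
// For each query, check whether an engine's first result points to one of its own
// properties while the rival engine's results do not. Domains and data are illustrative.
type Serp = { query: string; google: string[]; bing: string[] }; // ranked result URLs, simplified

const GOOGLE_PROPERTIES = ["maps.google.com", "youtube.com", "news.google.com"];
const MICROSOFT_PROPERTIES = ["bing.com/maps", "msn.com", "microsoft.com"];

function isOwn(result: string, properties: string[]): boolean {
  return properties.some((p) => result.includes(p));
}

// Fraction of queries where `engine` puts its own content first and the rival shows none of it.
function ownContentFirstRate(
  serps: Serp[],
  engine: "google" | "bing",
  properties: string[]
): number {
  const rival = engine === "google" ? "bing" : "google";
  const hits = serps.filter(
    (s) => isOwn(s[engine][0] ?? "", properties) && !s[rival].some((r) => isOwn(r, properties))
  );
  return serps.length ? hits.length / serps.length : 0;
}

// Tiny illustrative sample, not real query data.
const serps: Serp[] = [
  { query: "pizza near me", google: ["maps.google.com", "yelp.com"], bing: ["yelp.com", "bing.com/maps"] },
  { query: "weather", google: ["weather.com"], bing: ["msn.com", "weather.com"] },
];
console.log("Google own-content-first rate:", ownContentFirstRate(serps, "google", GOOGLE_PROPERTIES));
console.log("Bing own-content-first rate:", ownContentFirstRate(serps, "bing", MICROSOFT_PROPERTIES));
```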
Google adds a limit on free Google Maps API: over 25,000 daily and you’re charged; http://eicker.at/GoogleMapsCharge
Google: “When the Maps API Terms of Service were updated in April of this year we announced that usage limits would be introduced to the Maps API starting on October 1st. With October upon us, I’d like to provide an update on how these limits are being introduced, and the impact it will have on your Maps API sites. The usage limits that now apply to Maps API sites are documented in the Maps API FAQ. However no site exceeding these limits will stop working immediately. We understand that developers need time to evaluate their usage, determine if they are affected, and respond if necessary. There are three options available for sites that are exceeding the limits: Reduce your usage to below the limits, Opt-in to paying for your excess usage at the rates given in the FAQ, Purchase a Maps API Premier license – To assist in evaluating whether your site is exceeding the usage limits we will shortly be adding the Maps API to the Google APIs Console. Once available you will be able to track your usage in the APIs Console by providing an APIs Console key when you load the Maps API. … We understand that the introduction of these limits may be concerning. However with the continued growth in adoption of the Maps API we need to secure its long term future by ensuring that even when used by the highest volume for-profit sites, the service remains viable. By introducing these limits we are ensuring that Google can continue to offer the Maps API for free to the vast majority of developers for many years to come.”
Google: “What usage limits apply to the Maps API? Web sites and applications using each of the Maps API may at no cost generate: up to 25,000 map loads per day for each API, up to 2,500 map loads per day that have been modified using the Styled Maps feature…”
Google: “How much will excess map loads purchased online cost? Applications generating map load volumes below the usage limits can use the Maps API at no cost providing the application meets the requirements of the Google Maps API Terms of Service. Excess map loads over the usage limits are priced as follows [for 1,000 excess map loads]: JS Maps API v3: $4, JS Maps API v3 styled maps: $4/$8, Static Maps API: $4, Static Maps API styled maps: $4/$8, Street View Image API: $4, JS Maps API v2: $10 – Excess map loads will not be offered online for the Maps API for Flash. Sites using the Maps API for Flash and exceeding the usage limits should migrate to the JS Maps API v3, or purchase a Maps API Premier license.”
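Putting the free quota and the published rates together, the charge for a day of excess usage is straightforward to estimate. A minimal sketch using only the figures quoted above (25,000 free map loads per day, $4 per 1,000 excess JS Maps API v3 loads); actual billing terms may differ:

```typescript
// Estimate the daily charge for a Maps API site that exceeds the free quota.
// Rates and limits are the ones quoted above; real billing details may differ.
const FREE_LOADS_PER_DAY = 25_000;
const USD_PER_1000_EXCESS = 4; // JS Maps API v3, non-styled

function dailyCharge(mapLoads: number): number {
  const excess = Math.max(0, mapLoads - FREE_LOADS_PER_DAY);
  return (excess / 1000) * USD_PER_1000_EXCESS;
}

// A site serving 40,000 map loads a day would owe roughly
// (40,000 - 25,000) / 1,000 * $4 = $60 for that day.
console.log(dailyCharge(40_000)); // 60
console.log(dailyCharge(20_000)); // 0 (within the free quota)
```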
Guardian: “Nothing free lasts forever; and it’s damn hard to make money putting ads on maps. That seems to be the conclusion to draw from Google’s decision to put limits on its Google Maps API. … 25,000 isn’t that many calls. – Although Google won’t immediately be cutting off those whose applications exceed the call rate, it’s clear that the easy days are over. And of course it also raises the question of whether Google has found that it’s too hard to monetise maps, or that the API calls are bypassing the best ways it has of monetising them. … Obviously, Google, as a business, is free to charge as and how it wants. But it will be interesting to see if this changes how developers approach the use of the maps APIs.”
Wired: “Bad news, map hackers; the Google Maps free ride may be coming to an end. … The bad news is that once your app or website exceeds those limits you’ll be forking out $4 for every 1,000 people that hit your site (or view a map in your mobile app). Alternately, developers can cough up $10,000+ for a Google Maps API Premier licence, which, in addition to the unlimited access, offers more advanced geocoding tools, tech support, and control over any advertising shown. … In other words, Google appears to be interested mainly in collecting fees from sites with consistently heavy traffic rather than experiments that see a one-time traffic spike. It doesn’t protect against every potentially expensive use case, but it should make map mashup fans breathe a little easier. – Developers worried about the potential costs of the Google Maps API can always use OpenStreetMap, which is free and, in many parts of the world, much more detailed than Google Maps. Of course, OpenStreetMap lacks some Google Maps features, most notably an equivalent to Street View.”
AT: “Google’s approach to enforcement will likely not be very aggressive. According to the FAQ, sites that hit the rate limit and aren’t configured to pay overage fees will not immediately be cut off. This suggests that sites with an occasional traffic spike aren’t the intended target; Google is mainly looking to collect cash from sites with a consistently heavy load.”
PW: “Unfortunately, the price for styled maps could impact many more developers. Perhaps Google is charging for what it knows is a unique feature amongst its competitors. The feature is also likely extremely computation-intensive, which means it costs Google quite a bit more to provide that service.“
Google: “Understanding how the Maps API usage limits affect your sites – We recognise that sites may occasionally experience spikes in traffic that cause them to exceed the daily usage limits for a short period of time. For example, a media site that uses a map to illustrate a breaking news story, or a map-based data visualization that goes viral across social networks, may start to generate higher traffic volumes. In order to accommodate such bursts in popularity, we will only enforce the usage limits on sites that exceed them for 90 consecutive days. Once that criteria is met, the limits will be enforced on the site from that point onwards, and all subsequent excess usage will cause the site to incur charges. – Please be aware that Maps API applications developed by non-profit organisations, applications deemed by Google to be in the public interest, and applications based in countries where we do not support Google Checkout transactions or offer Maps API Premier are exempt from these usage limits. We will publish a process by which sites can apply for an exemption on the basis of the above criteria prior to enforcement of the limits commencing. Non-profit organizations are also encouraged to apply for a Google Earth Outreach grant, which provides all the additional benefits of a full Maps API Premier license. … To help you measure your site’s Maps API usage, we have now added the Maps API to the Google APIs Console.“
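The 90-consecutive-days rule described above reduces to a simple check over a daily usage log. A minimal sketch, assuming a plain array of daily load counts; Google's actual enforcement logic is not public, so this only mirrors the stated policy:

```typescript
// Returns true once a site has exceeded the daily limit for `windowDays`
// consecutive days, mirroring the stated "90 consecutive days" policy.
function limitsWouldBeEnforced(
  dailyLoads: number[], // one entry per calendar day, oldest first
  limit = 25_000,
  windowDays = 90
): boolean {
  let streak = 0;
  for (const loads of dailyLoads) {
    streak = loads > limit ? streak + 1 : 0; // a single compliant day resets the streak
    if (streak >= windowDays) return true;
  }
  return false;
}

// A one-off viral spike does not trigger enforcement...
console.log(limitsWouldBeEnforced([30_000, 50_000, 12_000, 9_000])); // false
// ...but 90 straight days over the limit would.
console.log(limitsWouldBeEnforced(Array(90).fill(26_000))); // true
```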
How do governments affect access to information? Google’s Transparency Report 2011; http://eicker.at/GoogleTransparencyReport
Google: “How do governments affect access to information on the Internet? To help shed some light on that very question, last year we launched an online, interactive Transparency Report. All too often, policy that affects how information flows on the Internet is created in the absence of empirical data. But by showing traffic patterns and disruptions to our services, and by sharing how many government requests for content removal and user data we receive from around the world, we hope to offer up some metrics to contribute to a public conversation about the laws that influence how people communicate online. – Today we’re updating the Government Requests tool with numbers for requests that we received from January to June 2011. For the first time, we’re not only disclosing the number of requests for user data, but we’re showing the number of users or accounts that are specified in those requests too. … We believe that providing this level of detail highlights the need to modernize laws like the Electronic Communications Privacy Act, which regulates government access to user information and was written 25 years ago – long before the average person had ever heard of email.”
Google: “Transparency is a core value at Google. As a company we feel it is our responsibility to ensure that we maximize transparency around the flow of information related to our tools and services. We believe that more information means more choice, more freedom and ultimately more power for the individual. – We’ve created Government Requests to show the number of government inquiries for information about users and requests to remove content from our services. We hope this step toward greater transparency will help in ongoing discussions about the appropriate scope and authority of government requests. – Our interactive Traffic graphs provide information about traffic to Google services around the world. Each graph shows historic traffic patterns for a geographic region and service. By illustrating outages, this tool visualizes disruptions in the free flow of information, whether it’s a government blocking information or a cable being cut. We hope this raw data will help facilitate studies about service outages and disruptions.”
GigaOM: “Any lingering fantasies of the web as a no-man’s land where content is free from the restraints of geographical boundaries probably should be put to rest. Google Tuesday morning released a treasure trove of data relating to content-takedown requests, and the numbers speak for themselves: requests are up worldwide and Google complies with the majority of them. … When it comes to requests for user data, all that Google and companies of its ilk really can do is ensure that requests are within the bounds of the law and notify users of requests for their data. But in the United States, at least, the laws regarding web-user data are still fairly lax and don’t require a search warrant in many instances. It’s yet another example of the web and the law not being anywhere near on the same page. – It’s easy to poke them for being too willing to bend to the wills of government officials and authorities, but web companies can’t flout the laws of the countries in which they want to operate, either. Otherwise, as separate Google data illustrates, the lights might go out on their services in those countries.”
RWW: “Google has updated its Government Requests tool with data from the first half of this year. For the first time, the report discloses the number of users or accounts specified, not just the number of requests. Google also made the raw data behind government requests available to the public. … Electronic communications have changed a bit since 1986. They form a ubiquitous, always-on fabric of our lives now. Fortunately, Google isn’t any happier with the status quo than privacy-aware users are. It’s among a number of major Web companies pushing for better laws. And Google and other data-mining companies take their roles in public policy seriously. Both Google and Facebook’s lobbying efforts broke records this year.”
TC: “Google Declines To Remove Police Brutality Videos, Still Complies With 63% Of Gov’t Takedown Requests – US Government requests for user data jumped, however: 5950 versus 4287 during the same period in 2010, asking for information on 11,057 users. 93% of these were complied with, ‘fully or partially.’ So while they’re making something of a stand on removing data, they don’t seem to have any trouble giving it out.”
Guardian: “Figures revealed for the first time show that the US demanded private information about more than 11,000 Google users between January and June this year, almost equal to the number of requests made by 25 other developed countries, including the UK and Russia. – Governments around the world requested private data about 25,440 people in the first half of this year, with 11,057 of those people in the US. – It is the first time Google has released details about how many of its users are targeted by authorities, as opposed to the number of requests made by countries.”
VB: “Notably, in the United States, Google refused to remove YouTube clips showing police brutality. In these cases in particular, we are seeing how relatively neutral platforms such as YouTube can have great social impact depending on the intentions of the person posting the content and the integrity of the content host in keeping that content online.”
Social networking on-the-go: U.S. mobile social media audience grows 37% in the past year; http://eicker.at/SocialMediaMobile
ComScore: “[R]eleased results of a study on mobile social media usage based on data from its comScore MobiLens service, which showed that 72.2 million Americans accessed social networking sites or blogs on their mobile device in August 2011, an increase of 37 percent in the past year. The study also provided new insights into how mobile users interact with social media, finding that more than half read a post from an organization, brand or event while on their mobile device. – ‘Social media is one of the most popular and fastest growing mobile activities, reaching nearly one third of all U.S. mobile users,’ said Mark Donovan, comScore senior vice president for mobile. ‘This behavior is even more prevalent among smartphone owners with three in five accessing social media each month, highlighting the importance of apps and the enhanced functionality of smartphones to social media usage on mobile devices.‘ … In August 2011, more than 72.2 million people accessed social networking sites or blogs on their mobile device, an increase of 37 percent from the previous year. Nearly 40 million U.S. mobile users, more than half of the mobile social media audience, access these sites almost every day, demonstrating the importance of this activity to people’s daily routines. … 70 Percent of Mobile Social Networkers Posted a Status Update While on Their Mobile Device”
RWW: “While the mobile browser accounted for more visits, research shows that the social networking app audience has grown five times faster in the past year. While the mobile browsing social networking audience has grown 24% to 42.3 million users, the mobile social networking app audience shot up 126% to 42.3 million users in the past year. … People are increasingly checking social networks more from their mobile devices. More than half (52.9%) read posts from organizations/brands/events. One of three mobile social networkers snagged a coupon/offer/deal, and twenty-seven percent clicked on an ad while visiting a social networking site.”
SEL: “In the US roughly 40 million mobile users access social networks (broadly defined to include blogs) on their handsets on a daily basis, according to comScore. The large number of mobile-social users comes as no surprise. Facebook previously announced it had 350 million active mobile users globally. – Google also sees mobile as a strategic front for social networking growth. The new version of Android (‘Ice Cream Sandwich’) prominently features Google+.”
AF: “The consultancy found that 70 percent of those using Facebook on mobile devices – including smartphones and tablets – posted a status update from the gizmo on the go. – Facebook earlier this year disclosed that total mobile users worldwide exceeds 350 million. The U.S. portion of this at the end of August surpassed 57.3 million, according to comScore MobiLens.”
ZDNet: “So far, there’s already some solid footing for mobile advertisers to get involved here. Mobile users accessing social networks were found to be more likely to interact with brands on those sites than not, and 52.9 percent said they read posts from organizations/brands/events. Additionally, one in three in this group said they received some kind of coupon/offer/deal, with one in four clicking on an ad while conducting mobile social networking.“
Google is making search more secure: starts encrypting search (and referrals!) via SSL; http://eicker.at/GoogleSSL
Google: “We’ve worked hard over the past few years to increase our services’ use of an encryption protocol called SSL, as well as encouraging the industry to adopt stronger security standards. For example, we made SSL the default setting in Gmail in January 2010 and introduced an encrypted search service located at https://encrypted.google.com four months later. Other prominent web companies have also added SSL support in recent months. – As search becomes an increasingly customized experience, we recognize the growing importance of protecting the personalized search results we deliver. As a result, we’re enhancing our default search experience for signed-in users. Over the next few weeks, many of you will find yourselves redirected to https://www.google.com [note the extra ‘s’] when you’re signed in to your Google Account. This change encrypts your search queries and Google’s results page. … [W]ebsites you visit from our organic search listings will still know that you came from Google, but won’t receive information about each individual query. They can also receive an aggregated list of the top 1,000 search queries that drove traffic to their site for each of the past 30 days through Google Webmaster Tools. … As we continue to add more support for SSL across our products and services, we hope to see similar action from other websites. That’s why our researchers publish information about SSL and provide advice to help facilitate broader use of the protocol.”
ATD: “Google said today it will soon use SSL encryption by default to improve security for signed-in search users, following SSL usage across the industry in Gmail, and on Twitter and Facebook. (You can see when a company is using SSL when a URL starts with ‘https.’) When SSL is used, Web site owners will get less information about what search terms visitors used to find them. Google said the move is a recognition of the increasingly customized and personalized nature of search.”
LM: “Now, if you were training at an SEO event like I was on the 17th and then was out of the office [and largely offline] on the 18th or if you live under a rock somewhere, you might not have heard Google’s official announcement that they will no longer be providing keyword data for organic search results if the user is signed into their Google account. – It’s not just Google Analytics that will be denied this data. … If you’re an SEO who uses the keywords report to prove the validity and efficacy of your work, you’re screaming and gnashing your teeth by this point. If you’re a casual analytics user, you may be asking the question ‘why do this?’ … You can still see every single keyword that sent traffic through paid search, whether the user is signed in or not – just not organic search. Are users who click on paid search results less safe than users that click on organic results? … So far, since this change launched, LunaMetrics has seen 1% of our keywords clumped into (Not Provided). A client with substantially larger organic search volume has already seen almost 2% of their organic keywords represented as Not Provided. We shall see how far-reaching these changes actually are in a few weeks when they’re rolled out completely.”
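The practical effect on site owners shows up in the referrer: the visit still arrives from google.com, but the query string that analytics packages parse for the keyword (the q parameter) is gone, so the keyword is reported as "(not provided)". A minimal sketch of that parsing logic, simplified; real analytics tools handle many more referrer formats and search engines:

```typescript
// Extract the organic search keyword from a referrer URL, broadly the way
// analytics tools do it: look for Google's `q` parameter. Simplified illustration.
function organicKeyword(referrer: string): string {
  try {
    const url = new URL(referrer);
    if (!url.hostname.includes("google.")) return "(not a Google referral)";
    const q = url.searchParams.get("q");
    // Signed-in users searching over SSL arrive without a query string,
    // so the keyword cannot be recovered and is reported as "(not provided)".
    return q && q.length > 0 ? q : "(not provided)";
  } catch {
    return "(invalid referrer)";
  }
}

// Signed-out, non-SSL search: the keyword is still visible to the site.
console.log(organicKeyword("http://www.google.com/search?q=flow+visualization"));
// Signed-in SSL search: Google strips the query before passing the visitor on.
console.log(organicKeyword("https://www.google.com/"));
```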
Google introduces Flow Visualization for Google Analytics: visitors flow and goal flow; http://eicker.at/GAFlowVisualization
Google: “[A]t Web 2.0 Summit [we] unveiled the release of ‘Flow Visualization’ in Google Analytics, a tool that allows you to analyze site insights graphically, and instantly understand how visitors flow across pages on your site. Starting this week, ‘Visitors Flow’ and ‘Goal Flow’ will be rolling out to all accounts. Other types of visualizers will be coming to Google Analytics in the coming few months, but in the meantime, here’s what you can expect from this initial release. … The Visitors Flow view provides a graphical representation of visitors’ flow through the site by traffic source (or any other dimensions) so you can see their journey, as well as where they dropped off. … Goal Flow provides a graphical representation for how visitors flow through your goal steps and where they dropped off. Because the goal steps are defined by the site owner, they should reflect the important steps and page groups of interest to the site. In this first iteration, we’re supporting only URL goals, but we’ll soon be adding events and possibly other goal types. … These two views are our first step in tackling flow visualization for visitors through a site, and we look forward to hearing your feedback as all users begin experiencing it in the coming weeks. We’re excited to bring useful and beautiful tools like these to help you understand your site, so stay tuned for more!”
SEL: “Path analysis has historically been a feature that provided little insight on user behavior, mainly because visitors behave in such non-linear ways that it is hard to learn something from their paths, even when looking at aggregated data. The best alternative to path analysis has been to analyze micro conversions, i.e. looking at each page and trying to learn if the page has fulfilled its objective. However, the visualizations below bring some interesting approaches that will be very helpful for web analysts. … As some might recognize, the visualization used on this feature is very similar to the one created by Charles Joseph Minard shown below. This image, created in 1869 to describe Napoleon’s disastrous Russian campaign of 1812, displays several variables in a single two-dimensional image…”
LM: “I need Red Bull. Seriously, I can’t keep up with all the new features and announcements coming from Google Analytics lately. In the last few months, they’ve released a new interface, real-time data, multi-channel funnels, Google Analytics Premium, Google Webmaster Tools integration, plot rows, site speed report, new mobile reports, social media tracking, and now Flow Visualization. You can read their official announcement, but ours is much more informative [and we have video!]. … Navigation Flow: provides a graphical representation of your start/end nodes, and the paths to or from your site that your visitors follow. When you create a navigation flow, you have the option to identify a single page by URL, or to create a node that represents a group of pages whose URLs match a condition, for example, all pages whose URL contains a particular product identifier like shirts or jackets. … Sometimes, things are best explained with video. This is one of those times, so sit back, relax, and enjoy this brief tour through this new feature.”
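The "node that represents a group of pages whose URLs match a condition" idea is essentially a mapping from page URLs to named buckets before counting transitions between them. A minimal sketch of that grouping step; the rules and page paths are made up for illustration, and this is in no way how Google Analytics is implemented internally:

```typescript
// Group page URLs into named flow nodes, the way a Visitors Flow view groups
// "all pages whose URL contains a particular product identifier".
// Rules and example paths are purely illustrative.
type NodeRule = { name: string; matches: (path: string) => boolean };

const rules: NodeRule[] = [
  { name: "Shirts", matches: (p) => p.includes("/shirts") },
  { name: "Jackets", matches: (p) => p.includes("/jackets") },
  { name: "Checkout", matches: (p) => p.startsWith("/checkout") },
];

function nodeFor(path: string): string {
  return rules.find((r) => r.matches(path))?.name ?? "(other pages)";
}

// Count transitions between nodes along one visitor's ordered page path.
function flowEdges(pagePath: string[]): Map<string, number> {
  const edges = new Map<string, number>();
  for (let i = 0; i + 1 < pagePath.length; i++) {
    const edge = `${nodeFor(pagePath[i])} -> ${nodeFor(pagePath[i + 1])}`;
    edges.set(edge, (edges.get(edge) ?? 0) + 1);
  }
  return edges;
}

console.log(flowEdges(["/shirts/blue-oxford", "/shirts/red-tee", "/checkout/cart"]));
// Map { 'Shirts -> Shirts' => 1, 'Shirts -> Checkout' => 1 }
```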
Chitika: Google Plus growth spurt short lived after it went public. – What’s its USP? http://eicker.at/GooglePlusLaunch
Chitika: “Mid-morning September 20th, Google+ officially entered public beta, drumming up interest in the site far and wide across the web. Although able to boast 25 million unique visitors after only four weeks of operation, Google’s newest attempt at a social network saw its user base dwindle as shown by a recent article from Chitika Insights. … Reportedly, Google+ saw a surge in traffic of over 1200% due to the additional publicity, but the increased user base was only temporary, as was projected in an earlier insights post. – The data shows that, on the day of its public debut, Google+ traffic skyrocketed to peak levels. But, soon after, traffic fell by over 60% as it returned to its normal, underwhelming state. It would appear that although high levels of publicity were able to draw new traffic to Google+, few of them saw reason to stay. … The supply of users for social media sites is limited. To survive you must stand out and provide a service that others do not. – Features unique to your site must be just that – unique and difficult to duplicate – if they are not, the competitive advantage quickly disappears.”
RWW: “We at RWW can informally corroborate Chitika’s findings that interest in Google Plus is on the wane. Our monthly referrals from there are down 38% since their peak, while Facebook referrals are up 67% and Twitter referrals up 51% over the same period. – As we reported last week, the +1 button isn’t gaining much traction, either. Despite all the new features and responsiveness to user feedback, Google Plus just doesn’t seem to be catching on. There’s only so much time in a day for social networking, and this newcomer isn’t converting many users.”
Inquirer: “Google’s problem is not getting users in the first place, it seems, but rather keeping them after they have arrived. For now it appears that a lot of users are merely curious about Google+, but return to the tried and tested format of Facebook when the lustre fades. … While the jury is still out on which firm will win this battle, there’s no denying that the intense competition could make both social networks considerably better than they were before.”
RWW: “Many people say they don’t find [Google Plus] compelling though. We asked on Twitter and on Facebook and most people said that the value proposition was too unclear, that it wasn’t valuable enough to warrant the investment of time relative to the already heavy burden of Twitter and Facebook engagement. Google knows it needs to make changes to the service to increase its user retention. But you know who else has always struggled with new user retention? Twitter!”
UG: “While this is interesting, Chitika doesn’t provide much information about its data-gathering technique. Because it is an ad network, one may suspect that it can see the referrer (Google+) to sites using its ad code. If that’s the case (and I’m not saying that it is), the method is not very accurate but one could argue that they should be able to pick up a (very) gross trend snapshot. The bottom line is that Google+ saw a traffic spike during its public opening and that it subsequently faded, and I can believe that. This sounds quite ‘normal’ to me, though. Secondly, second-hand data sampling on a 10-day period is hardly enough to tell if Google+ is a ‘failure to launch’ as Chitika puts it, so I think that there’s a bit of over-dramatization here. – It will take months (or years) and many evolutions before we realize how well (or not) Google+ does/did. In the meantime, and as long as we don’t know how this data was measured, I would advise taking this with a grain of salt.”
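Whatever the sampling quality, the underlying measurement is just counting referrals per day for a given source and watching the trend rise and fade. A minimal sketch of that counting step; the log record shape and sample rows are hypothetical and have nothing to do with Chitika's actual pipeline:

```typescript
// Count visits per day that were referred by a given host, to chart a spike and fade.
// The record shape and sample data are made-up examples.
interface Visit { date: string; referrerHost: string } // date as YYYY-MM-DD

function dailyReferrals(visits: Visit[], host: string): Map<string, number> {
  const perDay = new Map<string, number>();
  for (const v of visits) {
    if (v.referrerHost !== host) continue;
    perDay.set(v.date, (perDay.get(v.date) ?? 0) + 1);
  }
  return perDay;
}

const visits: Visit[] = [
  { date: "2011-09-19", referrerHost: "plus.google.com" },
  { date: "2011-09-20", referrerHost: "plus.google.com" },
  { date: "2011-09-20", referrerHost: "plus.google.com" },
  { date: "2011-09-21", referrerHost: "facebook.com" },
];
console.log(dailyReferrals(visits, "plus.google.com"));
// Map { '2011-09-19' => 1, '2011-09-20' => 2 }
```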
Twitter introduces Twitter Web Analytics: helps website owners analyse Twitter’s impact; http://eicker.at/TwitterWebAnalytics
Twitter: “Twitter Web Analytics, a tool that helps website owners understand how much traffic they receive from Twitter and the effectiveness of Twitter integrations on their sites. Twitter Web Analytics was driven by the acquisition of BackType, which we announced in July. – The product provides three key benefits: Understand how much your website content is being shared across the Twitter network – See the amount of traffic Twitter sends to your site – Measure the effectiveness of your Tweet Button integration … Twitter Web Analytics will be rolled out this week to a small pilot group of partners, and will be made available to all website owners within the next few weeks. We’re also committed to releasing a Twitter Web Analytics API for developers interested in incorporating Twitter data in their products.”
TC: “[A]t TechCrunch Disrupt, Twitter is debuting a brand new publisher analytics platform to help sites understand data around the Tweet button and sites using the t.co wrap. While the platform is still private, Twitter says it will be launched to the public soon. … Twitter is driving 100 million clicks per day to sites across the web, with 95 percent of links on Twitter wrapped in T.co. So clearly with both inbound and outbound traffic, Twitter is seeing massive traction for sites. … But while many third-party apps have tried to measure Twitter’s traffic for publishers, the best analytics always come from the source. This new product for publishers will decipher and make sense of all the inbound and outbound traffic from a publisher’s site via the Tweet button and from links. … Of course, many people use Google Analytics and other platforms for their social media analytics from Twitter, Facebook and others. Luckily, you’ll be able to incorporate these in-depth Twitter analytics from your platform of choice, as Twitter will be releasing an API for this analytics platform. – The best part – all of this will be free for publishers. A few select publishers are currently testing the platform as well.”
RWW: “In August, Twitter took a big step toward cleaning up its analytics data by turning on its t.co short link wrapper for all tweeted links longer than 19 characters. T.co is still not fully implemented yet, but when it is, content providers on any platform will finally be able to accurately measure their referrals from Twitter. Prior to t.co, publishers would see different referrers if the clicks came from Twitter.com, Twitter’s client apps, third-party apps or bounced off some link shortener first. – That’s a very long tail, making Twitter referrals hard to measure. As a result of the confusion, Twitter was often discounted and discredited as a traffic referral source. But now that all tweeted links will go through t.co first, all clicks on Twitter links will come from one referrer. In short, Web publishers are just beginning to realize Twitter’s full traffic potential. – With the launch of Twitter Web Analytics, publishers will now be able to accurately measure the impact of Twitter in both inbound and outbound directions. With over 100 million active users, a number that has grown by 105% just this year so far, publishers and Twitter users are about to find out for sure about the value of this service.“
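The reason t.co matters for measurement: before the wrapper, a click from Twitter could arrive with any of a dozen referrers (twitter.com, client apps, third-party shorteners) or none at all, so Twitter traffic was undercounted; afterwards, every tweeted link passes through one host. A minimal sketch of the classification problem this removes; the host lists are illustrative, not exhaustive, and not anyone's actual analytics code:

```typescript
// Classify a referrer host as Twitter traffic. Before t.co this required a
// growing list of hosts and shorteners; with t.co every tweeted link shares one referrer.
// The host lists here are illustrative assumptions, not exhaustive.
const PRE_TCO_TWITTER_HOSTS = ["twitter.com", "m.twitter.com", "hootsuite.com", "bit.ly"];

function isTwitterReferralPreTco(host: string): boolean {
  // Fragile: misses clients that send no referrer and miscounts generic shorteners
  // whose links may have been shared anywhere, not just on Twitter.
  return PRE_TCO_TWITTER_HOSTS.some((h) => host === h || host.endsWith("." + h));
}

function isTwitterReferralWithTco(host: string): boolean {
  // With the wrapper, one host covers clicks from the web, official apps, and most clients.
  return host === "t.co";
}

console.log(isTwitterReferralPreTco("bit.ly"));       // true, but could just as well be Facebook traffic
console.log(isTwitterReferralWithTco("t.co"));        // true
console.log(isTwitterReferralWithTco("twitter.com")); // false: direct twitter.com visits counted separately
```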
Shareaholic: “Welcome to Shareaholic’s Referral Traffic Report. According to our findings based on aggregated data from more than 200,000 publishers that reach more than 260 million unique visitors each month, Pinterest drives more referral traffic than Google Plus, LinkedIn and YouTube combined. … Pinterest grew from 2.5% of referral traffic in December to 3.6% of the referrals in January. That’s impressive growth from just owning 0.17% of the traffic back in July. … Referral traffic from Google+ dropped slightly in January, although Google’s product set (Google news, Google images, Gmail) continues to be a top referral source. Google continues to integrate Google+ into its offering more and more, so it will be an interesting trend to watch. … Eyeing its IPO this week, Facebook continues to dominate referral traffic, with mobile traffic alone accounting for 4.3% of overall referrals. Referral traffic grew by about 1% in January, making it the second fastest-growing site for referral traffic after Pinterest.”
GigaOM: “Not surprisingly, Facebook is holding steady at the top of Shareaholic’s survey, as it was responsible for more than a quarter of all referral traffic in January. Next in line was StumbleUpon, with 5.07 percent. It bears mention that while the Shareaholic survey is global, in the United States market alone StumbleUpon has in the past unseated Facebook as a top driver of referral traffic. – It’s exciting to see a relative newcomer growing so quickly in the web space. While the web’s more established companies are quite powerful these days, the fact that a startup like Pinterest has successfully established its own foothold shows that the competitive landscape is still alive and mainstream users are open to trying things from new players.”
Solis: “Many consumer brands are also experimenting with Pinterest, using pinboards to present complementary products, ideas, and imagery to inspire consumers to visualize and remix new possibilities. From fashion to interior design and home to retail to entertainment, brands are using Pinterest to thoughtfully assemble a curated lifestyle. And, they’re packaged for the social and mobile web and optimized for driving actions as part of Facebook’s new frictionless sharing ecosystem.”
RWW: “Among many Pinterest users, as well as several artists who have had work pinned on the site, a code for giving proper credit is developing. Artist Laura C. George said Pinterest has no way of knowing if links tied to images link back to the original artists’ Web site, but so far Pinterest users have been better about giving credit than Tumblr.”