Market Researchers – Are You Interested in a Job As a Survey Researcher?

Market research analysts are responsible for determining which products people want to buy and at what price they will buy them. They also gather data on the most effective means of marketing a particular product, and they analyze past sales in order to predict future sales.

Market research analysts frequently use the Internet, the telephone, mail, and personal interviews to obtain information about consumers. After compiling this information, a survey researcher presents it to a company in the form of charts and graphs, so that the company can use it to increase sales.

Survey researchers typically spend their days conducting surveys that help corporations make sound financial decisions, and their data-collection methods mirror those of the market research analyst.

Working conditions for these professionals typically involve strict deadlines, and overtime may be required. Survey researchers may also have to travel in order to conduct focus groups and face-to-face interviews. A bachelor's degree is usually sufficient to gain entry into the field, although positions are fairly competitive.

It is also helpful for those hoping to enter the field to gain internship experience working for companies and learning how to collect data. Most survey researchers are skilled at working with other people, both to conduct surveys and to identify the needs of potential customers.

In 2006, these professionals held over 250,000 jobs in the United States, with market research analysts holding the overwhelming majority of those positions. Company management is the most frequent employer of these analysts, while professional firms are the entities that usually hire survey researchers. Employment in the field is expected to grow rapidly over the next 10 years as companies compete more intensely to increase market share.

In 2006, the middle 50 percent of market research analysts earned between $42,190 and $84,070, while the middle 50 percent of survey researchers earned between $22,150 and $50,960.

Meet the New Software Analyst

As US equity markets closed out 2013 at new highs, equity research is facing significant change. With “price targets” being reset for many soaring social, cloud and big data analytics stocks, let’s meet the new software analyst. But first, a little background.

Equity research has marginally evolved with investment styles and trading strategies over the past couple of decades. The days of primary fundamental research, particularly on the sell-side, faded long ago. Most analysts don’t have the gumption or the time.

Shrinking commissions and heightened regulatory scrutiny yield lower returns on investment, continuing a cycle of reducing research resources. The sell-side analyst role now has three principal components: 1) to provide access to company managements in their existing coverage universe; 2) to provide coverage for companies that are underwriting clients; and, 3) to provide “hot data points” – particularly for handicapping quarterly results. Buy-siders compete for management access and seek to combine these data points with their own findings to feed trading decisions.

Unfortunately, individual data points that are legally obtained and disseminated rarely provide an adequate sample on which to base an investment decision, let alone a trading decision. For buy-siders, even aggregating data points from numerous analysts covering a particular sector or company does not produce a statistically relevant sample.

Limitations of today’s analytics

For example, let’s say a mid-sized publicly-traded technology company goes to market with a blend of 100 direct sales teams (one salesperson and one systems engineer per team) and 500 channel partners (mixed 75%/25% between resellers and systems integrators). Further, assume that these teams and partners are dispersed in proportion to the company’s 65%/35% sales mix between North America and international. How many salespeople and channel partners would an analyst have to survey to get an accurate picture of the company’s business in any given quarter?

If a typical sell-side analyst covers 15-20 companies (quintuple that for buy-side analysts), the multiplier effect of data points that an analyst would have to touch makes it humanly impossible to gather sufficient information. Moreover, with 50% of most tech company deals closing in the final month of a quarter, of which half often close in the final two weeks of that month, how much visibility can an analyst have?
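To put rough numbers on that, here is a minimal back-of-the-envelope sketch using Cochran's sample-size formula with a finite-population correction. The 95 percent confidence level and 5 percent margin of error are assumptions added for illustration; only the counts of teams and partners come from the example above.

```python
import math

def required_sample(population: int, z: float = 1.96,
                    margin: float = 0.05, p: float = 0.5) -> int:
    """Cochran's sample-size formula with a finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# From the example: 100 direct teams of 2 people plus 500 channel partners.
contacts_per_company = 100 * 2 + 500            # 700 potential sources
per_company = required_sample(contacts_per_company)
print(per_company)                              # roughly 250 contacts
print(per_company * 20)                         # ~5,000 across a 20-company list
```

Even under these generous assumptions, a single analyst would need on the order of 250 contacts per covered company, and several thousand across a typical coverage list, every quarter.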

Further, why would a company’s sales team talk to anyone from the investment community in the final weeks of a quarter when the only people they are interested in speaking with are customers who can sign a deal? Now consider that many companies throughout the supply chain have instituted strict policies in response to recent scandals to prevent any employee from having any contact with anyone from the investment community.

Even the best-resourced analysts lack the tools to correlate the data points they do gather into meaningful patterns for either an individual company or an entire sector. Finally, with shorter-term investing horizons and high-frequency trading dominating volume, how relevant are these data points anyway?

The big data approach to research

Stocks generally trade on either sector momentum or overall market momentum. Macro news and events are far more likely to drive a sector’s movement, and therefore the movement of stocks within that sector. This includes volatility around quarterly earnings – which can run 10%-30% for technology stocks – because “beats” and “misses” are frequently driven by macro factors. Excuses such as “sales execution,” “product transition” or “merger integration” issues are less common than conference calls would suggest. “Customers postponed purchases,” “customers down-sized deals,” “customers released budgets” or “a few large deals closed unexpectedly” are more likely explanations.

Now, major sell-side and buy-side institutions are trialing new software that leverages cloud infrastructure and big data analytics to model markets and stocks. Massive data sets can include macro news from anywhere in the world, such as economic variables, political events, and seasonal and cyclical factors. These can be blended with company-specific events, including earnings, financings or M&A activity. Newer data sources, including social media, GPS and spatial data, can also be layered into models. Users can input thousands of variables to build specific models for an entire market or an individual security.
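As an illustration of what such a model might look like in miniature, here is a hedged sketch that blends a few macro, company-specific and alternative-data features into a single learner. The feature names, the synthetic data and the choice of a gradient-boosted model are assumptions for the example, not a description of any vendor's platform.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

rng = np.random.default_rng(0)
n = 500  # synthetic weekly observations, purely illustrative

# Hypothetical feature matrix: macro variables blended with
# company-specific events and a newer signal such as social-media volume.
features = pd.DataFrame({
    "rate_surprise":         rng.normal(size=n),   # macro
    "pmi_change":            rng.normal(size=n),   # macro
    "earnings_surprise_pct": rng.normal(size=n),   # company-specific
    "social_mention_growth": rng.normal(size=n),   # alternative data
})
target = pd.Series(rng.normal(size=n))             # e.g. forward one-week return

model = GradientBoostingRegressor()
# Walk-forward splits respect the time ordering of market data.
scores = cross_val_score(model, features, target, cv=TimeSeriesSplit(n_splits=5))
print(scores.mean())
```

In practice the same pattern scales from a handful of columns to the thousands of variables described above; only the data plumbing and compute change, which is where the cloud infrastructure comes in.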

As with any predictive analytics model, the key is to ask the right questions. However, the machine learning capabilities of the software allow the system not only to answer queries but also to determine what questions to ask.

The advantages to both sell-side and buy-side firms are significant. They include:

  • Lower costs. Firms can avoid major technology investments by leveraging the scale and processing power of cloud-based infrastructure and analytics software. They can collect, correlate and analyze huge, complex data sets and build models in a fraction of the time, and at a fraction of the cost, required by in-house analysts.
  • Accuracy. Machine learning and advanced predictive analytics techniques are far more reliable and scalable than models built in Excel spreadsheets. Patterns can be detected to capture small nuances in markets and/or between securities that high-frequency trading platforms have been exploiting for years.
  • Competitiveness. The software can make both sell-side and buy-side firms more competitive with the largest, most technologically advanced hedge funds that have custom-built platforms to perform analytics on this scale in real time. In addition to enhancing performance, the software can be leveraged to improve client services by making select tools available to individual investors.

Analysts become data scientists

The analyst skill set must evolve. Analysts will still have to perform fundamental analysis to understand the markets they follow and each company’s management, strategy, products/services and distribution channels. And they will still have to judge whether a company can execute on these factors.

But to increase their value, analysts will have to do statistical modeling and use analytics tools to gain a deeper understanding of which drivers move markets, sectors or particular stocks. Data discovery and visualization tools will replace spreadsheets for identifying dependencies, patterns and trends, for valuation analysis, and for investment decision making. Analysts will also need a deeper understanding of client strategies and trading styles in order to tailor their “research” to individual clients.
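As a minimal example of the kind of dependency analysis described above, the sketch below computes a correlation matrix between a stock's returns and a few assumed drivers. The series are synthetic and the driver names are hypothetical; a data-discovery tool would render the same matrix as an interactive heatmap.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 250  # roughly one year of daily observations, synthetic for illustration

# Hypothetical daily returns for a stock, its sector index and two macro drivers.
returns = pd.DataFrame({
    "stock":        rng.normal(size=n),
    "sector_etf":   rng.normal(size=n),
    "oil_price":    rng.normal(size=n),
    "ten_yr_yield": rng.normal(size=n),
})

# The correlation matrix is the simplest "dependency map" that a data-discovery
# tool would display graphically rather than as an Excel grid.
print(returns.corr().round(2))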

These technologies may well continue to shrink the ranks of analysts because of their inherent advantages. But those analysts who can master these techniques to complement their traditional roles may not only survive, but lift their value – at least until the playing field levels – because of their new alpha-generating capabilities.

Sprott Analyst Has Zero Doubt on Higher Natural Gas Prices

Introduction: We talked with Sprott Asset Management Research Analyst Eric Nuttall about the natural gas situation in Canada and the fate of many coalbed methane (CBM) gas producers and developers. Since our last conversation, spot natural gas prices have dropped by 15 percent. Natural gas storage levels are about 2.5 trillion cubic feet, some 423 billion cubic feet higher than a year ago.

Eric Nuttall told us, “Nearly all small-cap natural gas producers have taken it in the teeth this year. The price decreases in their stocks have been absolutely brutal. There are now companies whose stocks are down 40 percent year-to-date yet are still growing production strongly on an adjusted share basis.” How will the CBM and natural gas sector pan out through the end of this year? He believes the gas storage surplus will correct itself.

StockInterview: How are the lower natural gas prices impacting Coalbed Methane producers?

Eric Nuttall: For many CBM or shallow gas producers, this means their current drilling program is likely uneconomic, suggesting deferrals in drilling programs until natural gas prices strengthen. It is this very supply response that we need to balance storage levels, so it should not come as a complete surprise.

StockInterview: What, then, should investors do while storage levels are rebalancing?

Eric Nuttall: I would view this period as an opportunity for medium to long-term minded individuals to start building positions in not just unconventional gas producers, but conventional ones as well. The long-term fundamentals are still extremely bullish for natural gas. Many quality names are down 20 to 40 percent year-to-date.

StockInterview: How do you view the long-term fundamentals for gas?

Eric Nuttall: North American natural gas production has been in decline for several years. Most incremental production is coming from smaller, more expensive-to-drill, economically thinner, higher-decline pools and reservoirs. Over the past five years, first-year decline rates on natural gas wells have doubled to 50 percent. The base decline rate has also doubled, to approximately 25 to 30 percent. Pool size has also decreased materially over that time frame. The Western Canadian Sedimentary Basin and much of the US producing basins are mature. Consequently, higher and higher natural gas prices are required to give producers the incentive to drill increasingly marginal wells.
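For readers who want to see the arithmetic behind those decline figures, here is a minimal sketch of the drilling treadmill they imply. The starting production level and the simple linear first-year averaging are assumptions added for illustration; only the quoted decline percentages come from the interview.

```python
# Treadmill arithmetic implied by the decline rates quoted above.
# Assumed: base production of 100 units; a new well delivers, on average,
# about (1 - first_year_decline / 2) of its initial rate over its first year.
base_production = 100.0
base_decline = 0.30        # ~25-30% base decline quoted in the interview
first_year_decline = 0.50  # ~50% first-year decline quoted in the interview

lost_each_year = base_production * base_decline
avg_first_year_output = 1 - first_year_decline / 2
new_initial_rate_needed = lost_each_year / avg_first_year_output

print(f"New initial capacity needed each year just to hold production flat: "
      f"{new_initial_rate_needed:.0f} units "
      f"({new_initial_rate_needed / base_production:.0%} of the base)")
```

Under these assumed numbers, producers would have to bring on new capacity equal to roughly 40 percent of the existing base every year simply to keep output flat, which illustrates why rising decline rates demand ever more drilling, and ever higher prices to justify it.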

StockInterview: And you expect a continuation of declining natural gas production? And is that your premise for higher natural gas pricing?

Eric Nuttall: Conventional gas production has been in decline for many years, and the growth areas have largely been unconventional, such as the Piceance Basin (tight gas), the Barnett Shale (shale gas), and the Jonah Field (tight, deep gas). Also, many of the growth assets, such as the Barnett Shale, are already a few years into development, and because the wells have such a steep decline rate in the first few years, it is only adding to the depleting base that we have to make up. It is unlikely that over the next three years, the increase in unconventional gas can offset the decline in conventional, because the depleting base is so much larger. The major natural gas basins in North America are mature. Decline rates are increasing. Pool size is decreasing. Rig count is increasing yet production is at best flat. Until LNG imports increase in a material way, which is not expected for at least four or five more years, I think the case for healthy natural gas prices is intact.

StockInterview: Earlier, you noted drilling was more expensive.

Eric Nuttall: Over the past year, onshore drilling costs are up over 15 percent while operating costs are up over 10 percent. A recent Wall Street Journal article commented on how rig rates for the Gulf of Mexico, on very deep drilling platforms, are as high as $520,000 per day, up from $185,000 a few years ago. And the drilling platforms are still leaving the Gulf of Mexico! Although many are leaving the Gulf of Mexico for more prospective areas such as the West African coast, the current rig situation in the Gulf is still somewhat tight. We have only begun to see signs of moderating rig rate pricing.

StockInterview: How would bad weather, such as a hurricane, impact natural gas prices?

Eric Nuttall: Short term, you would see both natural gas and the related stocks surge if a hurricane strikes the producing area of the Gulf – and we almost need one to correct the surplus supply situation. Initially, you’ll have an emotional upward response. Only after assessing the status of production platforms and sub-sea infrastructure would we know the longer-term impact.

StockInterview: Should investors be watching the Weather Channel and ready to phone their stockbrokers?

Eric Nuttall: Timing on any natural gas investment right now is tricky. You need to have a medium- to longer-term focus. We probably have another two months of volatility. There are two camps right now on natural gas. One camp is saying that, due to bloated storage levels, companies are going to increasingly lay down their drilling rigs, cut production guidance, and stress their balance sheets. Then in the fall, when companies set their 2007 budgets, they will be using low gas prices and presenting moderating production growth profiles to their investors.

StockInterview: What does the other camp say?

Eric Nuttall: Another camp says that the current natural gas strip already discounts the present and forecasted storage levels. Also, stocks are cheap on price-to-cash-flow and price-to-net-asset-value ratios, and now is the time to load up on them. I lean towards this viewpoint. But I also admit that until the fall, barring a severe hurricane, the stocks are likely to trade sideways, as opposed to in any clear direction.

StockInterview: One equities strategist, whom we interviewed, suggested some time in August we might start to see the natural gas stocks moving higher.

Eric Nuttall: There is the potential that we might endure another month or two of flat trading in small-cap natural gas stocks. By the end of August, it is likely that we will have had both a supply and a demand response – worries of massive laying down of rigs, forced well shut-ins, and overleveraged balance sheets should have subsided. Investors will begin to focus on the natural gas strip, which is currently around $9.00 for the upcoming winter and $8.00 for next summer, rather than spot prices.

StockInterview: And until then?

Eric Nuttall: Until that time comes, I think it likely that, as a group, the large caps will outperform. They are more weighted towards oil, and have recently been catching a bid on the heels of Anadarko’s huge $22 billion all-cash takeover of Western Gas and Kerr-McGee. Importantly for unconventional gas investors, Anadarko paid around $2.00 per Mcf of 3P (proved plus probable plus possible) reserves, which is very healthy (Western Gas was predominantly tight gas in Wyoming and coalbed methane in the Powder River Basin). It speaks to Anadarko’s view of strong long-term natural gas fundamentals. These all-cash transactions likely set the bottom in the large caps.

StockInterview: What do you see for the near-term?

Eric Nuttall: Many people have been hoping that warm weather or hurricanes would help work off the excess supply, but Mother Nature hasn’t been terribly helpful so far this summer. It appears that we will exit the natural gas injection season at least 10% above last year. Barring any incredible heat waves or significant hurricanes, natural gas prices are likely to remain below $6.50 until the fall, and natural gas stocks are likely to be very volatile without clear direction over the summer. I wouldn’t expect clear direction in the stocks until the fall, probably September or October, when people begin to focus not on natural gas spot prices but on the strip pricing for the winter, which is still over C$10. In the meantime, the market is providing opportunities to buy companies with high-quality management at below-average multiples, commonly measured on a price-to-cash-flow metric.

StockInterview: Have you given up on the CBM sector or is it coming back?

Eric Nuttall: There is zero doubt in my mind that natural gas is an excellent long-term investment. We’ve peaked in our ability to increase production meaningfully, just as we have with light oil. I think for there to be an increase in long-term natural gas supply, you have to provide incentive to producers to go drill wells that increasingly have lower economic rates of return. And to do that, you need higher natural gas prices. One of the few remaining growth prospects in Canada for natural gas production is coalbed methane. At current gas prices, the economics are very challenging. So to get a supply response from coalbed methane producers, you again need higher gas prices. The current surplus in gas storage will correct itself, and investors should position themselves ahead of natural gas stocks reacting to this inevitability.

COPYRIGHT © 2007 by StockInterview, Inc. ALL RIGHTS RESERVED.

Eye-Tracking For Marketing Research

Ever watched a TV commercial and not known what it was advertising? Sometimes we can see the same advertisement day after day and even become familiar with its narrative content. Yet when asked what the advertisement is trying to sell, we are at a loss. The question is: why is the commercial failing so badly?

One way to answer this question is to run a marketing research study and simply ask respondents why they didn’t or couldn’t engage with the branding message in the advertisement. This might provide an answer. However, research has shown that visual attention is complex and involves both conscious and unconscious impulses. Because visual attention often depends upon unconscious impulses, respondents may not really understand their own visual behaviour. This can lead respondents to give rationalizations for their patterns of visual attention that are, in fact, quite wrong. This is a serious problem as, in marketing research, a wrong answer is often much worse than no answer at all.

You may well have heard of eye-tracking for marketing research. When used in a marketing research study, eye-tracking can give important insights into viewers’ engagement with marketing material through visual behaviour analysis. At a very basic level, visual behaviour analysis allows the marketing researcher to see through the eyes of the customer and to determine the customer’s focus of attention at any given point in time. The hope is that by conducting visual behaviour analysis, we can spot potential problems with the marketing material before the campaign is launched.

What can visual behaviour analysis tell us that we don’t already know? Marketing professionals rely upon marketing research to garner insights into customer opinions and behaviour. This data is often interpreted with the aid of empathic skills, intuition and experience. However, eye-tracking gives more direct access to the viewer’s thought processes through visual behaviour analysis. This is important as eye-tracking is not merely about viewers’ eye-gaze patterns: visual behaviour analysis helps us understand what the viewer is thinking. When we watch a viewer’s eye-gaze pattern over an advertisement, we gain an understanding of the viewer’s thought processes. What are they looking at, and why? Are they paying attention to the key branding visuals? What is the link between attention to branding visuals and the viewer’s ability to recall branding information at a later date? Do viewers read textual information? If so, how much of the text do they read?

These are just some of the generic insights offered by visual behaviour analysis. However, when we combine visual behaviour data with contextual information relating to the advertisement, the respondents’ demographic data and the respondents’ self-reported data, it is possible to build up a rich picture of the viewers’ overall engagement with the advertisement in terms of both behaviour and underlying opinions. This data helps us to better understand the viewer. It helps us determine what marketing messages work for viewers and what marketing messages leave them cold. As part of a multi-modal marketing research study, eye-tracking allows us to determine if the viewers ‘get’ our marketing message. If the viewer does ‘get it’, eye-tracking studies will tell us why and if the viewer doesn’t ‘get it’, the visual behaviour analysis will give us the data we need to determine why the advertisement has failed.

Eye-tracking involves three important steps. These are:

The study – for the results of the eye-tracking study to be valid, the study itself must be performed using a rigorous research methodology. What this means is that the study should be performed in a scientific manner. This is often a point of confusion as some people claim that eye-tracking is not a science but rather qualitative and subjective. This is both true and false. It is true that eye-tracking data can be analysed in a qualitative way. The analysts can draw subjective inferences from the eye-tracking data. However, the validity of these inferences depends upon the validity of the data upon which they are founded. In order for the data to be valid, it must be collected in a scientific fashion. Failure to do so will not only lead to validity problems with the data but will seriously undermine the validity of any inferences drawn from the data.

The analysis – at its most basic level, eye-tracking data reduces to a series of ‘point of regard’ co-ordinates. For screen-based test media, this can be a data file containing time-stamped screen co-ordinates of the test subject’s eye-gaze. This data needs to be analysed to gain useful insights from the study. What can be done? There are many useful eye-tracking metrics. For instance, it is possible to track every glance test subjects make at the product as and when it appears on the screen. To do this, the product visuals are tracked within the advertisement and intersected with the test subjects’ point-of-regard co-ordinates (a minimal sketch of this intersection follows these three steps). This allows the analyst to quantify the test subjects’ focus of attention on the product and to monitor their level of attention over time. Basically, if a metric involves viewers’ focus of attention on media visuals, it can be used.

The interpretation – provided the eye-tracking data has been collected in a valid way and processed so as to produce useful information, the eye-tracking analyst will provide you with a rigorous set of data and metrics relating to the viewer’s engagement with the advertisement and highlight potential problem areas. The eye-tracking data will be complemented with test subjects’ self-reported data. Respondents will be questioned about problem areas within the media and their overall level of recall of branding information will be assessed. Where retention of key marketing messages is wanting, the analyst will review the respondents’ eye-tracking data to try to discover what went wrong.
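As mentioned in the analysis step, the core computation is intersecting time-stamped gaze co-ordinates with the tracked position of the product visual. The following is a minimal sketch of that intersection; the co-ordinates, the 50 Hz sampling rate and the bounding-box positions are invented for illustration.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class AOI:
    """Axis-aligned bounding box for the product visual in a given frame."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

# Hypothetical data: (timestamp_ms, x, y) gaze samples and, per timestamp,
# the tracked bounding box of the product visual in the advertisement.
gaze_samples: List[Tuple[int, float, float]] = [
    (0, 400, 300), (20, 410, 305), (40, 900, 120), (60, 415, 310),
]
product_aoi_by_time: Dict[int, AOI] = {
    0: AOI(380, 280, 80, 60), 20: AOI(382, 281, 80, 60),
    40: AOI(384, 282, 80, 60), 60: AOI(386, 283, 80, 60),
}

SAMPLE_INTERVAL_MS = 20  # 50 Hz tracker assumed
dwell_ms = sum(
    SAMPLE_INTERVAL_MS
    for t, x, y in gaze_samples
    if product_aoi_by_time[t].contains(x, y)
)
print(f"Dwell time on product visual: {dwell_ms} ms")
```

A real study would aggregate this dwell time across respondents and frames, and combine it with metrics such as fixation counts and time-to-first-fixation to build the picture of attention the analyst reports.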

Consider the benefits of running eye-tracking studies against prospective marketing campaigns before they are launched. The visual behaviour analysis could identify problems with a campaign that can be corrected before the campaign begins. This has the potential to make campaigns more effective and to help you avoid the situation where viewers watch your advertisement with little idea of what you are trying to sell.