Marketing Tech: How AI Applies to Small Data

September 6, 2017 - Marketing, Technology
By: Patrick O'Fallon

As the world adopts more AI, phrases like deep learning, machine learning, neural networks, and Big Data have become common in general discourse. Predictive algorithms have proven their value on very large data sets and complex problems, and they continue to improve. As a result, society is thinking bigger, envisioning a future of self-driving cars, automated daily tasks and processes, and the discovery of valuable knowledge hidden deep within vast amounts of data.

While tackling big problems and challenges with AI is extremely important, not all problems are created equal. There is also a need to operate on small data sets to accomplish specific results. Even when data is scarce, AI can still be a viable option for these use cases. The science behind this is already proving effective in fields like content marketing, where it can improve the impact of content in niche and specialty markets.

The value of Small Data

Small Data is often described as actionable, precise, or informative data. Therein lies its value: Small Data reveals insights embedded in the micro that render a more precise picture of a specific case while also adding value to the macro. Researchers at the Cornell Small Data Lab, under the direction of Deborah Estrin, work toward this vision by building apps and platforms that address Small Data, or data traces, in healthcare, shopping, and behavior analysis. In one healthcare example, the Cornell researchers developed apps that passively collect data from patients who suffer from rheumatoid arthritis, aiming to predict typically unpredictable exacerbations and flare-ups from small deviations in the patients’ condition. These researchers also leverage digital traces to analyze deviations, find patterns, and develop frameworks that address these smaller use cases.

Within the realm of marketing, Martin Lindstrom describes Small Data as “seemingly insignificant behavioral observations containing very specific attributes pointing towards an unmet customer need.” In his book Small Data, Lindstrom argues that marketers can mine these observations for significant insights into customer behavior and personality profiles. He contends that businesses underestimate Small Data because their market research focuses on large data sets rather than on closely observed smaller samples. This flavor of Small Data analysis is akin to deriving more precise psychographic data about an individual or group, something modern businesses and marketers covet.

As new as these concepts sound, military intelligence has long held Small Data, also known as “human intelligence” or “on-the-ground intelligence,” to be critical for operations across the globe. Even as automated tasks, drones, and massive data collection disrupt the field, many still believe that human intelligence at this smaller, more precise scale carries significant weight in modern intelligence work.

Systems and algorithms fed by data

So where does AI fit into Small Data problem sets? Traditional models and modern Big Data algorithms are nurtured and fed by large data sets. Generally speaking, the more good-quality data an AI algorithm is fed, the better the results. Andrew Ng compares deep learning to a rocket ship in which the AI model is the rocket engine and vast amounts of data are the fuel that feeds the system. When presented with less-than-sufficient data, modern systems are unlikely to reach the intended results.

This is a problem for Small Data, which is inherently insufficient to fully train a model or predictive system. As problem sets grow more complex, AI models grow larger, and so does the amount of data required to train them. To cope, data scientists have developed methodologies, both old and new, that help apply advanced algorithms to Small Data problems, such as:

  • Leverage Preprocessing or Existing Knowledge
    • The system can leverage preprocessing, or knowledge from a different problem in a similar domain space, to optimize its results (a minimal sketch follows this item).

Content Marketing Application: Use engagement data within content to better understand the interests, needs, and format required to engage a market segment.
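One concrete way to fold existing knowledge into scarce engagement data is simple empirical-Bayes shrinkage: statistics from a broad content portfolio act as a prior that steadies the noisy rates of a niche segment. The Python sketch below is purely illustrative; the portfolio prior, post names, and counts are assumptions, not real campaign data.

```python
# A minimal sketch, assuming engagement statistics from a large, general
# content portfolio can serve as existing knowledge (a Beta prior) that
# stabilizes estimates for a niche segment with very few impressions.
# Post names and numbers are invented for illustration.

# Prior learned from the broader domain: a ~4% engagement rate, weighted
# as if it were worth about 200 observations.
prior_rate = 0.04
prior_strength = 200
alpha0 = prior_rate * prior_strength          # prior "engagements"
beta0 = (1 - prior_rate) * prior_strength     # prior "non-engagements"

# Small Data from the niche segment: a handful of posts, few impressions.
niche_posts = {
    "devops-checklist": (9, 120),   # (engagements, impressions)
    "secops-primer":    (1, 35),
    "itops-trends":     (4, 60),
}

for title, (hits, views) in niche_posts.items():
    raw = hits / views                                   # noisy on tiny samples
    shrunk = (alpha0 + hits) / (alpha0 + beta0 + views)  # posterior mean
    print(f"{title}: raw={raw:.3f} prior-informed={shrunk:.3f}")
```

The design choice here is the prior_strength knob: the smaller the niche sample, the more the estimate leans on what was already learned from the broader domain.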

  • Probabilistic “Eigen” Basis Set
    • A concept that has been around for decades in computer vision and facial recognition uses probabilistic models to derive eigenvectors. These eigenvectors form a basis set that can represent a much larger collection of training images. The same basis-set thinking remains relevant when creating new Small Data algorithms (sketched after this item).

Content Marketing Application: Predict behavior in niche markets where there is too little data for Big Data training, surfacing the keywords and topics required to increase Share of Voice and Share of Conversation.
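To make the basis-set idea concrete, here is a minimal numpy sketch in the spirit of eigenfaces, applied to invented keyword-count vectors instead of images. The sample size, keyword count, and number of components are illustrative assumptions, not a production recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Small Data: 12 posts described by counts of 8 tracked keywords.
X = rng.poisson(2.0, size=(12, 8)).astype(float)

# Center the data and form its covariance matrix.
mean = X.mean(axis=0)
centered = X - mean
cov = centered.T @ centered / (len(X) - 1)

# Eigen-decomposition; eigh returns eigenvalues in ascending order,
# so reverse the columns and keep the top 3 eigenvectors as the basis.
eigvals, eigvecs = np.linalg.eigh(cov)
basis = eigvecs[:, ::-1][:, :3]

# Any new post is now summarized by 3 coefficients instead of 8 raw
# counts, and can be approximately reconstructed from the basis.
new_post = rng.poisson(2.0, size=8).astype(float)
coeffs = basis.T @ (new_post - mean)
reconstruction = mean + basis @ coeffs
print("coefficients:", np.round(coeffs, 2))
print("reconstruction error:", float(np.linalg.norm(new_post - reconstruction)))
```

The payoff for Small Data is that any downstream model only has to learn over a few well-behaved coefficients rather than the full raw feature space.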

  • Transfer Networks
    • Transfer Networks take the outputs of a larger AI model as inputs to a smaller model built to address the Small Data problem set (sketched after this item). NanoNets is a company attempting to capitalize on this approach by leveraging pre-training and transfer networks.

Content Marketing Application: Automate tagging and hashtagging of your content so it reaches your target audience.  
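Here is a minimal sketch of that hand-off, with a hypothetical embed() stub standing in for whatever large pre-trained model supplies the features; the posts, labels, and random "embeddings" are invented, so the point is the wiring, not the accuracy.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def embed(text: str) -> np.ndarray:
    """Hypothetical stand-in for a large pre-trained model's embedding.
    Deterministic per text within a run, but carries no real signal."""
    seed = abs(hash(text)) % (2**32)
    return np.random.default_rng(seed).normal(size=64)

# Small Data: a few posts hand-labeled with the audience tag that worked.
labeled = [
    ("Kubernetes cost tuning for small teams", "devops"),
    ("Zero-trust basics for busy admins", "secops"),
    ("CI pipelines that fail fast", "devops"),
    ("Phishing drills that actually work", "secops"),
]
X = np.array([embed(text) for text, _ in labeled])
y = [label for _, label in labeled]

# The small, niche-specific model trained on top of the big model's outputs.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([embed("Hardening your build servers")]))
```

In practice the large embedding model stays frozen, so the only parameters the niche problem has to estimate are those of the small classifier, which is exactly what makes the approach viable on a handful of examples.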

In short, data scientists are combining old and new ideas to better approach Small Data. Across these techniques, pre-existing, pre-trained, or preprocessed knowledge appears to be critical to successfully applying AI to Small Data.

The takeaway

While the world is thinking “big,” it seems almost counterintuitive to focus on “small,” especially when it comes to advanced algorithms and AI. Yet there is clear value in the patterns of the microcosm, both on their own merit and in how they relate to macro-level analysis. Given that value, we still need smarter systems that can extract it across different domains, and at a quicker pace.

As Small Data analysis grows in popularity, AI models are becoming more relevant to its success and can drive big value in a smaller, more precise problem space. Fixate IO is leveraging Small Data technologies to advance the way marketers approach content marketing in niche and specialized markets.



Patrick O’Fallon is a Principal at Axiom Group in Denver, Colorado. Serving both public and private organizations, Patrick acts as an outsourced CIO, providing strategic consulting with specialized insight into the ever-changing SecOps landscape. A graduate of Regis University with a degree in Computer Science, Patrick has a wide breadth of knowledge to support BiModal ITOps organizations, combining DevOps and SecOps expertise with over 15 years of ITOps experience. Patrick has consulted on Artificial Intelligence and Advanced Algorithms with various public and private organizations.