Having just finished teaching a couple of introductory data science classes this past academic quarter, I realized how hard it is for newcomers to data science to get started on a project of reasonable complexity. A number of my students grew frustrated trying to establish a framework (or “blueprint”) for building the machine learning application required for their class project. A new title from Packt Publishing, “AI Blueprints” by Dr. Joshua Eckroth, helps solve this problem by laying out six real-life business scenarios in which AI addresses critical challenges using state-of-the-art software libraries and a well-thought-out workflow. For each scenario, Eckroth walks through the following steps:
- Characterize the problem
- Develop a method
- Design a deployment strategy
- Design and implement a continuous evaluation
Here is a series of abstracts describing each of the project blueprints offered in the book:
A Blueprint for Planning Cloud Infrastructure (Chapter 2) – This framework should delight any data-driven start-up motivated to optimize the cost of its compute resources. The project shows how to use a “constraint solver” to determine how many cloud machines are needed to complete a set of tasks in the least time while staying within a monetary budget. I love the example deployment: image processing for the Pan-STARRS1 astronomy data archive on Amazon Elastic Compute Cloud (EC2). The constraint solver is Red Hat’s OptaPlanner, and all of the code is in Java.
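The book’s implementation uses OptaPlanner in Java; just to illustrate the underlying idea (this is not the book’s code), here is a minimal Python sketch that brute-forces how many machines to rent so a batch of tasks finishes as quickly as possible without blowing the budget. The task sizes, hourly rate, and budget are made-up values.

```python
import math

# Hypothetical inputs: per-task compute hours, hourly price of one cloud
# instance, and the total budget we are allowed to spend.
task_hours = [4.0, 2.5, 6.0, 1.5, 3.0, 5.5]   # made-up workload
hourly_rate = 0.45                             # made-up $/hour per machine
budget = 40.0                                  # made-up budget in dollars

best = None
for machines in range(1, len(task_hours) + 1):
    # Simplifying assumption: work divides evenly across machines, so the
    # makespan is total hours / machines (a real solver schedules tasks).
    makespan = sum(task_hours) / machines
    # Each machine is billed for the whole makespan, rounded up to full hours.
    cost = machines * math.ceil(makespan) * hourly_rate
    if cost <= budget and (best is None or makespan < best[1]):
        best = (machines, makespan, cost)

if best:
    machines, makespan, cost = best
    print(f"rent {machines} machines: ~{makespan:.1f} h, ${cost:.2f}")
else:
    print("no configuration fits the budget")
```

A real constraint solver such as OptaPlanner explores assignments of individual tasks to machines rather than assuming an even split, but the objective (minimize time subject to a cost constraint) is the same.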
A Blueprint for Making Sense of Feedback (Chapter 3) – This framework pairs a Java back end with a Python front end. The goal is to gather tweets and comments from the Twitter and Reddit APIs, along with news articles from News API, and then estimate the sentiment of that “feedback” as positive, negative, or neutral. The code uses the CoreNLP library, whose sentiment analysis is based on a technique known as recursive neural tensor networks. For the front end, the project uses the Dash Python library to build dashboards, with plotly.js drawing the plots.
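The sentiment scoring itself runs in the Java/CoreNLP back end, but the dashboard side can be sketched in a few lines of Dash. The sentiment counts below are placeholder numbers standing in for whatever the back end would produce.

```python
# Minimal Dash sketch of a sentiment dashboard; the counts are placeholders
# standing in for output from the CoreNLP back end described in the chapter.
import plotly.express as px
from dash import Dash, dcc, html

# Hypothetical aggregated results: number of comments per sentiment label.
sentiment_counts = {"negative": 42, "neutral": 135, "positive": 88}

fig = px.bar(
    x=list(sentiment_counts.keys()),
    y=list(sentiment_counts.values()),
    labels={"x": "sentiment", "y": "number of comments"},
    title="Feedback sentiment (placeholder data)",
)

app = Dash(__name__)
app.layout = html.Div([html.H2("Making Sense of Feedback"), dcc.Graph(figure=fig)])

if __name__ == "__main__":
    app.run(debug=True)  # use app.run_server(debug=True) on older Dash versions
```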
A Blueprint for Recommending Products and Services (Chapter 4) – Here we have a Python-based framework built on two libraries: implicit for building recommendation systems and faiss, from Facebook AI Research, for efficient nearest-neighbor search. Also included is a technique for online evaluation of the recommendation system’s accuracy.
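To give a feel for how the two libraries fit together, here is a rough sketch (not the book’s code) that trains an ALS model on random implicit-feedback data and then indexes the learned item vectors with faiss for “similar item” lookups. It assumes implicit >= 0.5, where fit takes a user-by-item CSR matrix, and the faiss-cpu package.

```python
# Sketch only: train an implicit-feedback ALS model, then index its item
# vectors with faiss for fast nearest-neighbor "similar item" lookups.
import numpy as np
import scipy.sparse as sp
import faiss
from implicit.als import AlternatingLeastSquares

# Made-up interaction matrix: 100 users x 50 items with sparse random "counts".
user_items = sp.random(100, 50, density=0.05, format="csr", random_state=0)

model = AlternatingLeastSquares(factors=32, iterations=10)
model.fit(user_items)

# Index the learned item factors; inner product matches ALS scoring.
item_vectors = np.asarray(model.item_factors, dtype="float32")
index = faiss.IndexFlatIP(item_vectors.shape[1])
index.add(item_vectors)

# Items most similar to item 0 (the first hit is item 0 itself).
scores, ids = index.search(item_vectors[:1], 6)
print("items similar to item 0:", ids[0][1:])
```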
A Blueprint for Detecting Your Logo in Social Media (Chapter 5) – This is a very timely framework based on deep learning and convolutional neural networks (CNNs). The idea is to apply deep learning to image processing, specifically detecting and recognizing brand logos embedded in images. The code is all Python, using the TensorFlow and Keras frameworks, and the chapter includes a well-crafted brief overview of deep learning. The example uses the popular YOLO algorithm for object detection and recognition.
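A full YOLO detector is too involved to reproduce here, but the CNN building blocks the chapter relies on look roughly like this toy Keras classifier (a hypothetical “logo present / not present” model, not the book’s YOLO pipeline).

```python
# Toy Keras CNN for a hypothetical "logo vs. no logo" classifier -- a stand-in
# for the CNN building blocks, not the YOLO detector the chapter actually uses.
from tensorflow import keras
from tensorflow.keras import layers

num_classes = 2  # hypothetical labels: logo present / not present

model = keras.Sequential([
    keras.Input(shape=(128, 128, 3)),          # small RGB images
    layers.Conv2D(32, 3, activation="relu"),   # learn local edge/texture filters
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
# Training would then be: model.fit(train_images, train_labels, epochs=5)
```

YOLO adds a detection head on top of a CNN backbone like this one, predicting bounding boxes and class scores for each region of the image rather than a single label per image.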
A Blueprint for Discovering Trends and Recognizing Anomalies (Chapter 6) – This Python-based framework develops a Bayesian state-space time-series model, also known as a dynamic linear model (DLM), for forecasting website traffic. The goal is to discover linear trends with static and moving-average models, along with seasonal trends, and to recognize anomalies by flagging significant deviations from normal activity using robust principal component analysis (RPCA) and clustering. The example uses the daily frequency of messages on the R-help mailing list, an email list for users seeking help with the R programming language.
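As a rough stand-in for the chapter’s DLM (not the library the book uses), statsmodels’ structural time-series model can fit a local linear trend plus weekly seasonality and forecast ahead; the “daily message counts” below are synthetic, and the residual-based anomaly check is only a crude illustration.

```python
# Rough stand-in for a dynamic linear model: a structural time-series model
# (local linear trend + weekly seasonality) fit on synthetic daily counts.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

# Synthetic daily counts: slow upward trend + weekly cycle + noise.
rng = np.random.default_rng(42)
days = pd.date_range("2023-01-01", periods=180, freq="D")
y = (
    20
    + 0.05 * np.arange(180)                        # linear trend
    + 5 * np.sin(2 * np.pi * np.arange(180) / 7)   # weekly seasonality
    + rng.normal(0, 2, 180)                        # noise
)
series = pd.Series(y, index=days)

model = UnobservedComponents(series, level="local linear trend", seasonal=7)
result = model.fit(disp=False)

forecast = result.forecast(steps=14)   # forecast two weeks ahead
print(forecast.round(1))

# A crude anomaly check: flag days with unusually large one-step-ahead residuals.
resid = result.resid
anomalies = series[np.abs(resid) > 3 * resid.std()]
print("possible anomalies:\n", anomalies)
```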
A Blueprint for Understanding Queries and Generating Responses (Chapter 7) – This framework addresses interactive AI systems that let users ask questions directly and get answers. The project shows how to configure and train the Rasa NLU Python library to recognize user intentions in text, then how to implement domain-specific logic in the Prolog language, and then how to generate grammatically correct responses with the SimpleNLG Java library. Finally, Google APIs are used to convert speech to text and text to speech.
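The toy sketch below is not Rasa; it only mimics the shape of the output an NLU layer hands off (an intent plus extracted entities) before the Prolog logic and SimpleNLG response generation take over. The intents, patterns, and the “city” entity are made up for illustration.

```python
# Toy illustration only -- not Rasa. It mimics the intent + entities structure
# an NLU layer produces before domain logic and response generation run.
import re

INTENT_PATTERNS = {
    "ask_weather": re.compile(r"\b(weather|forecast|rain|sunny)\b", re.I),
    "ask_time": re.compile(r"\b(time|clock|hour)\b", re.I),
}
CITY_PATTERN = re.compile(r"\bin ([A-Z][a-z]+)\b")

def parse(utterance: str) -> dict:
    """Return an NLU-style dict: the matched intent and any city entity."""
    intent = next(
        (name for name, pat in INTENT_PATTERNS.items() if pat.search(utterance)),
        "unknown",
    )
    city = CITY_PATTERN.search(utterance)
    entities = [{"entity": "city", "value": city.group(1)}] if city else []
    return {"text": utterance, "intent": intent, "entities": entities}

print(parse("What is the weather in Paris today?"))
# -> intent 'ask_weather' with entity {'entity': 'city', 'value': 'Paris'}
```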
I really appreciate the premise of this book: setting the stage for machine learning development with real-life frameworks that can help kick-start the process. I’ve already picked up a number of good ideas, including awareness of some cool libraries. My only concern is the book’s inclusion of, and dependence on, URLs that will likely go stale after a year or so, which could unfortunately shorten its shelf life. The code bundle for the book is available on this GitHub repo.
Contributed by Daniel D. Gutierrez, Managing Editor and Resident Data Scientist for insideAI News. In addition to being a tech journalist, Daniel is also a data science consultant, author, and educator, and sits on a number of advisory boards for various start-up companies.