Machine Learning In Python


Out of Stock


Premium quality
Bookswagon upholds quality by delivering untarnished books. Quality, service and satisfaction are everything for us!

Easy return
Not satisfied with this product? Keep it in its original condition and packaging to avail of the easy return policy.

Certified product
First impressions matter! Check the book's certification page, ISBN, publisher's name, copyright page and print quality.

Secure checkout
Security at its finest! Login, browse, purchase and pay; every step is safe and secure.

Money-back guarantee
It's all about customers! For any kind of bad experience with the product, get your actual amount back after returning it.

On-time delivery
At your doorstep on time! Get this book delivered without any delay.

About the Book

Machine Learning in Python shows you how to successfully analyze data using only two core machine learning algorithms, and how to apply them using Python. By focusing on two algorithm families that effectively predict outcomes, the book gives full descriptions of the mechanisms at work, along with examples that illustrate the machinery in specific, hackable code. The algorithms are explained in simple terms with no complex math and applied using Python, with guidance on algorithm selection, data preparation, and using the trained models in practice.
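
For a feel of the two algorithm families the description refers to, here is a minimal sketch that fits a penalized (lasso) linear model and a tree ensemble (gradient boosting) with scikit-learn on a small bundled dataset. The library, dataset and parameter choices are illustrative assumptions, not the book's own examples.

    # Hedged sketch: the two algorithm families from the book's description,
    # shown with scikit-learn on a bundled toy dataset (an assumption for
    # illustration; the book's own examples and data differ).
    from sklearn.datasets import load_diabetes
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.linear_model import LassoCV
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    # Small regression dataset with a held-out test split.
    X, y = load_diabetes(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    # Family 1: penalized linear regression (lasso with cross-validated penalty).
    lasso = LassoCV(cv=5).fit(X_train, y_train)

    # Family 2: an ensemble method (gradient boosted trees).
    gbm = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

    for name, model in (("lasso", lasso), ("gradient boosting", gbm)):
        mse = mean_squared_error(y_test, model.predict(X_test))
        print(f"{name}: test MSE = {mse:.1f}")

The trade-off the book explores runs along these lines: the penalized linear model trains and predicts quickly and reports coefficient-based variable importance, while the ensemble can capture nonlinear structure at greater computational cost.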

About the Author

Michael Bowles began his career as an assistant professor at MIT and went on to found and run two Silicon Valley startups, both of which went public. Dr. Bowles currently teaches machine learning at Hacker Dojo (a shared workspace in Silicon Valley), consults on machine learning projects, and is involved in a number of startups applying machine learning in areas such as bioinformatics and high-frequency trading. His courses at Hacker Dojo are nearly always sold out and receive great feedback from participants.



Table of Contents:
Introduction

Chapter 1 The Two Essential Algorithms for Making Predictions · Why Are These Two Algorithms So Useful? · What Are Penalized Regression Methods? · What Are Ensemble Methods? · How to Decide Which Algorithm to Use · The Process Steps for Building a Predictive Model · Framing a Machine Learning Problem · Feature Extraction and Feature Engineering · Determining Performance of a Trained Model · Chapter Contents and Dependencies

Chapter 2 Understand the Problem by Understanding the Data · The Anatomy of a New Problem · Different Types of Attributes and Labels Drive Modeling Choices · Things to Notice about Your New Data Set · Classification Problems: Detecting Unexploded Mines Using Sonar · Physical Characteristics of the Rocks Versus Mines Data Set · Statistical Summaries of the Rocks versus Mines Data Set · Visualization of Outliers Using Quantile-Quantile Plot · Statistical Characterization of Categorical Attributes · How to Use Python Pandas to Summarize the Rocks Versus Mines Data Set · Visualizing Properties of the Rocks versus Mines Data Set · Visualizing with Parallel Coordinates Plots · Visualizing Interrelationships between Attributes and Labels · Visualizing Attribute and Label Correlations Using a Heat Map · Summarizing the Process for Understanding Rocks versus Mines Data Set · Real-Valued Predictions with Factor Variables: How Old Is Your Abalone? · Parallel Coordinates for Regression Problems--Visualize Variable Relationships for Abalone Problem · How to Use Correlation Heat Map for Regression--Visualize Pair-Wise Correlations for the Abalone Problem · Real-Valued Predictions Using Real-Valued Attributes: Calculate How Your Wine Tastes · Multiclass Classification Problem: What Type of Glass Is That?

Chapter 3 Predictive Model Building: Balancing Performance, Complexity and Big Data · The Basic Problem: Understanding Function Approximation · Working with Training Data · Assessing Performance of Predictive Models · Factors Driving Algorithm Choices and Performance--Complexity and Data · Contrast Between a Simple Problem and a Complex Problem · Contrast Between a Simple Model and a Complex Model · Factors Driving Predictive Algorithm Performance · Choosing an Algorithm: Linear or Nonlinear? · Measuring the Performance of Predictive Models · Performance Measures for Different Types of Problems · Simulating Performance of Deployed Models · Achieving Harmony Between Model and Data · Choosing a Model to Balance Problem Complexity, Model Complexity and Data Set Size · Using Forward Stepwise Regression to Control Overfitting · Evaluating and Understanding Your Predictive Model · Control Overfitting by Penalizing Regression Coefficients--Ridge Regression

Chapter 4 Penalized Linear Regression · Why Penalized Linear Regression Methods Are So Useful · Extremely Fast Coefficient Estimation · Variable Importance Information · Extremely Fast Evaluation When Deployed · Reliable Performance · Sparse Solutions · Problem May Require Linear Model · When to Use Ensemble Methods · Penalized Linear Regression: Regulating Linear Regression for Optimum Performance · Training Linear Models: Minimizing Errors and More · Adding a Coefficient Penalty to the OLS Formulation · Other Useful Coefficient Penalties--Manhattan and Elastic Net · Why Lasso Penalty Leads to Sparse Coefficient Vectors · Elastic Net Penalty Includes Both Lasso and Ridge · Solving the Penalized Linear Regression Problem · Understanding Least Angle Regression and Its Relationship to Forward Stepwise Regression · How LARS Generates Hundreds of Models of Varying Complexity · Choosing the Best Model from the Hundreds LARS Generates · Using Glmnet: Very Fast and Very General · Comparison of the Mechanics of Glmnet and LARS Algorithms · Initializing and Iterating the Glmnet Algorithm · Extensions to Linear Regression with Numeric Input · Solving Classification Problems with Penalized Regression · Working with Classification Problems Having More Than Two Outcomes · Understanding Basis Expansion: Using Linear Methods on Nonlinear Problems · Incorporating Non-Numeric Attributes into Linear Methods

Chapter 5 Building Predictive Models Using Penalized Linear Methods · Python Packages for Penalized Linear Regression · Multivariable Regression: Predicting Wine Taste · Building and Testing a Model to Predict Wine Taste · Training on the Whole Data Set before Deployment · Basis Expansion: Improving Performance by Creating New Variables from Old Ones · Binary Classification: Using Penalized Linear Regression to Detect Unexploded Mines · Build a Rocks versus Mines Classifier for Deployment · Multiclass Classification: Classifying Crime Scene Glass Samples

Chapter 6 Ensemble Methods · Binary Decision Trees · How a Binary Decision Tree Generates Predictions · How to Train a Binary Decision Tree · Tree Training Equals Split Point Selection · How Split Point Selection Affects Predictions · Algorithm for Selecting Split Points · Multivariable Tree Training--Which Attribute to Split? · Recursive Splitting for More Tree Depth · Overfitting Binary Trees · Measuring Overfit with Binary Trees · Balancing Binary Tree Complexity for Best Performance · Modifications for Classification and Categorical Features · Bootstrap Aggregation: "Bagging" · How Does the Bagging Algorithm Work? · Bagging Performance--Bias versus Variance · How Bagging Behaves on Multivariable Problem · Bagging Needs Tree Depth for Performance · Summary of Bagging · Gradient Boosting · Basic Principle of Gradient Boosting Algorithm · Parameter Settings for Gradient Boosting · How Gradient Boosting Iterates Toward a Predictive Model · Getting the Best Performance from Gradient Boosting · Gradient Boosting on a Multivariable Problem · Summary for Gradient Boosting · Random Forest · Random Forests: Bagging Plus Random Attribute Subsets · Random Forests Performance Drivers · Random Forests Summary

Chapter 7 Building Ensemble Models with Python · Solving Regression Problems with Python Ensemble Packages · Building a Random Forest Model to Predict Wine Taste · Constructing a Random Forest Regressor Object · Modeling Wine Taste with Random Forest Regressor · Visualizing the Performance of a Random Forests Regression Model · Using Gradient Boosting to Predict Wine Taste · Using the Class Constructor for Gradient Boosting Regressor · Using Gradient Boosting Regressor to Implement a Regression Model · Assessing the Performance of a Gradient Boosting Model · Coding Bagging to Predict Wine Taste · Incorporating Non-Numeric Attributes in Python Ensemble Models · Coding the Sex of Abalone for Input to Random Forest Regression in Python · Assessing Performance and the Importance of Coded Variables · Coding the Sex of Abalone for Gradient Boosting Regression in Python · Assessing Performance and the Importance of Coded Variables with Gradient Boosting · Solving Binary Classification Problems with Python Ensemble Methods · Detecting Unexploded Mines with Python Random Forest · Constructing a Random Forests Model to Detect Unexploded Mines · Determining the Performance of a Random Forests Classifier · Detecting Unexploded Mines with Python Gradient Boosting · Determining the Performance of a Gradient Boosting Classifier · Solving Multiclass Classification Problems with Python Ensemble Methods · Classifying Glass with Random Forests · Dealing with Class Imbalances · Classifying Glass Using Gradient Boosting · Assessing the Advantage of Using Random Forest Base Learners with Gradient Boosting · Comparing Algorithms

Summary

Index
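
As an aside on Chapter 4's LARS material (how LARS generates hundreds of models of varying complexity), the minimal sketch below traces a lasso coefficient path with scikit-learn. The dataset, scaling step and function choice are assumptions for illustration, not the book's own code.

    # Hedged sketch of a LARS/lasso coefficient path: each step along the path
    # is a candidate model of different complexity (dataset and scikit-learn
    # calls are illustrative assumptions, not the book's code).
    import numpy as np
    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import lars_path
    from sklearn.preprocessing import scale

    X, y = load_diabetes(return_X_y=True)
    X = scale(X)  # penalized methods are sensitive to attribute scaling

    # Trace the full lasso path with the LARS algorithm.
    alphas, _, coefs = lars_path(X, y, method="lasso")

    # coefs has one column per step; each column is a candidate coefficient vector.
    print(f"{coefs.shape[1]} candidate models along the path")
    for step in (1, coefs.shape[1] // 2, coefs.shape[1] - 1):
        n_active = int(np.sum(coefs[:, step] != 0))
        print(f"alpha={alphas[step]:.3f}: {n_active} nonzero coefficients")

Picking one column of the path (for example by cross-validated error) is the "choosing the best model from the hundreds LARS generates" step the chapter describes.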


Product Details
  • ISBN-13: 9788126555925
  • Publisher: Wiley India Pvt Ltd
  • Binding: Paperback
  • No. of Pages: 360
  • ISBN-10: 8126555920
  • Publisher Date: 01 Jun 2015
  • Language: English

