

Data Pipeline - Google Colab → Local DB (MySQL)

  • July 10, 2023

Meet the Author: Mr. Sharat Chandra

Sharat Chandra is the Head of Analytics at 360DigiTMG and one of the founders and directors of AiSPRY. With more than 17 years of experience in the IT sector, he has expertise across domains such as retail, manufacturing, and healthcare. As the head trainer at 360DigiTMG for over ten years, he has been helping his students make a smooth transition into the IT industry. Working with an oncology team, he has also contributed to the life sciences and healthcare (LSHC) domain, specifically cancer therapy, in work published in a British cancer research journal.

How to Access Data Residing in a Database (MySQL) on a Local Machine via Python Code in Google Colab?

Solution:

Accessing a database hosted on a local machine from a Google Colab server can be challenging.

Since the two servers are geographically separated, they must communicate over the internet. Google Colab creates instances on the fly whenever you want to use a Python kernel.

To access the database from a Colab instance, the instance must first be able to reach your local machine over the network.

There are several tools and services that can link the two servers. In this blog, we will expose the local host to the web using ngrok.

The ngrok service provides secure tunnels that give immediate access to remote systems without changing any network settings or opening any ports on your router.

With the secure, reliable tunnel ngrok provides, we can reach our MySQL service over the TCP protocol.

Also, check out these Data Science Courses in Chennai to start a career in Data Science.

 

Steps to Connect Local DB with Google Colab:

  • 1. Download ngrok.
    Unzip the downloaded archive and save the executable in a folder on your local Windows machine.
  • 2. Sign up and verify your email address.
  • 3. Follow the setup instructions sent to your registered email.
  • 4. Connect your account by adding your authtoken to the default ngrok.yml configuration file (see the command after this list). This step gives you longer session times and access to additional features.
  • 5. Run the command below to start a TCP tunnel forwarding to your local port 3306 (the default MySQL port).
    ngrok tcp 3306
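
For reference, recent ngrok releases register the authtoken with a single command that writes it into ngrok.yml for you (older releases use ngrok authtoken <your-authtoken> instead):

    ngrok config add-authtoken <your-authtoken>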

ngrok prints a hostname-and-port combination, for example 0.tcp.in.ngrok.io:19***, which maps to localhost:3306 of the local MySQL database installation.

This host and port combination exposes our database service to the internet so that Google Colab can connect to it.

The Google Colab notebook then connects to the local MySQL server through this tunnel.

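The snippet below is a minimal sketch of that connection, assuming the PyMySQL driver and a hypothetical database named test_db; the host, port, user, and password values are placeholders to be replaced with the forwarding address printed by ngrok and your local MySQL credentials.

    # Run in a Google Colab cell; install a MySQL client library first.
    !pip install pymysql

    import pymysql

    # Placeholder values: use the forwarding host/port printed by ngrok
    # (e.g. 0.tcp.in.ngrok.io:19***) and your local MySQL credentials.
    connection = pymysql.connect(
        host="0.tcp.in.ngrok.io",   # ngrok forwarding host
        port=19000,                 # ngrok forwarding port (placeholder)
        user="root",                # local MySQL user (placeholder)
        password="your_password",   # local MySQL password (placeholder)
        database="test_db",         # hypothetical database name
    )

    # Quick sanity check: query the server version through the tunnel.
    with connection.cursor() as cursor:
        cursor.execute("SELECT VERSION();")
        print(cursor.fetchone())

    connection.close()

Once the connection works, pandas.read_sql (or any other SQL client code) can pull tables from the local database into the Colab session for analysis.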

Conclusion

As organizations realize the importance of data and shift to data-driven business strategies, the demand for data engineers continues to rise. Data must be readily available for deriving insights. Experiments on the data are performed on various processing platforms with Python IDEs, and Google Colab is one of the many free services on offer. Data pipelines incorporating ETL or ELT, as the business requires, must be built by integrating the multiple tools involved in the technology stack.

What do you think is the most challenging task in developing these pipelines? We would love to hear your experiences and opinions.

Want to learn more about data science? Enroll in the Best Data Science courses in Bangalore to do so.



 