A Python web scraper that tracks used car prices across major Ecuadorian automotive marketplaces.
- Multi-site Scraping: Collects data from PatioTuerca and OLX Ecuador, with an architecture that can be extended to additional sites
- Price Tracking: Monitors price changes over time for all listings
- Data Analysis: Calculates average prices, identifies deals, and tracks market trends
- Automated Notifications: Sends email alerts for price drops and new listings
- Reporting: Generates HTML reports with market insights and significant price changes
- Scheduling: Runs daily jobs automatically to keep data current
- Python 3.7 or higher
- pip (Python package installer)
- Clone the repository:

  ```bash
  git clone https://github.com/your-username/ecuador-car-price-tracker.git
  cd ecuador-car-price-tracker
  ```
- Create and activate a virtual environment:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```
- Install required packages:

  ```bash
  pip install -r requirements.txt
  ```
- Configure email settings (a minimal example follows this list):
  - Open `car_price_tracker.py`
  - Update the email configuration in the `send_email_notification` method
  - If using Gmail, create an app password in your Google account security settings
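As a rough illustration of what that configuration might involve, here is a minimal sketch using Python's standard `smtplib`; the constant names and the exact signature of `send_email_notification` are assumptions and may differ from the actual script:

```python
import smtplib
from email.mime.text import MIMEText

# Hypothetical settings -- align these with the variables actually used in
# send_email_notification inside car_price_tracker.py.
SMTP_SERVER = "smtp.gmail.com"
SMTP_PORT = 587
SENDER_EMAIL = "your_email@gmail.com"
SENDER_PASSWORD = "your-app-password"  # a Gmail app password, not the account password

def send_email_notification(subject, body, recipient):
    """Send a plain-text alert over a TLS-secured SMTP connection."""
    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = SENDER_EMAIL
    msg["To"] = recipient

    with smtplib.SMTP(SMTP_SERVER, SMTP_PORT) as server:
        server.starttls()  # upgrade the connection to TLS before logging in
        server.login(SENDER_EMAIL, SENDER_PASSWORD)
        server.send_message(msg)
```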
Run the script once:

```bash
python car_price_tracker.py
```

This will:
- Scrape car listings from supported websites
- Store the data in a SQLite database (`car_prices.db`)
- Generate an HTML report (`car_prices_report.html`)
- Send an email notification if configured
To run the tracker continuously with scheduled jobs on Linux or macOS:

```bash
nohup python car_price_tracker.py > car_tracker.log 2>&1 &
```

On Windows, create a batch file `run_tracker.bat`:

```bat
@echo off
cd C:\path\to\script\directory
python car_price_tracker.py
```

Add it to Windows Task Scheduler to run at system startup.
- Create a new scraping method following the pattern of the existing ones (a fuller sketch follows this list):

  ```python
  def scrape_new_site(self, max_pages=3):
      new_listings, updated_prices = [], []
      # Implementation here: fetch and parse the listing pages, appending
      # new cars to new_listings and price changes to updated_prices.
      return new_listings, updated_prices
  ```
- Add a call to your new method in the `run_daily_job` method
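Here is a rough sketch of what both steps could look like, using `requests` and BeautifulSoup. The URL, CSS selectors, and dictionary keys are placeholders for a hypothetical site, not the actual structure of PatioTuerca or OLX:

```python
import requests
from bs4 import BeautifulSoup

def scrape_new_site(self, max_pages=3):
    """Sketch of a scraper method for a hypothetical listings site."""
    new_listings, updated_prices = [], []
    for page in range(1, max_pages + 1):
        # Placeholder URL and selectors -- adapt them to the real site's markup.
        resp = requests.get(f"https://example-cars.ec/usados?page={page}", timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        for card in soup.select("div.listing-card"):
            title = card.select_one("h2.title")
            price = card.select_one("span.price")
            link = card.select_one("a")
            if not (title and price and link):
                continue  # skip cards that don't match the expected layout
            new_listings.append({
                "listing_id": card.get("data-id"),
                "title": title.get_text(strip=True),
                "price": price.get_text(strip=True),
                "url": link["href"],
            })
    return new_listings, updated_prices
```

The method would then be called from `run_daily_job` alongside the existing scraper calls, e.g. `new, updates = self.scrape_new_site()`.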
Customize the email alert criteria in the `run_daily_job` method to match your preferences.
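For example, assuming price changes are tracked as dictionaries with `old_price` and `new_price` keys (an assumption about the script's internals), you could restrict alerts to drops of at least 10%:

```python
def filter_significant_drops(updated_prices, min_drop=0.10):
    """Keep only changes whose price fell by at least min_drop (10% by default).

    Assumes each entry is a dict with 'old_price' and 'new_price' keys --
    adjust to whatever structure run_daily_job actually builds.
    """
    return [
        change for change in updated_prices
        if change["new_price"] <= change["old_price"] * (1 - min_drop)
    ]
```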
Modify the scheduling in the main section:

```python
# Change from daily to every 12 hours
schedule.every(12).hours.do(tracker.run_daily_job, email="your_email@example.com")
```
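For context, the `schedule` library only runs jobs from a polling loop, so the main section typically looks something like the sketch below; the tracker's class name is assumed here and may differ in the actual script:

```python
import time
import schedule

tracker = CarPriceTracker()  # hypothetical class name -- see car_price_tracker.py

# Run the daily job every 12 hours instead of once per day.
schedule.every(12).hours.do(tracker.run_daily_job, email="your_email@example.com")

while True:
    schedule.run_pending()  # execute any job whose scheduled time has arrived
    time.sleep(60)          # poll once a minute
```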
The script uses SQLite with two main tables.

`cars` table:
- `id`: Primary key
- `listing_id`: Original ID from the source website
- `website`: Source website name
- `title`: Listing title
- `make`: Car manufacturer
- `model`: Car model
- `year`: Manufacturing year
- `mileage`: Odometer reading
- `location`: City/region in Ecuador
- `url`: Original listing URL
- `seller_type`: Dealer or private
- `features`: Additional features
- `first_seen`: Date first discovered

Price history table:
- `id`: Primary key
- `car_id`: Foreign key to the `cars` table
- `price`: Price in USD
- `date`: Date the price was recorded
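A minimal sketch of how these tables could be created with Python's built-in `sqlite3` module; the column types and the name of the price table (assumed here to be `prices`) are illustrative, and the authoritative schema lives in `car_price_tracker.py`:

```python
import sqlite3

def init_db(path="car_prices.db"):
    """Create the two tables if they do not already exist (illustrative schema)."""
    conn = sqlite3.connect(path)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS cars (
            id          INTEGER PRIMARY KEY,
            listing_id  TEXT,        -- original ID from the source website
            website     TEXT,
            title       TEXT,
            make        TEXT,
            model       TEXT,
            year        INTEGER,
            mileage     INTEGER,
            location    TEXT,
            url         TEXT,
            seller_type TEXT,        -- dealer or private
            features    TEXT,
            first_seen  TEXT         -- date first discovered
        );
        CREATE TABLE IF NOT EXISTS prices (
            id      INTEGER PRIMARY KEY,
            car_id  INTEGER REFERENCES cars(id),
            price   REAL,            -- price in USD
            date    TEXT             -- date the price was recorded
        );
    """)
    conn.commit()
    return conn
```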
The HTML report includes:
- Total number of listings
- Number of cars with price changes
- Top car makes by popularity
- Recent price drops with links to listings
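As a rough idea of how such a report can be assembled, here is a small sketch using plain string formatting; the function name and argument structure are illustrative, not the script's actual API:

```python
def render_report(total_listings, changed_count, top_makes, recent_drops):
    """Build a small HTML summary from plain Python values (illustrative)."""
    make_items = "".join(f"<li>{make}: {count} listings</li>" for make, count in top_makes)
    drop_items = "".join(
        f'<li><a href="{d["url"]}">{d["title"]}</a>: ${d["old"]:,.0f} to ${d["new"]:,.0f}</li>'
        for d in recent_drops
    )
    return f"""<html><body>
      <h1>Car Price Report</h1>
      <p>Total listings: {total_listings}</p>
      <p>Listings with price changes: {changed_count}</p>
      <h2>Top makes</h2><ul>{make_items}</ul>
      <h2>Recent price drops</h2><ul>{drop_items}</ul>
    </body></html>"""
```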
The script logs all activities to `car_tracker.log`, including:
- Scraping starts and completions
- Database operations
- Errors and exceptions
- Email notifications
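If you want to change the log destination or verbosity, a typical configuration with Python's standard `logging` module looks like the sketch below; the exact format string used by the script may differ:

```python
import logging

logging.basicConfig(
    filename="car_tracker.log",
    level=logging.INFO,  # raise to logging.WARNING to quiet routine messages
    format="%(asctime)s %(levelname)s %(message)s",
)

logging.info("Scraping PatioTuerca: started")
logging.error("Failed to parse listing page")
```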
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
This script is for educational purposes only. Please review and respect the Terms of Service of each website before deploying this scraper.