Portfolio Project: Building a Free Tender Announcement Website

# Building a Free Tender Announcement Website

This past year, I embarked on a project that aimed to address a common challenge: finding timely tender announcements. As someone who frequently seeks out tender opportunities, I noticed that existing websites often fell short in providing up-to-date information. To tackle this issue, I decided to create my own website—a platform that would aggregate and organize tender announcements from various sources.

## The Problem

The problem was twofold: first, existing websites lacked real-time updates, leaving me to manually check multiple platforms daily. Second, the data from these websites was often disorganized and required human intervention to process effectively.

## Our Solution

During my ALX DevOps training, I teamed up with Natnael Gedlu to build our **Free.Tender** website. Our goal was clear: create a user-friendly webpage that autonomously fetched data from different sources and presented it in an organized format.

### Front-End and Back-End Focus

Natnael took charge of the front-end design, ensuring that our website was intuitive and visually appealing. Meanwhile, I focused on the back-end development, where the real technical challenges awaited.

### Technical Challenges

The most difficult aspect of our project was fetching data from external websites using the `curl` tool. We needed to extract relevant information from HTML code and transform it into a structured format for our database. Here's how we tackled it:

1. **Using `curl` for Data Retrieval**: We leveraged `curl` to make HTTP GET requests to various tender announcement websites. (`curl` sends a GET request by default, so the explicit `-X GET` option is redundant, though harmless for simple requests.)

2. **Parsing HTML Data**: Once we received the HTML content, we needed to extract relevant details such as tender descriptions, deadlines, and contact information. Parsing HTML is notoriously tricky because real-world markup is often inconsistent and messy. We wrote custom callbacks (registered through `CURLOPT_WRITEFUNCTION`, an option of the libcurl C API rather than the command-line tool) that buffered the raw HTML data so we could extract the necessary fields from it.

3. **Data Organization**: The extracted data needed further processing before being stored in our database. We classified tenders by category, region, and language.
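The three steps above can be sketched in shell. This is a minimal, hypothetical illustration: the URL, the HTML markup, and the field names are placeholders, not the actual sites or schema we scraped. To keep it self-contained and runnable without a network, it parses a canned HTML snippet; the commented `curl` line marks where the live fetch would go.

```shell
# Step 1 (fetch): in production the HTML would come from a live request, e.g.:
#   html=$(curl -s "https://example.org/tenders")   # curl issues GET by default
# A canned snippet stands in for the response here.
html='<div class="tender"><h2>Road maintenance</h2><span class="deadline">2024-05-01</span><span class="region">Addis Ababa</span></div>
<div class="tender"><h2>School supplies</h2><span class="deadline">2024-06-15</span><span class="region">Oromia</span></div>'

# Steps 2 and 3 (parse and organize): turn each tender <div> into a
# pipe-delimited record (title|deadline|region) and sort by deadline,
# ready to load into a database table.
records=$(printf '%s\n' "$html" \
  | sed -n 's/.*<h2>\([^<]*\)<\/h2>.*class="deadline">\([^<]*\)<\/span>.*class="region">\([^<]*\)<\/span>.*/\1|\2|\3/p' \
  | sort -t'|' -k2)

printf '%s\n' "$records"
```

A `sed` one-liner like this only works while the target markup stays regular; for anything less predictable, a proper HTML parser is the safer choice.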

## Summary

For the front-end layout, we used a template from the NiceAdmin website.
