eBuy Documentation update
BuckinghamAJ committed Jun 27, 2024
1 parent 571d477 commit 910c388
Showing 2 changed files with 61 additions and 0 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -8,6 +8,8 @@ This application is designed to run as a cron daemon within a Docker image on [c
4) Extracts the text from each of those documents using [textract](https://github.com/deanmalmgren/textract).
5) Restructures the data and inserts it into a PostgreSQL database.
In a future release, the script will poll the database for the number of user-validated predictions on document compliance. If there's a significant increase, those newly validated documents will be sent to the [machine learning component](https://github.com/GSA/srt-ml) of the application to train a new and improved model.

**Note:** eBuy is now integrated into SRT. See the [eBuy documentation](https://github.com/GSA/srt-fbo-scraper/tree/main/documentation/eBuy.md) for more information.
# Developer Requirements
## Software Components and Tools
The following is a summary of the software and tools that are needed for development of this project:
59 changes: 59 additions & 0 deletions documentation/eBuy.md
@@ -0,0 +1,59 @@
# eBuy
SRT integrates active RFQs from eBuy Open into the scraper to determine Section 508 compliance. Due to security restrictions, this process is currently manual.

## Manual Process
Note: Be sure you have the latest code updates, and verify that the `ebuy_csv.py` file is present.

Additionally, for convenience, run `pip install -e .` inside your local srt-fbo-scraper repository so the command-line tools are available.
### Export CSV
1. Navigate to [eBuy Open](https://www.ebuy.gsa.gov/ebuyopen/)
2. Log into the web interface
3. Click the columns button and select the following columns:
- RFQID
- Attachments
- AttachmentCount
- BuyerAgency
- BuyerEmail
- Category
- CategoryName
- Description
- IssueDate
- Source
- Status
- Title
4. Export the CSV
5. Place the CSV into the `ebuy` folder (`srt-fbo-scraper/ebuy`)
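Before running the parser, it can help to confirm that the export actually contains the columns listed above. The following is a quick sanity-check sketch (not part of the SRT codebase) using only the standard library:

```python
import csv
import io

# Columns the eBuy Open export is expected to contain (from the list above).
EXPECTED_COLUMNS = {
    "RFQID", "Attachments", "AttachmentCount", "BuyerAgency", "BuyerEmail",
    "Category", "CategoryName", "Description", "IssueDate", "Source",
    "Status", "Title",
}

def missing_columns(csv_text: str) -> set:
    """Return any expected columns missing from the CSV header row."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return EXPECTED_COLUMNS - set(reader.fieldnames or [])

# Example: a header containing every expected column reports nothing missing.
header = ",".join(sorted(EXPECTED_COLUMNS))
print(missing_columns(header))  # -> set()
```

If any column names are reported missing, re-export from eBuy Open after re-checking the column selections.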

### Running ebuy_parser
Below is the help output for the `ebuy_parser` command-line tool, which should be available after running the `pip install` above.

**Important:** Reach out to the current lead developer to obtain a local `.env` file that provides the PostgreSQL login credentials for the SRT databases on cloud.gov. **NEVER COMMIT THE `.env` FILE TO GITHUB!**

```
usage: ebuy_parser [-h] [-c _CONFIG] -f FILE [--model-name PREDICTION.MODEL_NAME]
                   [--model-path PREDICTION.MODEL_PATH] [-e ENVIRONMENT]

options:
  -h, --help            show this help message and exit
  -c _CONFIG, --config _CONFIG
                        Define general configuration with yaml file
  -f FILE, --file FILE  eBuy CSV file to process

Prediction Model Options:
  --model-name PREDICTION.MODEL_NAME
                        Define the name to the prediction model in binaries folder.
  --model-path PREDICTION.MODEL_PATH
                        Define the absolute path to the prediction model.
  -e ENVIRONMENT, --environment ENVIRONMENT
                        Define the cloud.gov environment to run
```
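The help text above is the shape produced by a fairly standard `argparse` setup. As a minimal illustration (a sketch, not the actual srt-fbo-scraper source), the documented options could be declared like this:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Sketch of an argparse parser matching the documented ebuy_parser options."""
    parser = argparse.ArgumentParser(prog="ebuy_parser")
    parser.add_argument("-c", "--config", metavar="_CONFIG",
                        help="Define general configuration with yaml file")
    parser.add_argument("-f", "--file", required=True,
                        help="eBuy CSV file to process")
    model = parser.add_argument_group("Prediction Model Options")
    model.add_argument("--model-name", metavar="PREDICTION.MODEL_NAME",
                       help="Name of the prediction model in the binaries folder")
    model.add_argument("--model-path", metavar="PREDICTION.MODEL_PATH",
                       help="Absolute path to the prediction model")
    # The docs say the environment defaults to local.
    parser.add_argument("-e", "--environment", default="local",
                        help="cloud.gov environment to run")
    return parser

# A typical invocation supplies the exported CSV and a target environment:
args = build_parser().parse_args(["-f", "ebuy/export.csv", "-e", "staging"])
print(args.file, args.environment)  # -> ebuy/export.csv staging
```

Note that `-f/--file` is the only required argument; omitting `-e` falls back to the local environment.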
1. On the command line, specify two things: (1) the exported CSV file name, and (2) the environment to insert the eBuy data into (defaults to local).
2. Based on the environment, the script will give you a command to run in a separate terminal window. Example:
```
cf ssh -L localhost:54000:<Service Database Name & password>:5432 srt-fbo-scraper-staging
```
This opens a local port to the PostgreSQL service instance in cloud.gov.

3. Once that command is running, press Enter to continue the `ebuy_parser` run.
4. Next, the parser will open eBuy Open; log in with your credentials.
5. Once logged in, press Enter in the command line. The script captures the security cookies it needs to send with each download request to eBuy's server. No security information is saved after the command completes.
6. Verify that the script completes without errors and that the RFQs appear in the applicable environment.
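Steps 4 and 5 above work because attachment downloads from eBuy require the session cookies from your logged-in browser session. As a rough illustration of the idea (the real script's mechanics may differ, and the cookie names below are hypothetical), captured cookies can be attached to a download request like this:

```python
import urllib.request

def build_download_request(url: str, cookies: dict) -> urllib.request.Request:
    """Attach captured session cookies to an attachment download request."""
    cookie_header = "; ".join(f"{k}={v}" for k, v in cookies.items())
    return urllib.request.Request(url, headers={"Cookie": cookie_header})

# Hypothetical cookie names and URL -- the actual eBuy values differ.
req = build_download_request(
    "https://www.ebuy.gsa.gov/ebuyopen/attachment/123",
    {"session": "abc", "token": "xyz"},
)
print(req.get_header("Cookie"))  # -> session=abc; token=xyz
```

Because the cookies live only in memory for the duration of the run, nothing sensitive persists once the command finishes.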
