Sometimes I need to upload large CSV files to PostgreSQL. A CSV file might have hundreds of columns, which is why I want a tool that can do some magic for me. My main goal is to review a bunch of data: run some SQL queries and decide what to do with it next.
A nice tool to handle this is pgfutter.
./pgfutter csv file.csv --db mydatabase
It will automatically create all the columns and fill them with data. The table name will be the name of the file. If you want a custom table name, do this:
./pgfutter csv file.csv --db mydatabase --table mytable
For more use cases, visit the project's GitHub page. If you want to import a CSV into an existing PostgreSQL table, check out the COPY command.
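For the existing-table case, here is a minimal sketch of driving COPY from Python with psycopg2. The connection string, table name, and file name are placeholders, and it assumes the CSV has a header row matching the table's columns.

```python
# Import a CSV into an existing table via COPY, using psycopg2.
# Table, file, and DSN below are hypothetical placeholders.

def build_copy_sql(table: str) -> str:
    """Build a COPY ... FROM STDIN statement for a CSV with a header row."""
    return f"COPY {table} FROM STDIN WITH (FORMAT csv, HEADER true)"

def import_csv(dsn: str, table: str, csv_path: str) -> None:
    import psycopg2  # imported here so the SQL helper above stays dependency-free
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        with open(csv_path) as f:
            # copy_expert streams the file to the server over STDIN
            cur.copy_expert(build_copy_sql(table), f)

if __name__ == "__main__":
    import_csv("dbname=mydatabase", "mytable", "file.csv")
```

Unlike pgfutter, this requires the table to exist already, but it is the fastest built-in way to bulk-load rows.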
Have you ever searched for "google maps china offset"? Most people who have been to China have. Here is my story behind this question.
Sitemaps are important, especially for big websites. It is always a good idea to develop your website with SEO in mind. Unfortunately, many developers ignore this part.
A snippet on how to upload files to DigitalOcean Spaces with boto3 and Python.
Flask-Backbone is my take on creating an initial structure and a set of rules, so that most of my Flask projects are easy to maintain.
Bunny.net is a well-known CDN and storage provider among developers. Here is my Python snippet on how to upload a file to the storage.
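The Storage API accepts a plain HTTP PUT with an AccessKey header, so a sketch needs only the standard library. The storage zone name, remote path, and key below are placeholders; zones pinned to a non-default region use a regional hostname instead of the base one.

```python
# Upload a file to Bunny.net Storage with an authenticated HTTP PUT.
# Zone, path, and access key are hypothetical placeholders.
import urllib.request

BASE = "https://storage.bunnycdn.com"

def build_upload_request(zone: str, path: str, access_key: str, data: bytes):
    """Build (but do not send) the PUT request for one object."""
    req = urllib.request.Request(f"{BASE}/{zone}/{path}", data=data, method="PUT")
    req.add_header("AccessKey", access_key)
    req.add_header("Content-Type", "application/octet-stream")
    return req

def upload(zone, path, access_key, local_file):
    with open(local_file, "rb") as f:
        req = build_upload_request(zone, path, access_key, f.read())
    with urllib.request.urlopen(req) as resp:  # actually sends the PUT
        return resp.status  # the API answers 201 on a successful upload

if __name__ == "__main__":
    upload("myzone", "images/photo.jpg", "STORAGE_ZONE_PASSWORD", "photo.jpg")
```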
In this post, I'll talk about filters. Jinja2 has a list of built-in filters, and Flask leverages them.
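To make that concrete, here is a small illustration using Jinja2 directly: two built-in filters chained in a template, plus a custom filter registered on the environment. The `shout` filter is a made-up example; in Flask you would register it with `@app.template_filter()` instead of touching the environment yourself.

```python
# Jinja2 filters: chaining built-ins (trim, length) and adding a custom one.
from jinja2 import Environment

env = Environment()

def shout(s: str) -> str:
    """A hypothetical custom filter: uppercase the value and add '!'."""
    return s.upper() + "!"

env.filters["shout"] = shout

# Filters are applied left to right with the | (pipe) syntax
template = env.from_string("{{ name|trim|shout }} has {{ items|length }} items")
result = template.render(name="  alice ", items=[1, 2, 3])
print(result)  # ALICE! has 3 items
```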