A tool written in Python to download all Snapmaps content from a specific location.

Overview

snapmap-archiver

[snapmap-archiver splash image]

Setup

pip3 install snapmap-archiver

View on PyPI

Install dependencies with pip3.

pip3 install -r requirements.txt

Install aria2c, which is used to download the Snap media.

Usage

python3 -m snapmap_archiver -o [OUTPUT DIR] -l="[LATITUDE],[LONGITUDE]"

Unfortunately you have to use -l="lat,lon" rather than just -l "lat,lon" when passing negative coordinates, because argparse interprets values that start with - as extra option flags.
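
As a minimal sketch of that argparse behaviour (the option definitions below are illustrative and not snapmap_archiver's actual parser):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("-o", "--output-dir", required=True)
    parser.add_argument("-l", "--location", required=True)

    # Works: the value is attached to the flag with "=".
    args = parser.parse_args(["-o", "out", "-l=-33.87,151.21"])
    print(args.location)  # -33.87,151.21

    # Fails: "-33.87,151.21" contains a comma, so argparse does not recognise it
    # as a plain negative number and classifies it as an option-like token:
    # parser.parse_args(["-o", "out", "-l", "-33.87,151.21"])
    # -> error: argument -l/--location: expected one argument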

Optional Arguments

Export JSON

You can export a JSON file with information about the downloaded Snaps using the --write-json argument. It will contain details such as the time each Snap was posted and the Snap's location.

Snap Radius

The -r argument sets the radius in metres around the provided coordinates within which Snaps will be downloaded. For example, -r 20000 downloads all Snaps within a 20 km radius of your coordinates.
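
For example, a full invocation combining these arguments might look like the following (the output directory, coordinates and radius are placeholders):

python3 -m snapmap_archiver -o ~/snaps -l="-35.28,149.13" -r 20000 --write-json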

Comments
  • Added support for merging video and overlay file into one video file

    Added support for merging the video and overlay file into one video file using ffmpeg. You can disable this with the --no-overlay argument. This solves #3. Also added overlayText and filetype fields to the --write-json output.

    opened by Gertje823 1
  • Merge overlay.png with media.mp4

    Snaps with text and stickers don't include those graphics in the video file; instead they're stored in an image called overlay.png and displayed over the top by the browser/app.

    We could have an option like --merge-overlay which would use something like ffmpeg or avconv to composite the image over the media.mp4 file and export it as a new file (a rough ffmpeg sketch appears after this comments list).

    opened by king-millez 1
  • [Question] Downloading the snaps

    I notice you're using aria2c for downloading the videos. Is there a significant enough time difference using the external downloader over, say, a piece of code like:

    import requests
    ...
    with open(filename + '.mp4', 'wb') as f:  # binary mode for video data
        # req_headers from get_data.py
        f.write(requests.get(snap['media']['raw_url'], headers=req_headers).content)
    

    Now, the above code most likely won't run out of the box, but I feel removing the aria2c requirement would make the project more attractive, since people could get it running straight from pip.

    And sorry for opening an issue to discuss this, I don't know where the best place to talk about the code is.

    opened by AlanTheBlank 0
  • Bump urllib3 from 1.26.3 to 1.26.4

    Bumps urllib3 from 1.26.3 to 1.26.4.

    Release notes

    Sourced from urllib3's releases.

    1.26.4

    IMPORTANT: urllib3 v2.0 will drop support for Python 2. Read more in the v2.0 Roadmap.

    • Changed behavior of the default SSLContext when connecting to HTTPS proxy during HTTPS requests. The default SSLContext now sets check_hostname=True.

    If you or your organization rely on urllib3 consider supporting us via GitHub Sponsors

    Changelog

    Sourced from urllib3's changelog.

    1.26.4 (2021-03-15)

    • Changed behavior of the default SSLContext when connecting to HTTPS proxy during HTTPS requests. The default SSLContext now sets check_hostname=True.

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

    dependencies 
    opened by dependabot[bot] 0
  • File naming

    A better way to name the downloaded Snaps would be nice, rather than [SNAP_ID].mp4, which isn't really searchable. The data returned by utils.organise_data() could be used; it includes the Snap location and the timestamp from when it was posted.

    The actual API also contains some extra data for certain snaps, like the raw text used for a video.

    At the moment I'm thinking a good naming convention could be [MINOR_LOCATION] - [TIMESTAMP] - [SNAP_ID].mp4...

    opened by king-millez 0
  • Dynamically change radius to get maximum amount of relevant content

    When querying the API, you can pass the parameter radiusMeters; the lower the value, the more specific the returned content. Starting from 95000 and working down to 0, we could iterate through the maximum amount of content in a specific area. I'm thinking steps of -2500 for values > 10000, then -1000 down to 1000, then -100 down to 100, then -10 down to 0, arbitrarily ending with 1 (a rough sketch of this schedule appears after this comments list).

    opened by king-millez 0
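
As a rough illustration of the overlay merge discussed above ("Merge overlay.png with media.mp4" and the related ffmpeg pull request), here is a minimal, hedged sketch of compositing overlay.png onto media.mp4 with ffmpeg's overlay filter. The file names and helper function are assumptions for illustration, not the code that was actually merged:

    # Hedged sketch: draw overlay.png over media.mp4 using ffmpeg's overlay filter.
    # Requires ffmpeg on PATH; paths and the function name are illustrative only.
    import subprocess

    def merge_overlay(video_path: str, overlay_path: str, out_path: str) -> None:
        subprocess.run(
            [
                "ffmpeg",
                "-y",                    # overwrite the output file if it already exists
                "-i", video_path,        # base video (media.mp4)
                "-i", overlay_path,      # transparent text/sticker image (overlay.png)
                "-filter_complex", "[0:v][1:v]overlay=0:0",  # composite image over video
                "-c:a", "copy",          # keep the original audio stream untouched
                out_path,
            ],
            check=True,
        )

    merge_overlay("media.mp4", "overlay.png", "merged.mp4")

In practice the overlay image may need to be scaled to the video's resolution first; the --no-overlay argument mentioned above skips the merging step entirely.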
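
And for the "Dynamically change radius" idea, a minimal sketch of the proposed radiusMeters schedule; the generator below only produces the candidate values and is not part of snapmap_archiver:

    # Hedged sketch: yield the decreasing radiusMeters values described in the issue
    # (steps of 2500 above 10000, then 1000, then 100, then 10, ending with 1).
    def radius_schedule(start: int = 95_000):
        r = start
        while r > 10_000:
            yield r
            r -= 2_500
        while r > 1_000:
            yield r
            r -= 1_000
        while r > 100:
            yield r
            r -= 100
        while r >= 10:
            yield r
            r -= 10
        yield 1

    print(list(radius_schedule()))  # 95000, 92500, ..., 20, 10, 1
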
Releases (2.0)
  • 2.0 (Dec 23, 2022)

    Version 2.0

    Install with pip install snapmap-archiver

    Updated Codebase

    • Usable as a package
    • The SnapmapArchiver class can be used across projects for custom integration with other packages
    • More efficient: no more blank excepts!
    • Better API integration
    • Uses dict.get instead of countless except KeyError checks

    Installable

    • Package is now (properly) installable through pip
    • Now buildable with setup.py
  • 1.3.1 (Jan 6, 2022)

Owner
Canberra-based developer, 15yo.