
DAVIS2018 does not exist #5

Open
phananh1010 opened this issue Jun 24, 2021 · 4 comments

Comments

@phananh1010

Hello,

I am looking for DAVIS-2018 at your link, but none of the datasets there have zip files. Currently, your code does not work for the DAVIS datasets.

Is it because they removed the DAVIS-2018 dataset, or because the link is incorrect? Either way, could you please check?

Thanks,
Anh

@yzcv

yzcv commented Dec 22, 2021

Hi there,

I've written a short Python script to pack the youtube-vos folders into zips. You may refer to it when processing the DAVIS dataset.

import os
import zipfile

def zipDir(dirpath, outFullName):
    """Zip every file under dirpath into outFullName, storing entries
    with paths relative to dirpath."""
    with zipfile.ZipFile(outFullName, 'w', zipfile.ZIP_DEFLATED) as zf:
        for path, dirnames, filenames in os.walk(dirpath):
            # Archive names are relative to dirpath, not absolute paths.
            fpath = os.path.relpath(path, dirpath)
            for filename in filenames:
                zf.write(os.path.join(path, filename),
                         os.path.join(fpath, filename))

if __name__ == "__main__":
    root = '/home/yz/Downloads/STTN/datasets/youtube-vos'
    # Each video folder under train_all_frames/JPEGImages becomes one zip.
    for path, dir_list, file_list in os.walk(
            os.path.join(root, 'train_all_frames', 'JPEGImages')):
        for dir_name in dir_list:
            input_path = os.path.join(path, dir_name)
            output_path = os.path.join(root, 'JPEGImages', dir_name + '.zip')
            print(input_path, '\n', output_path)
            zipDir(input_path, output_path)

@diaodeyi

diaodeyi commented Jan 5, 2022

> (quoting @yzcv's script above)

I think your script needs improvement: the frame numbers in some youtube-vos videos don't seem to start from 1, so dataset.py in STTN may fail when using these zips.
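One possible workaround for the gapped frame numbers is to renumber the frames to a contiguous zero-based sequence inside the archive while zipping. This is only a sketch: `zip_renamed`, the zero-padded naming scheme, and the assumption that STTN's dataset.py expects consecutive indices are my own; check against the actual loader before relying on it.

```python
import os
import zipfile

def zip_renamed(dirpath, out_zip):
    """Zip the image frames in dirpath, renaming them inside the archive
    to a contiguous zero-based sequence (00000.jpg, 00001.jpg, ...)."""
    # Sort by original filename so the frame order is preserved even
    # when the on-disk numbering has gaps.
    frames = sorted(f for f in os.listdir(dirpath)
                    if f.lower().endswith(('.jpg', '.jpeg', '.png')))
    with zipfile.ZipFile(out_zip, 'w', zipfile.ZIP_DEFLATED) as zf:
        for i, fname in enumerate(frames):
            ext = os.path.splitext(fname)[1]
            # The archive entry gets the new contiguous index; the file
            # on disk keeps its original (possibly gapped) name.
            zf.write(os.path.join(dirpath, fname), '%05d%s' % (i, ext))
```

For example, a folder containing `00003.jpg` and `00007.jpg` would produce a zip with entries `00000.jpg` and `00001.jpg`.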

@yzcv

yzcv commented Jan 7, 2022

> (quoting @diaodeyi's reply above)

You are right. Thanks for pointing this out. :)

@ToscanaGoGithub

> (quoting @phananh1010's original question above)

Hello, do you know how to get the DAVIS2018 dataset?
