The OU-ISIR Gait Database is meant to aid research efforts in the general area of developing, testing and evaluating algorithms for gait-based human identification. The Institute of Scientific and Industrial Research (ISIR), Osaka University (OU) has copyright in the collection of gait video and associated data and serves as a distributor of the OU-ISIR Gait Database.
The data have been collected since March 2009 through outreach activity events in Japan. Approved informed consent was obtained from all the subjects in this dataset. The dataset consists of persons walking on the ground, captured by two cameras at 30 fps and 640 by 480 pixels. The datasets are distributed as silhouette sequences, registered and size-normalized to 88 by 128 pixels. Detailed descriptions are found in the following journal paper and a supplemental document.
- Haruyuki Iwama, Mayu Okumura, Yasushi Makihara, and Yasushi Yagi, "The OU-ISIR Gait Database Comprising the Large Population Dataset and Performance Evaluation of Gait Recognition," IEEE Trans. on Information Forensics and Security, Vol. 7, No. 5, pp. 1511-1521, Oct. 2012.
Dataset Structure
The set of sequences captured by camera 1 in our capture system is defined as dataset C1, and that captured by camera 2 is defined as dataset C2. Samples of still images (captured color images) are shown in the figures below (top: C1, bottom: C2).
Currently, dataset C1, which includes over 4,000 subjects with a wide range of ages, is available, while dataset C2 is under preparation. Examples of subjects are shown below.
Version
The dataset may be updated when new sequences are added or the silhouette creation process is slightly revised. Currently, there are two versions: Version 1 and Version 2 (the latest, which includes 4,016 subjects in total, with ages ranging from 1 to 94 years). Version 2 differs from Version 1 in the accuracy of the bounding box of each silhouette region (more accurate) and in the size of the moving-average filter applied in the size-normalized silhouette creation process.
Subset
Each dataset comprises two main subsets, A and B. A is a set of two sequences (gallery and probe) per subject. B is a set of one sequence per subject. In addition, each of the main subsets is further divided into five subsets based on the observation angle: 55 [deg], 65 [deg], 75 [deg], 85 [deg], and one including all four angles.
All the subsets are abbreviated in a uniform way. The format is:
OULP-[Camera ID(C1/C2)][Version ID(V1/V2)]-[Sequence Type(A/B)]-[Observation Angle Type(55/65/75/85/All)].
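The naming scheme above can be handled programmatically. The following is a minimal sketch (the helper name and regular expression are my own, not part of any official dataset tooling) that splits a subset abbreviation into its four fields:

```python
import re

# Pattern for OULP subset abbreviations, e.g. "OULP-C1V2-A-85".
# Fields follow the format described above; this regex is an assumption
# based on that description, not an official specification.
SUBSET_PATTERN = re.compile(
    r"^OULP-(?P<camera>C[12])(?P<version>V[12])"
    r"-(?P<sequence_type>[AB])"
    r"-(?P<angle>55|65|75|85|All)$"
)

def parse_subset_name(name: str) -> dict:
    """Return the camera ID, version, sequence type, and angle of a subset name."""
    match = SUBSET_PATTERN.match(name)
    if match is None:
        raise ValueError(f"not a valid OULP subset name: {name!r}")
    return match.groupdict()
```

For example, `parse_subset_name("OULP-C1V2-A-85")` yields the camera `C1`, version `V2`, sequence type `A`, and angle `85`.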
The number of subjects in each subset is summarized in the following documents.
The set of subjects in each subset is strictly defined by a subject ID list supplied with the silhouette sequence data; each entry includes the subject ID and the start and end frame numbers of the sequence.
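A subject ID list with those fields could be read as follows. This is a hypothetical sketch that assumes one whitespace-separated record per line (subject ID, start frame, end frame); the actual file layout may differ, so consult the supplemental document before relying on it:

```python
from dataclasses import dataclass

@dataclass
class SubjectEntry:
    subject_id: str
    start_frame: int
    end_frame: int

def read_id_list(lines):
    """Parse subject ID list lines into SubjectEntry records.

    Assumes each non-empty line holds a subject ID followed by the start
    and end frame numbers, separated by whitespace (an assumption, not
    the documented format).
    """
    entries = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        subject_id, start, end = line.split()[:3]
        entries.append(SubjectEntry(subject_id, int(start), int(end)))
    return entries
```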
How to get the dataset?
To advance the state of the art in gait-based applications, this dataset, comprising the set of size-normalized silhouette sequences and the subject ID lists, can be downloaded as a password-protected zip file; the password is issued on a case-by-case basis. To receive the password, the requestor must send the release agreement, signed by a legal representative of the requestor's institution (e.g., your supervisor if you are a student), to the database administrator by mail, e-mail, or fax.
- Release agreement
- Silhouette sample (password: sample)
- Dataset: OULP-C1V2 (latest version)
- Dataset: OULP-C1V1
- Protocols and benchmarks for cross-view gait recognition
The current version of the subject ID list (format version 1.0) does not include subjects' age and gender information. The complete information, including age and gender, will be released in the next format version.
It has often been reported that the Windows built-in zip tool fails to extract all the files. Please try another unzip utility if necessary.
The database administrator
Department of Intelligent Media, The Institute of Scientific and Industrial Research, Osaka University
Address: 8-1 Mihogaoka, Ibaraki, Osaka, 567-0047, JAPAN