Name | n3a2_01hr_com |
---|---|
Data | 318.62M (+ 0B) |
Tables | 5 (+ 0) |
Columns | 48 (+ 0) |
Table Rows | 587,406 (+ 0) |
Media | 0B (+ 0B) |
Files | 0 (+ 0) |
Last Commit | 2023-06-19 19:07:11 (+ 169 d) |
This recruitment database contains 5 tables covering 138,997 job postings offered by 37,933 companies across 484 regions of China. Each job posting carries a title, job description (zhi_wei_miao_shu), education requirement (xue_li_yao_qiu), years of work experience (gong_zuo_nian_xian), salary, employment type (gong_zuo_xing_zhi), posting date (fa_bu_shi_jian), and other fields. The company table holds the company name (title), company profile (gong_si_jian_jie), region, and related details.
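The junction tables company_x_job and region_x_job tie the postings to companies and regions, so most analyses start from a join across them. A minimal sketch, assuming the dump has been imported into a local SQLite file named jobs.db and that each table carries an integer primary key named id (both are assumptions, not documented in this profile):

```python
import sqlite3

# Assumptions: the dump was imported into a local SQLite file named jobs.db,
# and each table has an integer primary key column named id.
conn = sqlite3.connect("jobs.db")

# Count job postings per region by walking the region_x_job junction table.
query = """
SELECT r.title AS region, COUNT(x.job_id) AS postings
FROM region AS r
JOIN region_x_job AS x ON x.region_id = r.id
GROUP BY r.title
ORDER BY postings DESC
LIMIT 10;
"""
for region, postings in conn.execute(query):
    print(f"{region}: {postings}")

conn.close()
```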
Table | Rows | Column | Non-empty |
---|---|---|---|
company | 37,933 | title | 100% |
company | 37,933 | gong_si_jian_jie | 98.15% |
company | 37,933 | region | 99.9% |
company | 37,933 | gong_si_di_zhi | 97.42% |
company | 37,933 | gong_si_hang_ye | 0% |
company | 37,933 | gong_si_gui_mo | 0% |
company | 37,933 | gong_si_xing_zhi | 0% |
company | 37,933 | gong_si_guan_wang | 0% |
company | 37,933 | telphone | 0% |
company | 37,933 | lian_xi_ren | 0% |
company | 37,933 | chuan_zhen | 0% |
company | 37,933 | | 0% |
company | 37,933 | zipcode | 0% |
company | 37,933 | address | 0% |
company | 37,933 | latitude | 0% |
company | 37,933 | longitude | 0% |
company_x_job | 270,702 | company_id | 100% |
company_x_job | 270,702 | job_id | 100% |
job | 138,997 | title | 100% |
job | 138,997 | zhi_wei_miao_shu | 94.75% |
job | 138,997 | xue_li_yao_qiu | 100% |
job | 138,997 | gong_zuo_nian_xian | 100% |
job | 138,997 | salary | 100% |
job | 138,997 | salary_from | 87.54% |
job | 138,997 | salary_to | 87.54% |
job | 138,997 | gong_zuo_xing_zhi | 100% |
job | 138,997 | fa_bu_shi_jian | 100% |
job | 138,997 | district | 94.75% |
job | 138,997 | zhao_pin_ren_shu | 87.31% |
job | 138,997 | zhi_wei_fen_lei | 94.75% |
region | 484 | title | 100% |
region_x_job | 139,290 | region_id | 100% |
region_x_job | 139,290 | job_id | 100% |
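Because salary_from and salary_to are populated for only about 87.5% of job rows (and zhi_wei_miao_shu for about 94.75%), numeric salary analyses should filter out empty values first. Continuing the same assumed SQLite import, a hedged sketch of an average-salary-by-company query:

```python
import sqlite3

conn = sqlite3.connect("jobs.db")  # assumed SQLite import, as above

# Average advertised salary range per company, skipping the roughly 12.5% of
# job rows whose numeric salary bounds are empty. Depending on how the dump
# was imported, empty values may be NULL or empty strings, so both are excluded.
query = """
SELECT c.title            AS company,
       AVG(j.salary_from) AS avg_from,
       AVG(j.salary_to)   AS avg_to,
       COUNT(*)           AS postings
FROM job AS j
JOIN company_x_job AS x ON x.job_id = j.id        -- id primary keys are assumed
JOIN company       AS c ON c.id = x.company_id
WHERE j.salary_from IS NOT NULL AND j.salary_from <> ''
  AND j.salary_to   IS NOT NULL AND j.salary_to   <> ''
GROUP BY c.title
HAVING COUNT(*) >= 20
ORDER BY avg_to DESC
LIMIT 10;
"""
for company, avg_from, avg_to, postings in conn.execute(query):
    print(f"{company}: {avg_from:.0f}-{avg_to:.0f} ({postings} postings)")

conn.close()
```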
No media sets.
Time | Data | Tables | Columns | Rows | Media | Files |
---|---|---|---|---|---|---|
2023-06-19 (+ 169 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2023-01-01 (+ 95 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2022-09-27 (+ 271 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2021-12-30 (+ 115 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2021-09-05 (+ 150 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2021-04-08 (+ 68 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2021-01-29 (+ 120 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2020-10-01 (+ 38 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2020-08-23 (+ 141 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2020-04-04 (+ 196 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2019-09-21 (+ 197 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2019-03-07 (+ 43 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2019-01-23 (+ 44 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-12-09 (+ 29 d) | 318.62M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-11-09 (+ 27 d) | 318.62M (+ 240.28M) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-10-13 (+ 29 d) | 78.34M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-09-13 (+ 34 d) | 78.34M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-08-09 (+ 27 d) | 78.34M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-07-12 (+ 31 d) | 78.34M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-06-10 (+ 31 d) | 78.34M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-05-10 (+ 28 d) | 78.34M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-04-11 (+ 38 d) | 78.34M (+ 0B) | 5 (+ 0) | 48 (+ 0) | 587,406 (+ 0) | 0B (+ 0B) | 0 (+ 0) |
2018-03-03 | 78.34M | 5 | 48 | 587,406 | 0B | 0 |
Contact us for pricing to download the latest commit / release of this database.
At the same time, you can also access this data set via the API.
Select a membership plan and sign up, then return to this page and click Online Query to access the API query maker. Create and open an API call to acquire the data.
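As a rough illustration only: the actual endpoint URL, authentication header, and parameter names are defined by the API query maker and are not documented in this profile, so every value in the sketch below is a placeholder.

```python
import requests

# Placeholders only: copy the real endpoint, token, and parameters from the
# call generated by the Online Query / API query maker page.
BASE_URL = "https://example.com/api/v1/query"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"                   # issued with the membership plan

resp = requests.get(
    BASE_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={"database": "n3a2_01hr_com", "table": "job", "limit": 100},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json():
    print(record)
```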
Data size increments are scanned and profiled separately from row-count increments.
Subscribe to be notified of major data releases and updates.