Name | n3_lyz_bearcatproducts.com |
---|---|
Data | 1.39M (+ 0B) |
Tables | 8 (+ 0) |
Columns | 30 (+ 0) |
Table Rows | 2,192 (+ 0) |
Media | 34.38M (+ 0B) |
Files | 471 (+ 0) |
Last Commit | 2023-06-19 18:12:05 (+ 169 d) |
This is a production database of home garden equipment and cleaning tools, with 159 product records across 14 categories. Each product record consists of a title and a category. Across the database there are 189 product specification records, 147 product information records, 460 product FAQ records, and 545 image URL records in the corresponding tables.
The 14 categories are chippers, shredders, wheeled trimmers, pressure washers, generators, PTO machines, trash pumps, water pumps, debris loaders, wheeled vacuums, skid steer chippers, log splitters, stump grinders and options & accessories.
The whole home cleaning tools and garden equipment database has 8 tables; a query sketch over the implied relationships follows the table below.
Tables | Rows | Columns | Non-empty |
---|---|---|---|
category | 14 | title | 100% |
image_slug | 545 | production_id | 100% |
production | 159 | title | 100% |
 | | category_id | 100% |
production_faq | 460 | question | 100% |
 | | answer | 100% |
 | | production_id | 100% |
production_infor | 147 | description | 100% |
 | | production_id | 100% |
production_speci | 189 | main_key | 100% |
 | | production_id | 100% |
speci_value | 678 | key | 100% |
 | | value | 100% |
 | | production_speci_id | 100% |
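The column listing above implies a simple relational layout: production references category via category_id, while image_slug, production_faq, production_infor, and production_speci reference production via production_id, and speci_value references production_speci via production_speci_id. The sketch below shows one way those relationships could be traversed once the dump is imported locally; it assumes a SQLite copy of the data, and the file name and the use of rowid as the join key are assumptions, not part of the published schema.

```python
import sqlite3

# Hypothetical local import of the dump; the file name is an assumption.
conn = sqlite3.connect("bearcatproducts.db")
conn.row_factory = sqlite3.Row

# Join each product to its category and count its FAQ, spec, and image records.
# Table and column names follow the listing above; the rowid join keys are assumed.
query = """
SELECT p.title,
       c.title                  AS category,
       COUNT(DISTINCT f.rowid)  AS faq_count,
       COUNT(DISTINCT s.rowid)  AS spec_count,
       COUNT(DISTINCT i.rowid)  AS image_count
FROM production p
JOIN category c              ON c.rowid = p.category_id
LEFT JOIN production_faq f   ON f.production_id = p.rowid
LEFT JOIN production_speci s ON s.production_id = p.rowid
LEFT JOIN image_slug i       ON i.production_id = p.rowid
GROUP BY p.rowid
ORDER BY c.title, p.title;
"""

for row in conn.execute(query):
    print(dict(row))
```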
Size | Files |
---|---|
34.38M | 471 |
Time | Data | Tables | Columns | Rows | Media | Files |
---|---|---|---|---|---|---|
2023-06-19 (+ 169 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2023-01-01 (+ 95 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2022-09-27 (+ 271 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2021-12-30 (+ 115 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2021-09-05 (+ 150 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2021-04-08 (+ 68 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2021-01-29 (+ 120 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2020-10-01 (+ 38 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2020-08-23 (+ 141 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2020-04-04 (+ 196 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2019-09-21 (+ 197 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2019-03-07 (+ 43 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2019-01-23 (+ 44 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-12-09 (+ 29 d) | 1.39M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-11-09 (+ 27 d) | 1.39M (+ 400K) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-10-13 (+ 29 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-09-13 (+ 34 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-08-09 (+ 27 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-07-12 (+ 31 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-06-10 (+ 31 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-05-10 (+ 28 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-04-11 (+ 38 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-03-03 (+ 35 d) | 1M (+ 0B) | 8 (+ 0) | 30 (+ 0) | 2,192 (+ 0) | 34.38M (+ 0B) | 471 (+ 0) |
2018-01-27 | 1M | 8 | 30 | 2,192 | 34.38M | 471 |
Contact us for pricing to download the latest commit / release of this database.
At the same time, you can also access this data set via the API.
Select a membership plan and sign up. Return to this page, then click Online Query to open the API query maker. Create and run an API call to acquire the data; a generic example of such a call is sketched below.
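The API's actual endpoint, authentication scheme, and parameter names are not documented on this page, so the following is only a minimal sketch of what such a call could look like. The URL, token header, and query parameters are placeholders, not the provider's real interface; the real values come from the Online Query maker after signing up.

```python
import json
import urllib.parse
import urllib.request

# All of the following values are placeholders for illustration only.
BASE_URL = "https://example.com/api/v1/query"   # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"                    # issued after sign-up

params = {
    "dataset": "n3_lyz_bearcatproducts.com",    # dataset name from this page
    "table": "production",                      # one of the 8 tables above
    "limit": 50,
}

request = urllib.request.Request(
    f"{BASE_URL}?{urllib.parse.urlencode(params)}",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)

# Fetch and decode the response, assuming the API returns JSON rows.
with urllib.request.urlopen(request) as response:
    rows = json.load(response)

print(f"fetched {len(rows)} rows from the production table")
```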
Data size increments are scanned and profiled separately from row-count increments.
Subscribe to be notified of major data releases and updates.