312 Commits

Author SHA1 Message Date
Ajay Ramachandran
fcd0fb7ac7 Merge pull request #616 from ajayyy/dependabot/npm_and_yarn/js-yaml-3.14.2
Bump js-yaml from 3.14.1 to 3.14.2
2025-11-17 16:22:24 -05:00
dependabot[bot]
b97b50a8f6 Bump js-yaml from 3.14.1 to 3.14.2
Bumps [js-yaml](https://github.com/nodeca/js-yaml) from 3.14.1 to 3.14.2.
- [Changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md)
- [Commits](https://github.com/nodeca/js-yaml/compare/3.14.1...3.14.2)

---
updated-dependencies:
- dependency-name: js-yaml
  dependency-version: 3.14.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-17 17:48:52 +00:00
Ajay
6d5b6dd3ae Add service test case for get skip segments 2025-10-14 03:46:27 -04:00
Ajay
0412386870 Add support for spotify service 2025-10-14 03:34:02 -04:00
Ajay
1eedc9fa09 Remove old test 2025-10-03 15:13:31 -04:00
Ajay
c1fc6519b4 Remove username restriction 2025-10-03 15:09:02 -04:00
Ajay
2d5d3637fd Fix errors 2025-09-30 23:02:16 -04:00
Ajay
99ed7698c4 Handle trimmed UUID duplicates 2025-09-30 22:52:32 -04:00
Ajay Ramachandran
c0ee5206a2 Merge pull request #613 from ajayyy/dependabot/npm_and_yarn/tar-fs-2.1.4
Bump tar-fs from 2.1.3 to 2.1.4
2025-09-26 23:58:45 -04:00
dependabot[bot]
9c65f3ca34 Bump tar-fs from 2.1.3 to 2.1.4
Bumps [tar-fs](https://github.com/mafintosh/tar-fs) from 2.1.3 to 2.1.4.
- [Commits](https://github.com/mafintosh/tar-fs/compare/v2.1.3...v2.1.4)

---
updated-dependencies:
- dependency-name: tar-fs
  dependency-version: 2.1.4
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-26 19:54:27 +00:00
Ajay Ramachandran
c2c92cd168 Merge pull request #610 from mini-bomba/warning-history
Keep a history of warnings in the public database
2025-09-19 02:15:59 -04:00
Ajay Ramachandran
b4ea2018d5 Merge pull request #596 from ajayyy/dependabot/npm_and_yarn/multi-456de2e4f1
Bump serialize-javascript and mocha
2025-09-18 16:08:32 -04:00
dependabot[bot]
da448af4cf Bump serialize-javascript and mocha
Bumps [serialize-javascript](https://github.com/yahoo/serialize-javascript) to 6.0.2 and updates ancestor dependency [mocha](https://github.com/mochajs/mocha). These dependencies need to be updated together.


Updates `serialize-javascript` from 6.0.0 to 6.0.2
- [Release notes](https://github.com/yahoo/serialize-javascript/releases)
- [Commits](https://github.com/yahoo/serialize-javascript/compare/v6.0.0...v6.0.2)

Updates `mocha` from 10.1.0 to 10.8.2
- [Release notes](https://github.com/mochajs/mocha/releases)
- [Changelog](https://github.com/mochajs/mocha/blob/main/CHANGELOG.md)
- [Commits](https://github.com/mochajs/mocha/compare/v10.1.0...v10.8.2)

---
updated-dependencies:
- dependency-name: serialize-javascript
  dependency-type: indirect
- dependency-name: mocha
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-18 19:34:12 +00:00
Ajay Ramachandran
33a7934f33 Merge pull request #607 from ajayyy/dependabot/npm_and_yarn/form-data-4.0.4
Bump form-data from 4.0.0 to 4.0.4
2025-09-18 15:32:47 -04:00
Ajay Ramachandran
a2cad19167 Merge pull request #612 from ajayyy/dependabot/npm_and_yarn/multi-e981fcb12d
Bump path-to-regexp and express
2025-09-18 15:32:39 -04:00
dependabot[bot]
721720a60d Bump path-to-regexp and express
Bumps [path-to-regexp](https://github.com/pillarjs/path-to-regexp) to 1.9.0 and updates ancestor dependency [express](https://github.com/expressjs/express). These dependencies need to be updated together.


Updates `path-to-regexp` from 1.8.0 to 1.9.0
- [Release notes](https://github.com/pillarjs/path-to-regexp/releases)
- [Changelog](https://github.com/pillarjs/path-to-regexp/blob/master/History.md)
- [Commits](https://github.com/pillarjs/path-to-regexp/compare/v1.8.0...v1.9.0)

Updates `express` from 4.21.1 to 4.21.2
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.2/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.1...4.21.2)

---
updated-dependencies:
- dependency-name: path-to-regexp
  dependency-version: 1.9.0
  dependency-type: indirect
- dependency-name: express
  dependency-version: 4.21.2
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-13 13:49:39 +00:00
Ajay Ramachandran
220fe52013 Merge pull request #611 from ajayyy/dependabot/npm_and_yarn/axios-1.12.1
Bump axios from 1.8.4 to 1.12.1
2025-09-13 09:48:30 -04:00
dependabot[bot]
07c0f5cfbd Bump axios from 1.8.4 to 1.12.1
Bumps [axios](https://github.com/axios/axios) from 1.8.4 to 1.12.1.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.8.4...v1.12.1)

---
updated-dependencies:
- dependency-name: axios
  dependency-version: 1.12.1
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-13 12:38:31 +00:00
mini-bomba
899000309f make eslint scream about promises, then fix all lints
also rewrite a bunch of test suites from using done callbacks to using
async functions - it's way too easy to forget about a .catch() clause
2025-09-11 01:14:40 +02:00
mini-bomba
5664ff4f58 fix missed awaits for db.prepare in test cases 2025-09-10 23:24:44 +02:00
mini-bomba
c942eea640 autogenerate userids for the postwarning test suite 2025-09-10 22:59:54 +02:00
mini-bomba
b09e552d1d add disableTime column to the warnings table 2025-09-10 22:50:43 +02:00
mini-bomba
3e74a0da58 Remove warning expiry, save warning history 2025-09-10 18:54:56 +02:00
mini-bomba
1b99a8534c type IDatabase::prepare with overloads 2025-09-10 17:08:39 +02:00
Ajay
3711286ef2 Fix old xss prevention only removing first less than symbol 2025-07-30 01:26:02 -04:00
Ajay
74b9b123a8 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2025-07-25 02:15:29 -04:00
Ajay
367cb24478 Add hook category 2025-07-25 02:15:27 -04:00
dependabot[bot]
41c91b8b03 Bump form-data from 4.0.0 to 4.0.4
Bumps [form-data](https://github.com/form-data/form-data) from 4.0.0 to 4.0.4.
- [Release notes](https://github.com/form-data/form-data/releases)
- [Changelog](https://github.com/form-data/form-data/blob/master/CHANGELOG.md)
- [Commits](https://github.com/form-data/form-data/compare/v4.0.0...v4.0.4)

---
updated-dependencies:
- dependency-name: form-data
  dependency-version: 4.0.4
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-21 23:11:19 +00:00
Ajay Ramachandran
40c4ec7437 Merge pull request #606 from ajayyy/dependabot/npm_and_yarn/tar-fs-2.1.3
Bump tar-fs from 2.1.2 to 2.1.3
2025-06-03 14:40:36 -04:00
dependabot[bot]
70ce320737 Bump tar-fs from 2.1.2 to 2.1.3
Bumps [tar-fs](https://github.com/mafintosh/tar-fs) from 2.1.2 to 2.1.3.
- [Commits](https://github.com/mafintosh/tar-fs/commits)

---
updated-dependencies:
- dependency-name: tar-fs
  dependency-version: 2.1.3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-03 09:19:22 +00:00
Ajay Ramachandran
0bfc9b30f5 Merge pull request #605 from mini-bomba/fix/banned-webhooks
don't send dearrow webhooks for banned users
2025-05-29 12:09:59 -04:00
mini-bomba
bce5385864 shortcircuit the new user check for banned users 2025-05-26 16:41:55 +02:00
mini-bomba
f71c4ceba9 don't send dearrow webhooks for banned users 2025-05-25 22:46:23 +02:00
Ajay
69ca711bb3 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2025-04-28 19:17:55 -04:00
Ajay
e519986027 Fix dearrow hiding 2025-04-28 19:17:54 -04:00
Ajay Ramachandran
c0e7401a73 Merge pull request #604 from mini-bomba/limit-usernames
Limit username creation
2025-04-28 19:04:03 -04:00
Ajay
314461c9f0 Fix old user check 2025-04-28 19:03:24 -04:00
mini-bomba
655789e62d Limit username creation 2025-04-29 00:10:20 +02:00
Ajay
339ba127eb Fix log 2025-04-28 02:59:15 -04:00
Ajay
da393da9e9 Fix log 2025-04-28 02:56:36 -04:00
Ajay
aa2c7bf6ea Fix log 2025-04-28 02:41:34 -04:00
Ajay
c82708aae8 Fix log 2025-04-28 02:40:57 -04:00
Ajay
26c575d37a Add log 2025-04-28 02:32:17 -04:00
Ajay Ramachandran
1b3b1b1cb3 Merge pull request #603 from mini-bomba/request-validator-rule-names
Add request validator rule names
2025-04-26 01:46:42 -04:00
mini-bomba
9bc4bf8c7b Add request validator rule names 2025-04-26 02:10:33 +02:00
Ajay
cbc38c5ac8 Add another logging webhook 2025-04-25 17:43:01 -04:00
Ajay Ramachandran
e7f3753077 Merge pull request #602 from mini-bomba/request-validator
Create an engine for rule-based request validation
2025-04-25 17:37:25 -04:00
mini-bomba
f44d3cd92c rephrase old rejection logs 2025-04-25 21:52:39 +02:00
mini-bomba
4db4e9458e hook up extra functions to the request validator 2025-04-25 21:52:39 +02:00
mini-bomba
b2cd048909 load request validator rules from env as json 2025-04-25 21:11:30 +02:00
mini-bomba
5c249fb02b test cases for the request validator engine 2025-04-25 21:11:30 +02:00
mini-bomba
f7e5394a18 create a request validator engine 2025-04-25 21:11:30 +02:00
Ajay
161db6df0c Don't error if failing to parse vanced ua 2025-04-25 13:51:46 -04:00
Ajay
920d288f0b Add title to webhook 2025-04-25 13:00:34 -04:00
Ajay
0d005c23bf Add another validity filter 2025-04-25 12:55:44 -04:00
Ajay
9f745d3a8b Move permission check 2025-04-21 23:50:46 -04:00
Ajay
39f8dc6c22 Fix revanced ua 2025-04-21 20:14:43 -04:00
Ajay
08ba5c21b1 Fix validity check 2025-04-21 19:39:46 -04:00
Ajay
cfd61dc8dd Validity check 2025-04-21 19:26:42 -04:00
Ajay
039fb3ac7a More logs 2025-04-21 12:39:14 -04:00
Ajay
fccebfa487 Fixed webhook again 2025-04-21 11:27:16 -04:00
Ajay
6130ac8150 Change color for dearrow webhook 2025-04-21 11:20:08 -04:00
Ajay
7e681d2cd5 Fix webhook newlines 2025-04-21 11:15:30 -04:00
Ajay
707b36d161 Fix user agent parser lower casing 2025-04-21 11:10:47 -04:00
Ajay
b849328fae More logging 2025-04-21 10:53:49 -04:00
Ajay
3d596f4528 Save user agent for dearrow 2025-04-17 01:05:34 -04:00
Ajay
ed5a397a30 Improve permission check 2025-04-15 02:01:41 -04:00
Ajay
300642fd4f Fix innertube failure handling 2025-04-12 00:44:10 -04:00
Ajay
46580322fc Fix dearrow old submitter check 2025-04-11 02:44:19 -04:00
Ajay
318152dac6 ua 2025-04-11 02:41:17 -04:00
Ajay
8111d34b30 Fix dearrow threshold not configurable 2025-04-10 16:51:46 -04:00
Ajay
ac78dee210 Fix undefined error 2025-04-10 14:38:08 -04:00
Ajay
d18a4a13f2 Check dearrow vote history for new submitters 2025-04-10 12:47:32 -04:00
Ajay
8d40d61efc Allow max users without a submitter threshold 2025-04-10 02:35:22 -04:00
Ajay
74f6224091 Add new user limit per 5 mins 2025-04-10 02:26:09 -04:00
Ajay
9b55dc5d4d Add new config option 2025-04-08 16:52:16 -04:00
Ajay
8cd2138989 Use config for old submitter check 2025-04-08 16:50:04 -04:00
Ajay
e40af45c73 Fix query 2025-04-08 16:43:01 -04:00
Ajay
5de1fe4388 Fix post config url 2025-04-08 16:26:51 -04:00
Ajay
2ef3d68af0 Await promise not being awaited 2025-04-08 16:02:29 -04:00
Ajay
f67244663e Use alias for getting server config 2025-04-08 15:56:23 -04:00
Ajay
00064d5a7c Fix config fetching 2025-04-08 15:46:14 -04:00
Ajay
ac26aed21c Add endpoints for config setting 2025-04-08 15:18:32 -04:00
Ajay
2aa3589312 Add ability to set config 2025-04-08 15:15:39 -04:00
Ajay
82af8f200b Add better ua parsing 2025-04-08 14:21:24 -04:00
Ajay
69cb33aad0 Add logs 2025-04-08 13:23:24 -04:00
Ajay
3817d7fdba Better submission error message 2025-04-08 13:21:01 -04:00
Ajay
34a6a83e44 Change dearrow permission requirements 2025-04-07 19:28:52 -04:00
Ajay
0967373cb2 Rename func 2025-04-07 00:57:48 -04:00
Ajay
b7794b57d0 Fix can vote checks 2025-04-07 00:57:08 -04:00
Ajay
550339db41 Add permission check in more places 2025-04-07 00:36:01 -04:00
Ajay
b69f050b44 Old submitter only 2025-04-07 00:29:53 -04:00
Ajay Ramachandran
59a986f32f Merge pull request #600 from ajayyy/dependabot/npm_and_yarn/multi-b9f445934c
Bump semver and nodemon
2025-03-29 15:07:03 -04:00
dependabot[bot]
7088a1688d Bump semver and nodemon
Bumps [semver](https://github.com/npm/node-semver) to 7.7.1 and updates ancestor dependencies [semver](https://github.com/npm/node-semver) and [nodemon](https://github.com/remy/nodemon). These dependencies need to be updated together.


Updates `semver` from 7.3.7 to 7.7.1
- [Release notes](https://github.com/npm/node-semver/releases)
- [Changelog](https://github.com/npm/node-semver/blob/main/CHANGELOG.md)
- [Commits](https://github.com/npm/node-semver/compare/v7.3.7...v7.7.1)

Updates `semver` from 6.3.0 to 7.7.1
- [Release notes](https://github.com/npm/node-semver/releases)
- [Changelog](https://github.com/npm/node-semver/blob/main/CHANGELOG.md)
- [Commits](https://github.com/npm/node-semver/compare/v7.3.7...v7.7.1)

Updates `nodemon` from 2.0.20 to 3.1.9
- [Release notes](https://github.com/remy/nodemon/releases)
- [Commits](https://github.com/remy/nodemon/compare/v2.0.20...v3.1.9)

---
updated-dependencies:
- dependency-name: semver
  dependency-type: indirect
- dependency-name: semver
  dependency-type: indirect
- dependency-name: nodemon
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-29 04:55:45 +00:00
Ajay Ramachandran
f07f94fb09 Merge pull request #598 from ajayyy/dependabot/npm_and_yarn/tar-fs-2.1.2
Bump tar-fs from 2.1.1 to 2.1.2
2025-03-29 00:54:48 -04:00
Ajay Ramachandran
a7758a2608 Merge pull request #599 from ajayyy/dependabot/npm_and_yarn/axios-1.8.4
Bump axios from 1.7.7 to 1.8.4
2025-03-29 00:54:40 -04:00
dependabot[bot]
fd5bc43281 Bump axios from 1.7.7 to 1.8.4
Bumps [axios](https://github.com/axios/axios) from 1.7.7 to 1.8.4.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.7.7...v1.8.4)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-29 04:40:49 +00:00
dependabot[bot]
3633d0fbb4 Bump tar-fs from 2.1.1 to 2.1.2
Bumps [tar-fs](https://github.com/mafintosh/tar-fs) from 2.1.1 to 2.1.2.
- [Commits](https://github.com/mafintosh/tar-fs/compare/v2.1.1...v2.1.2)

---
updated-dependencies:
- dependency-name: tar-fs
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-28 22:28:04 +00:00
Ajay
aae56887da Update gitignore 2025-03-12 02:46:05 -04:00
Ajay
4fe7cebcb3 Add caching for 5 length skip skip segment query 2025-03-12 02:45:59 -04:00
Ajay
31e678fdc2 Store titles for casual vote submissions
When an uploader changes the title, it will reset the casual votes
2025-02-17 03:16:57 -05:00
Ajay
d44ce3c2dc Add casual votes table export 2025-02-16 14:43:55 -05:00
Ajay
5f9b4c8acc Make casual downvotes apply to all categories 2025-02-13 04:03:38 -05:00
Ajay
d608125b41 Add endpoint for casual submission count 2025-02-12 03:52:03 -05:00
Ajay
fb3abb3216 Fix index for casual votes 2025-02-06 03:01:22 -05:00
Ajay
ccde64e90f Change casual submission to allow submitting multiple categories 2025-02-06 02:57:09 -05:00
Ajay
4abf57b0ce Save casual mode status in db 2025-02-06 02:51:13 -05:00
Ajay
07435b9af1 Add casual mode endpoint 2025-02-05 03:38:55 -05:00
Ajay Ramachandran
ab9cab8ff5 Merge pull request #582 from hanydd/dev_join
Change reduce to join function for simplicity
2025-02-03 20:55:55 -05:00
Ajay Ramachandran
311c653ea2 Merge pull request #592 from Choromanski/feature/node-deprecation
Upgraded github actions dependencies
2025-02-03 20:54:48 -05:00
Ajay Ramachandran
e92d47e1a4 Merge pull request #593 from ajayyy/dependabot/npm_and_yarn/multi-9f37c16f8f
Bump cookie and express
2025-02-03 20:54:22 -05:00
Ajay Ramachandran
3734b88cb5 Merge pull request #595 from mchangrh/patch-1
bump & lock rsync dockerfile
2025-01-18 12:54:45 -05:00
Michael M. Chang
00086d9001 bump & lock rsync dockerfile 2025-01-18 06:45:09 -08:00
Ajay
a37a552b17 Fix video labels keys not clearing properly 2025-01-18 03:32:55 -05:00
Ajay
fa29cfd3c6 Add endpoint to get segment ID 2025-01-18 02:56:57 -05:00
Ajay
be9d97ae2b Add option to trim UUIDs in skip segments endpoint 2025-01-18 02:09:46 -05:00
Ajay
06f83cd8d4 Allow voting and viewing with partial UUID 2025-01-18 02:04:27 -05:00
Ajay
80b1019783 Allow video labels cashing with prefix of 4 2025-01-18 00:22:17 -05:00
Ajay
2455d2cd7e Make hasStartSegment result optional 2025-01-17 23:59:17 -05:00
Ajay
e2a9976cd0 Add hasStartSegment to video label 2025-01-17 23:30:32 -05:00
Ajay
bba06511ce Remove unnecessary parts of video labels request 2025-01-17 04:38:08 -05:00
Ajay
043268dc10 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-12-04 14:49:36 -05:00
Ajay
003fe77e72 Change backup retention 2024-12-04 14:49:34 -05:00
Ajay Ramachandran
efe59c5098 Merge pull request #594 from mini-bomba/dearrow_locked_vip_downvotes
Send a different message for VIP downvotes on locked titles
2024-11-14 19:56:02 -05:00
Ajay Ramachandran
7ef6452eb5 Double quote 2024-11-14 19:49:25 -05:00
mini-bomba
9c01b711a5 Send a different message for VIP downvotes on locked titles 2024-11-14 22:30:14 +01:00
Ajay
b2981fe782 Don't allow multiple downvotes on one submission 2024-11-10 15:21:40 -05:00
Ajay
405805ff89 Add check against missing api video detail failing to fetch 2024-11-07 02:24:39 -05:00
Ajay
01c306287a Fix axios error handling 2024-11-06 21:23:15 -05:00
Ajay
826d49ba1f Add support for floatie proxy 2024-11-04 15:04:43 -05:00
Ajay
b03057c5bf Fix redis cache metrics generation 2024-10-30 02:35:59 -04:00
Ajay
54e03a389b Remove string from metrics 2024-10-30 02:32:33 -04:00
Ajay
93f7161724 Only uploaded warned info for upvotes 2024-10-27 02:17:34 -04:00
dependabot[bot]
efa6c10d56 Bump cookie and express
Bumps [cookie](https://github.com/jshttp/cookie) to 0.7.1 and updates ancestor dependency [express](https://github.com/expressjs/express). These dependencies need to be updated together.


Updates `cookie` from 0.6.0 to 0.7.1
- [Release notes](https://github.com/jshttp/cookie/releases)
- [Commits](https://github.com/jshttp/cookie/compare/v0.6.0...v0.7.1)

Updates `express` from 4.21.0 to 4.21.1
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.1/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.0...4.21.1)

---
updated-dependencies:
- dependency-name: cookie
  dependency-type: indirect
- dependency-name: express
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-19 12:03:41 +00:00
Ajay
e9c0c44528 Add new webhook for was warned 2024-10-18 03:08:24 -04:00
Brian Choromanski
4dfbb9039d Upgraded more actions dependencies 2024-10-08 21:17:01 -04:00
Brian Choromanski
05c5cf57e4 Upgraded actions dependencies 2024-10-08 21:10:10 -04:00
Ajay
566eabdc31 Add metrics endpoint 2024-10-02 20:06:57 -04:00
Ajay Ramachandran
f26db7238a Merge pull request #591 from ajayyy/dependabot/npm_and_yarn/multi-d66d039ac5
Bump serve-static and express
2024-09-17 02:30:27 -04:00
dependabot[bot]
fb05ec51d3 Bump serve-static and express
Bumps [serve-static](https://github.com/expressjs/serve-static) to 1.16.2 and updates ancestor dependency [express](https://github.com/expressjs/express). These dependencies need to be updated together.


Updates `serve-static` from 1.15.0 to 1.16.2
- [Release notes](https://github.com/expressjs/serve-static/releases)
- [Changelog](https://github.com/expressjs/serve-static/blob/v1.16.2/HISTORY.md)
- [Commits](https://github.com/expressjs/serve-static/compare/v1.15.0...v1.16.2)

Updates `express` from 4.19.2 to 4.21.0
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.0/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.19.2...4.21.0)

---
updated-dependencies:
- dependency-name: serve-static
  dependency-type: indirect
- dependency-name: express
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-17 06:18:07 +00:00
Ajay
eeb9f1b02f Log more when redis increment fails 2024-09-15 04:30:44 -04:00
Ajay
8ba68e1b4c One less call when dealing with lru cache for ttl result and ensure reset keys cleared 2024-09-14 18:02:22 -04:00
Ajay
17059fdbe6 One less call when dealing with lru cache 2024-09-14 17:52:50 -04:00
Ajay
6e5f4f7610 Fix active requests list not getting deleted 2024-09-14 17:33:17 -04:00
Ajay
c313590d36 persona's revenge 2024-09-13 21:51:06 -04:00
Ajay
4508ad11f2 Fix error when submitter ip not found 2024-09-13 14:37:37 -04:00
Ajay
dc5158257e Fix errors when postgres returns undefined and trying to save to redis 2024-09-13 14:36:52 -04:00
Ajay
6edd71194b Log redis stats on high db load 2024-09-13 14:29:32 -04:00
Ajay
7678be1e24 Add max redis response time for reads 2024-09-13 04:06:50 -04:00
Ajay
d28ac39d4f Allow newly used header 2024-09-07 23:31:16 -04:00
Ajay
5fd6b5eb8b Fix canSubmitOriginal query on postgres 2024-09-01 19:29:08 -04:00
Ajay
0e1a38c4d4 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-09-01 19:01:00 -04:00
Ajay
c496be5651 Disable innertube tests while they are broken 2024-09-01 19:00:59 -04:00
Ajay Ramachandran
15a9c3a4eb Merge pull request #583 from ajayyy/dependabot/npm_and_yarn/braces-3.0.3
Bump braces from 3.0.2 to 3.0.3
2024-09-01 18:57:32 -04:00
Ajay Ramachandran
f1ebd56526 Merge pull request #590 from ajayyy/dependabot/npm_and_yarn/axios-1.7.7
Bump axios from 1.6.0 to 1.7.7
2024-09-01 18:57:18 -04:00
Ajay
258749ac31 Add more strict requirements for voting for original thumbnails 2024-09-01 18:56:29 -04:00
dependabot[bot]
ccccb1af3c Bump axios from 1.6.0 to 1.7.7
Bumps [axios](https://github.com/axios/axios) from 1.6.0 to 1.7.7.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.6.0...v1.7.7)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-01 22:43:59 +00:00
Ajay
13b8a988db update dep 2024-09-01 18:43:28 -04:00
Ajay
803fc18554 Verify old submissions right after someone votes on it 2024-08-16 00:36:42 -04:00
Ajay
59373cf346 Fix rejected server-side rendered ads issue not rejecting 2024-08-14 23:42:47 -04:00
Ajay
05fd6abe91 Use env vars in workflow 2024-08-12 01:04:12 -04:00
Ajay
090e185765 Add support for poToken and visitor data
Fixes api requests

https://github.com/iv-org/invidious/pull/4789
2024-08-12 00:33:11 -04:00
Ajay
d2df5cef98 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-08-11 13:49:48 -04:00
Ajay
214946897d Hardcode nn-block reputation 2024-08-11 13:49:46 -04:00
Ajay Ramachandran
8da5de4d7b Merge pull request #588 from mini-bomba/dearrow-locked-titles-downvotes
postBranding.ts webhook changes
2024-08-04 09:38:24 +02:00
mini-bomba
380ec8d0ca Reformat SQL code in postBranding.ts webhook code 2024-08-03 22:01:46 +02:00
mini-bomba
72086b0195 Send webhook messages when a locked title is downvoted
also take downvotes & verification into consideration when comparing
titles in webhook code
2024-08-03 21:56:31 +02:00
mini-bomba
61dcfeb69f Don't send to #dearrow-locked-titles when downvoting unlocked title
voteType passed to sendWebhooks() function to avoid confusion in the
future should someone forget about the if statement
2024-08-03 21:39:21 +02:00
Ajay Ramachandran
19d6d85aa6 Merge pull request #589 from mini-bomba/tests-fix
fix postgres+redis tests
2024-08-03 21:32:18 +02:00
mini-bomba
814ceb56f1 fix postgres+redis tests
made on request
https://discord.com/channels/603643120093233162/607338052221665320/1269373542550470730
2024-08-03 21:23:44 +02:00
Ajay Ramachandran
195cc14d25 Merge pull request #585 from mini-bomba/unrelated_chapter_suggestions
Don't show completely unrelated chapter suggestions
2024-08-03 21:19:59 +02:00
Ajay Ramachandran
9427bf4f3d Merge pull request #586 from TristanWasTaken/db-schema
docs: fix typos in DatabaseSchema.md
2024-08-03 08:00:47 +02:00
mini-bomba
3f026409cd Don't show completely unrelated chapter suggestions
Chapter suggestions should be at least slightly related to what the user
has already typed.
This change stops the server from sending suggestions that postgresql
deems to be "less than 10% similar"

Also modified tests to reflect this change.
2024-07-29 02:26:53 +02:00
Ajay
d75b9ddcaa Show failure reason in webook 2024-07-24 13:42:40 -04:00
Ajay
2fb3d05055 private video? 2024-07-24 13:06:19 -04:00
Ajay
165ed8a6e0 Fix original thumbnail votes being shown because of fetch all 2024-07-09 19:49:37 -04:00
Ajay
495b8031e3 Add better logging for failed reputation call 2024-06-30 09:40:25 -04:00
HanYaodong
374ddc74bd Use join function for simplicity 2024-06-25 21:33:47 +08:00
Ajay
738f863581 Don't send server-side render error for title submissions 2024-06-25 14:36:05 +05:30
Tristan
8b5e69f36f docs: fix typos in DatabaseSchema.md 2024-06-24 03:14:05 +02:00
Ajay
10e37824d8 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-06-21 15:43:30 +05:30
Ajay
428343e7d8 Require a vote for original to show 2024-06-21 15:43:26 +05:30
Ajay Ramachandran
4e69ac60bc Merge pull request #584 from TristanWasTaken/db
docs: update DatabaseSchema.md
2024-06-21 09:00:52 +05:30
Tristan
3b03792903 docs: fix userFeatures md list 2024-06-21 03:17:31 +02:00
Tristan
1a0b6ab097 Update DatabaseSchema.md 2024-06-21 03:15:07 +02:00
Tristan
8e5084cd72 docs: update private schemas 2024-06-21 03:11:28 +02:00
Tristan
96feaf3cbe docs: update public schemas 2024-06-21 03:08:38 +02:00
Tristan
d08cfee5b4 docs: update private indexes 2024-06-21 01:35:38 +02:00
Tristan
96dd9eceb3 docs: update public indexes 2024-06-21 01:34:25 +02:00
Tristan
4422104294 docs: format lists 2024-06-21 01:34:15 +02:00
Tristan
4ad553478b chore: fix misleading/unclear migration comments 2024-06-21 00:50:10 +02:00
dependabot[bot]
47323156c1 Bump braces from 3.0.2 to 3.0.3
Bumps [braces](https://github.com/micromatch/braces) from 3.0.2 to 3.0.3.
- [Changelog](https://github.com/micromatch/braces/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/braces/compare/3.0.2...3.0.3)

---
updated-dependencies:
- dependency-name: braces
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-06-16 11:24:49 +00:00
Ajay
a181d52fb2 Fix types 2024-06-12 12:01:40 +05:30
Ajay
ee9ed6af1f Add server-side ads check for dearrow submissions 2024-06-12 11:57:59 +05:30
Ajay
ec1e6d63a4 Add protection against server-side ad injection (SSAP) 2024-06-12 09:55:41 +05:30
Ajay
5c10e071dc Change how video duration check works for submissions 2024-05-27 13:54:02 -04:00
Ajay
8eb6f5b2ea Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-05-07 15:37:59 -04:00
Ajay
bdfe4938d2 Fix vote webhook not working 2024-05-07 15:37:57 -04:00
Ajay Ramachandran
bcf29e4047 Merge pull request #579 from ajayyy/dependabot/npm_and_yarn/express-4.19.2
Bump express from 4.18.2 to 4.19.2
2024-05-05 01:28:41 -04:00
Ajay Ramachandran
622c3f27d6 Merge pull request #581 from mini-bomba/videoduration-inconsistency
Make returned video duration in getBranding.ts consistent
2024-05-05 01:28:22 -04:00
mini-bomba
7c1abd9747 Make returned video duration in getBranding.ts consistent
Instead of picking the first segment returned by the db (i.e. possibly
random), sort segments by submission time and use the oldest visible
segment with a non-zero video duration.
2024-05-04 21:56:03 +02:00
Ajay
709485e0e9 Increase frequency of docker forgets 2024-04-27 00:42:55 -04:00
Ajay
f841d8173b Fix ttl cache key not properly cleared 2024-04-22 00:53:09 -04:00
Ajay
b2f7e1b39b Fix locked check for thumbnail downvotes 2024-04-21 23:13:10 -04:00
Ajay
47ea6ae8d3 Only check request time for readiness if cache has filled up 2024-04-21 13:38:32 -04:00
Ajay
063607fe30 Add etags for branding as well 2024-04-20 13:16:34 -04:00
Ajay
4b795da5a0 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-04-20 00:02:16 -04:00
Ajay
8043bd9006 Make max response time configurable 2024-04-20 00:02:15 -04:00
Ajay Ramachandran
bd8f4b7539 Merge pull request #577 from SuperStormer/master
cosmetic fix for lock reason
2024-04-19 21:22:57 -04:00
Ajay
0f97ce4a49 Make redis readiness check recoverable 2024-04-19 21:20:40 -04:00
Ajay
cfd7c3d8c4 Add more to ready check 2024-04-19 20:24:42 -04:00
Ajay
af7d8428ab Improve ready check 2024-04-19 20:05:52 -04:00
Ajay
7c51586664 Add error server 2024-04-16 03:01:44 -04:00
Ajay
2251ddc251 Add ready endpoint 2024-04-16 01:13:56 -04:00
Ajay
07d4dde4f6 Add connections to status 2024-04-16 00:13:51 -04:00
Ajay
b934b7a937 Use innertube when possible 2024-04-14 01:26:03 -04:00
Ajay
f2cf2e2aac Add db stats to logs 2024-04-13 03:00:26 -04:00
Ajay
2887a8505c Improve logging and fix ip fetch error breaking skip segments 2024-04-13 01:54:59 -04:00
Ajay
e289fe9075 Add ttl cache 2024-04-12 01:29:23 -04:00
Ajay
2cd9401a51 Fix etag tests 2024-04-11 18:12:02 -04:00
Ajay
47bea9ee6e Trigger usage of cache key when checking ttl 2024-04-11 17:57:53 -04:00
Ajay
0602fdd651 Use cache for ttl if possible
Also fixes etag when compression enabled
2024-04-11 17:54:32 -04:00
Ajay
7c77bf566e Remove quotes when processing etag 2024-04-11 17:07:13 -04:00
Ajay
1009fff9e9 Fix caching issues with one specific key form
.c regex was any character plus a c instead of intenced dot
2024-04-11 17:04:17 -04:00
Ajay
f43e59250f Add quotes to etag 2024-04-11 14:11:04 -04:00
Ajay
dc2115ef20 Change status timeout 2024-04-09 13:29:18 -04:00
dependabot[bot]
55c3e4f01f Bump express from 4.18.2 to 4.19.2
Bumps [express](https://github.com/expressjs/express) from 4.18.2 to 4.19.2.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/master/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.18.2...4.19.2)

---
updated-dependencies:
- dependency-name: express
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-28 17:17:43 +00:00
Ajay
af31f511a5 Add tests for fetch all 2024-03-24 13:52:33 -04:00
Ajay
0d9cce0512 Fix wrong comparison with votes filtering 2024-03-24 13:42:39 -04:00
Ajay
c19d6fe97a Only send low voted segments when asked for 2024-03-22 18:37:39 -04:00
Ajay
47c109f012 Fix act as vip unlocking segments 2024-03-21 19:35:13 -04:00
Ajay
a921085da6 Fix vip downvotes unlocking 2024-03-21 19:28:05 -04:00
Ajay
d5ebd8ec1a Improve self downvoting for dearrow 2024-03-20 13:47:23 -04:00
Ajay
a7f10f7727 Attempt to fix docker build error 2024-03-17 13:40:53 -04:00
Ajay
1c234846db Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-03-17 13:20:25 -04:00
Ajay
bc1ca098e7 Fix bug causing people to accidentally super downvote 2024-03-17 13:20:23 -04:00
Ajay Ramachandran
cf21ebc2de Merge pull request #578 from ajayyy/dependabot/npm_and_yarn/follow-redirects-1.15.6
Bump follow-redirects from 1.15.4 to 1.15.6
2024-03-16 21:07:40 -04:00
dependabot[bot]
2426a6ee03 Bump follow-redirects from 1.15.4 to 1.15.6
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.15.4 to 1.15.6.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.15.4...v1.15.6)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-16 23:28:16 +00:00
SuperStormer
ba65c28459 Update postSkipSegments.ts 2024-03-15 02:20:24 -04:00
Ajay
591b342855 Add default user count, update url 2024-03-06 00:47:51 -05:00
Ajay Ramachandran
8d8388386e Merge pull request #571 from ajayyy/dependabot/npm_and_yarn/follow-redirects-1.15.4
Bump follow-redirects from 1.15.1 to 1.15.4
2024-02-27 03:49:44 -05:00
Ajay
a54bf556ed Revert "Fix usercounter behind cloudflare"
This reverts commit 9bcceb7e5b.
2024-02-27 03:49:03 -05:00
Ajay
f1c5b8a359 Merge branches 'master' and 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-02-27 03:33:44 -05:00
Ajay
9bcceb7e5b Fix usercounter behind cloudflare 2024-02-27 03:33:38 -05:00
Ajay Ramachandran
da0cf0dedc Merge pull request #575 from ajayyy/dependabot/npm_and_yarn/axios-1.6.0
Bump axios from 1.1.3 to 1.6.0
2024-02-20 17:16:53 -05:00
dependabot[bot]
1cefdf4dac Bump axios from 1.1.3 to 1.6.0
Bumps [axios](https://github.com/axios/axios) from 1.1.3 to 1.6.0.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.1.3...v1.6.0)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-20 21:00:34 +00:00
Ajay
aec2aa4457 Fix keys not properly clearing 2024-02-16 22:14:09 -05:00
Ajay
3f29e11449 Fix submission and vote locks 2024-02-16 14:24:28 -05:00
Ajay
6d11e1c601 Support dragonfly with in memory cache 2024-02-09 18:16:28 -05:00
Ajay
9fa248037a Add to cache when calling set 2024-02-09 17:28:59 -05:00
Ajay
02a640d857 Use broadcast mode for redis 2024-02-09 15:34:36 -05:00
Ajay
17b002649e Add logging when too many active connections 2024-02-09 14:54:14 -05:00
Ajay
a74189b287 Fix cache invalidation with compression enabled 2024-02-09 14:19:56 -05:00
Ajay
09997d82ed Fix chrome extension user fetcher 2024-02-09 13:47:59 -05:00
Ajay
bf644d6899 Don't use broadcast mode for redis 2024-02-09 12:09:03 -05:00
Ajay
5929460239 Remove weighted randomness and change weight calculation 2024-02-09 12:08:52 -05:00
Ajay
09dd10ad6f Fix memory cache invalidation not invalidating every item 2024-02-09 00:34:12 -05:00
Ajay
af5e8cd68d Fix uncached misses tracking 2024-02-08 22:15:28 -05:00
Ajay
bd766ab430 Remove unused import 2024-02-08 22:12:53 -05:00
Ajay
bf1fe1ff61 Allow toggling redis compression and disable by default 2024-02-08 21:58:10 -05:00
Ajay
db225f8a84 Reuse running redis connections and handle redis race condition 2024-02-08 21:30:27 -05:00
Ajay
9364a7e654 Show general last invalidation message 2024-02-08 21:15:28 -05:00
Ajay
f3fffa56c9 Don't allow downvoting locked segments 2024-02-08 15:47:25 -05:00
Ajay
c478546128 Count invalidation only on successful delete 2024-02-08 15:12:48 -05:00
Ajay
e61f964d17 Add ttl to in memory cache cache 2024-02-08 14:37:01 -05:00
Ajay
5f8ef25d88 Use broadcast mode for client tracking and add new memory cache stat 2024-02-08 14:30:32 -05:00
Ajay
b76cfdf798 Allow more things to be cached 2024-02-08 03:40:41 -05:00
Ajay
3c6000f2da Rename config for clientCacheSize 2024-02-08 03:26:06 -05:00
Ajay
9944d70f6b Use size for lru limit instead of length 2024-02-08 03:23:55 -05:00
Ajay
27069cb5c2 Change what gets saved in memory cache 2024-02-08 03:08:02 -05:00
Ajay
8aa03c81a7 Improve cache miss calculation 2024-02-08 03:06:30 -05:00
Ajay
e8879f66b1 Add redis in memory cache stats 2024-02-08 02:58:51 -05:00
Ajay
acdbd3787b More specific on what should be client cached 2024-02-08 01:04:48 -05:00
Ajay
1f7156eb29 Don't crash if redis message invalid 2024-02-08 00:34:37 -05:00
Ajay
7405053b44 Reuse running reputation requests 2024-02-07 23:40:59 -05:00
Ajay
a929f69452 Fix same ip being fetched multiple times from postgres 2024-02-07 23:36:45 -05:00
Ajay
8574ec3a0c Fix is number check 2024-02-07 22:28:28 -05:00
Ajay
1475c91327 Clear cache again after setting up client tracking 2024-02-06 15:32:40 -05:00
Ajay
5b1b362bf0 Handle reconnects with client-side caching
Also upgrades redis to fix a library bug
2024-02-06 00:52:42 -05:00
Ajay
14da10bd8a Add client-side caching 2024-02-05 13:11:44 -05:00
Ajay
547632341a Add back redis compression optionally 2024-02-04 23:17:28 -05:00
Ajay
c54c25c73b Disable query cache for segment groups 2024-02-04 22:53:12 -05:00
Ajay
121cc7f481 Fix duplicate behavior with submitting full video labels 2024-01-31 13:05:47 -05:00
Ajay
e041b9c930 Don't throw 409 if only one segment was successfully submitted 2024-01-31 12:59:01 -05:00
Ajay
59d9ed390f Fix titles and thumbnails being unlocked 2024-01-28 22:05:04 -05:00
Ajay
4477ab7ca6 Remove bad test 2024-01-21 19:55:16 -05:00
Ajay
25ec9b0291 Revert adding redis compression
This reverts commits fce311377f and 2ad51842cc
2024-01-21 19:49:36 -05:00
Ajay
c3e00ac8b1 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-01-21 19:46:18 -05:00
Ajay
2c9079f565 No more verification through sb submissions 2024-01-21 19:46:16 -05:00
Ajay Ramachandran
aee84a4b6e Merge pull request #572 from SashaXser/master
Promise.resolve and Using "forEach" instead of "map"
2024-01-20 00:08:55 -05:00
SashaXser
a8010b553d Merge branch 'master' into master 2024-01-20 07:07:03 +04:00
SashaXser
5b95aa8aba Resolve conflicts 2024-01-20 06:59:12 +04:00
Ajay
fce311377f Switch to lz4 compression 2024-01-19 15:16:50 -05:00
Ajay
dcb479f3d2 Fallback to allowing taking a lock if redis fails 2024-01-19 14:35:32 -05:00
Ajay
2ad51842cc Compress redis values 2024-01-19 14:34:18 -05:00
SashaXser
ea60947092 format fix 2024-01-19 14:31:03 +04:00
SashaXser
14b6f84f94 2 things
Consider using "forEach" instead of "map" as its return value is not being used here.
Replace this trivial promise with "Promise.resolve".
2024-01-19 08:50:45 +04:00
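The two lint suggestions in this commit can be sketched as follows (illustrative code only, not the repository's actual functions):

```typescript
// Before: "map" used purely for side effects, its return value discarded.
function logAllBefore(items: string[]): void {
    items.map((item) => console.log(item));
}

// After: "forEach" states the intent directly and allocates no result array.
function logAllAfter(items: string[]): void {
    items.forEach((item) => console.log(item));
}

// Before: a trivial promise built with the constructor.
const wrapped: Promise<number> = new Promise((resolve) => resolve(42));

// After: "Promise.resolve" expresses the same value without the ceremony.
const resolved: Promise<number> = Promise.resolve(42);
```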
Ajay
8e13ec60d6 Fix other get missing throw 2024-01-18 11:57:50 -05:00
Ajay
c9f7275942 Only use redis timeout when db not under load 2024-01-18 09:22:00 -05:00
Ajay
d607d8b179 Don't fallback to db when too many redis connections 2024-01-15 14:07:34 -05:00
dependabot[bot]
5974b51391 Bump follow-redirects from 1.15.1 to 1.15.4
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.15.1 to 1.15.4.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.15.1...v1.15.4)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-10 02:08:26 +00:00
Ajay
7aaf000d99 Fix index for hashed video id dearrow fetching 2024-01-09 15:31:56 -05:00
Ajay
0edf0b9e1c Don't handle shadowhide on high load 2024-01-03 11:37:58 -05:00
Ajay
84fd7c170f Add test for VIP downvote without removing 2024-01-03 01:18:57 -05:00
Ajay
b04e0dcd97 DeArrow downvotes 2024-01-03 01:13:35 -05:00
Ajay
33dad0a5e4 Add option to submit without locking
Also fixes voting for an existing thumbnail not unlocking other thumbnails
2024-01-02 19:12:55 -05:00
Ajay
ad439fd368 Make sure latest dump is not deleted 2023-12-28 19:10:12 -05:00
Ajay
21bb893a47 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2023-12-28 18:23:45 -05:00
Ajay
211ecf700b Reject on dump failure to trigger a retry 2023-12-28 18:23:41 -05:00
Ajay Ramachandran
951d678640 Merge pull request #569 from mchangrh/fix-shadowban
clean up shadowban code, exclude long running categories query
2023-12-21 20:36:28 -05:00
Michael C
15f19df8a4 clean up shadowban code, exclude long running categories query when possible 2023-12-21 18:37:24 -05:00
Ajay Ramachandran
4a4d5776a1 Merge pull request #568 from ajayyy/revert-566-dependabot/npm_and_yarn/axios-1.6.0
Revert "Bump axios from 1.1.3 to 1.6.0"
2023-12-06 00:17:05 -05:00
119 changed files with 6730 additions and 2712 deletions


@@ -31,15 +31,15 @@ module.exports = {
},
overrides: [
{
files: ["src/**/*.ts"],
files: ["**/*.ts"],
parserOptions: {
project: ["./tsconfig.json"],
project: ["./tsconfig.eslint.json"],
},
rules: {
"@typescript-eslint/no-misused-promises": "warn",
"@typescript-eslint/no-floating-promises" : "warn"
"@typescript-eslint/no-misused-promises": "error",
"@typescript-eslint/no-floating-promises" : "error"
}
},
],
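Promoting `@typescript-eslint/no-floating-promises` from `warn` to `error` means code like the following sketch (illustrative, not from the repository) now fails lint:

```typescript
async function save(): Promise<void> {
    // pretend this writes to the database
}

// The promise from save() is neither awaited nor handled, so a
// rejection would be silently dropped; the rule now rejects this.
function flagged(): void {
    save(); // lint error: floating promise
}

// Compliant alternatives: await it, handle it, or discard it explicitly.
async function compliant(): Promise<void> {
    await save();
    save().catch(console.error);
    void save();
}
```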


@@ -22,10 +22,10 @@ jobs:
permissions:
packages: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Docker meta
id: meta
uses: docker/metadata-action@v4
uses: docker/metadata-action@v5
with:
images: |
ghcr.io/${{ inputs.username }}/${{ inputs.name }}
@@ -34,19 +34,19 @@ jobs:
flavor: |
latest=true
- name: Login to GHCR
uses: docker/login-action@v2
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GH_TOKEN }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
uses: docker/setup-qemu-action@v3
with:
platforms: arm,arm64
- name: Set up buildx
uses: docker/setup-buildx-action@v1
uses: docker/setup-buildx-action@v3
- name: push
uses: docker/build-push-action@v3
uses: docker/build-push-action@v6
with:
context: ${{ inputs.folder }}
platforms: linux/amd64,linux/arm64

.github/workflows/error-server.yml vendored Normal file

@@ -0,0 +1,16 @@
name: Docker image builds
on:
push:
branches:
- master
workflow_dispatch:
jobs:
error-server:
uses: ./.github/workflows/docker-build.yml
with:
name: "error-server"
username: "ajayyy"
folder: "./containers/error-server"
secrets:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -14,8 +14,8 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
@@ -26,7 +26,7 @@ jobs:
- name: Run Server
timeout-minutes: 10
run: npm start
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: SponsorTimesDB.db
path: databases/sponsorTimes.db


@@ -12,8 +12,8 @@ jobs:
name: Lint with ESLint and build
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
@@ -21,7 +21,7 @@ jobs:
- run: npm run lint
- run: npm run tsc
- name: cache dist build
uses: actions/cache/save@v3
uses: actions/cache/save@v4
with:
key: dist-${{ github.sha }}
path: |
@@ -32,13 +32,13 @@ jobs:
runs-on: ubuntu-latest
needs: lint-build
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
- id: cache
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: dist-${{ github.sha }}
path: |
@@ -46,11 +46,14 @@ jobs:
${{ github.workspace }}/node_modules
- if: steps.cache.outputs.cache-hit != 'true'
run: npm ci
env:
youTubeKeys_visitorData: ${{ secrets.YOUTUBEKEYS_VISITORDATA }}
youTubeKeys_poToken: ${{ secrets.YOUTUBEKEYS_POTOKEN }}
- name: Run SQLite Tests
timeout-minutes: 5
run: npx nyc --silent npm test
- name: cache nyc output
uses: actions/cache/save@v3
uses: actions/cache/save@v4
with:
key: nyc-sqlite-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
@@ -59,20 +62,20 @@ jobs:
runs-on: ubuntu-latest
needs: lint-build
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Build the docker-compose stack
env:
PG_USER: ci_db_user
PG_PASS: ci_db_pass
run: docker-compose -f docker/docker-compose-ci.yml up -d
run: docker compose -f docker/docker-compose-ci.yml up -d
- name: Check running containers
run: docker ps
- uses: actions/setup-node@v3
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
- id: cache
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: dist-${{ github.sha }}
path: |
@@ -83,10 +86,12 @@ jobs:
- name: Run Postgres Tests
env:
TEST_POSTGRES: true
youTubeKeys_visitorData: ${{ secrets.YOUTUBEKEYS_VISITORDATA }}
youTubeKeys_poToken: ${{ secrets.YOUTUBEKEYS_POTOKEN }}
timeout-minutes: 5
run: npx nyc --silent npm test
- name: cache nyc output
uses: actions/cache/save@v3
uses: actions/cache/save@v4
with:
key: nyc-postgres-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
@@ -95,22 +100,22 @@ jobs:
name: Run Codecov
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
- run: npm ci
- name: restore postgres nyc output
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: nyc-postgres-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
- name: restore sqlite nyc output
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: nyc-sqlite-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
- run: npx nyc report --reporter=lcov
- name: Upload coverage reports to Codecov
uses: codecov/codecov-action@v3
uses: codecov/codecov-action@v4

.gitignore vendored

@@ -48,4 +48,6 @@ working
# nyc coverage output
.nyc_output/
coverage/
coverage/
.vscode


@@ -1,20 +1,29 @@
# SponsorTimesDB
[vipUsers](#vipUsers)
[sponsorTimes](#sponsorTimes)
[userNames](#userNames)
[categoryVotes](#categoryVotes)
[lockCategories](#lockCategories)
[warnings](#warnings)
[shadowBannedUsers](#shadowBannedUsers)
[unlistedVideos](#unlistedVideos)
[config](#config)
[archivedSponsorTimes](#archivedSponsorTimes)
- [vipUsers](#vipusers)
- [sponsorTimes](#sponsortimes)
- [userNames](#usernames)
- [categoryVotes](#categoryvotes)
- [lockCategories](#lockcategories)
- [warnings](#warnings)
- [shadowBannedUsers](#shadowbannedusers)
- [videoInfo](#videoinfo)
- [unlistedVideos](#unlistedvideos)
- [config](#config)
- [archivedSponsorTimes](#archivedsponsortimes)
- [ratings](#ratings)
- [userFeatures](#userFeatures)
- [shadowBannedIPs](#shadowBannedIPs)
- [titles](#titles)
- [titleVotes](#titleVotes)
- [thumbnails](#thumbnails)
- [thumbnailTimestamps](#thumbnailTimestamps)
- [thumbnailVotes](#thumbnailVotes)
### vipUsers
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| userID | TEXT | not null, primary key |
| index | field |
| -- | :--: |
@@ -30,7 +39,7 @@
| votes | INTEGER | not null |
| locked | INTEGER | not null, default '0' |
| incorrectVotes | INTEGER | not null, default 1 |
| UUID | TEXT | not null, unique |
| UUID | TEXT | not null, unique, primary key |
| userID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| views | INTEGER | not null |
@@ -50,14 +59,16 @@
| sponsorTime_timeSubmitted | timeSubmitted |
| sponsorTime_userID | userID |
| sponsorTimes_UUID | UUID |
| sponsorTimes_hashedVideoID | hashedVideoID, category |
| sponsorTimes_videoID | videoID, service, category, timeSubmitted |
| sponsorTimes_hashedVideoID | service, hashedVideoID, startTime |
| sponsorTimes_videoID | service, videoID, startTime |
| sponsorTimes_videoID_category | videoID, category |
| sponsorTimes_description_gin | description, category |
### userNames
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| userID | TEXT | not null, primary key |
| userName | TEXT | not null |
| locked | INTEGER | not null, default '0' |
@@ -72,6 +83,7 @@
| UUID | TEXT | not null |
| category | TEXT | not null |
| votes | INTEGER | not null, default 0 |
| id | SERIAL | primary key
| index | field |
| -- | :--: |
@@ -88,6 +100,7 @@
| hashedVideoID | TEXT | not null, default '' |
| reason | TEXT | not null, default '' |
| service | TEXT | not null, default 'YouTube' |
| id | SERIAL | primary key
| index | field |
| -- | :--: |
@@ -102,17 +115,22 @@
| issuerUserID | TEXT | not null |
| enabled | INTEGER | not null |
| reason | TEXT | not null, default '' |
| type | INTEGER | default 0 |
| constraint | field |
| -- | :--: |
| PRIMARY KEY | userID, issueTime |
| index | field |
| -- | :--: |
| warnings_index | userID |
| warnings_index | userID, issueTime, enabled |
| warnings_issueTime | issueTime |
### shadowBannedUsers
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| userID | TEXT | not null, primary key |
| index | field |
| -- | :--: |
@@ -129,8 +147,8 @@
| index | field |
| -- | :--: |
| videoInfo_videoID | timeSubmitted |
| videoInfo_channelID | userID |
| videoInfo_videoID | videoID |
| videoInfo_channelID | channelID |
### unlistedVideos
@@ -142,12 +160,13 @@
| channelID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| service | TEXT | not null, default 'YouTube' |
| id | SERIAL | primary key
### config
| Name | Type | |
| -- | :--: | -- |
| key | TEXT | not null, unique |
| key | TEXT | not null, unique, primary key |
| value | TEXT | not null |
### archivedSponsorTimes
@@ -160,7 +179,7 @@
| votes | INTEGER | not null |
| locked | INTEGER | not null, default '0' |
| incorrectVotes | INTEGER | not null, default 1 |
| UUID | TEXT | not null, unique |
| UUID | TEXT | not null, unique, primary key |
| userID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| views | INTEGER | not null |
@@ -173,6 +192,7 @@
| shadowHidden | INTEGER | not null |
| hashedVideoID | TEXT | not null, default '', sha256 |
| userAgent | TEXT | not null, default '' |
| description | TEXT | not null, default '' |
### ratings
@@ -183,6 +203,7 @@
| type | INTEGER | not null |
| count | INTEGER | not null |
| hashedVideoID | TEXT | not null |
| id | SERIAL | primary key
| index | field |
| -- | :--: |
@@ -190,15 +211,125 @@
| ratings_hashedVideoID | hashedVideoID, service |
| ratings_videoID | videoID, service |
### userFeatures
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| feature | INTEGER | not null |
| issuerUserID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| constraint | field |
| -- | :--: |
| primary key | userID, feature |
| index | field |
| -- | :--: |
| userFeatures_userID | userID, feature |
### shadowBannedIPs
| Name | Type | |
| -- | :--: | -- |
| hashedIP | TEXT | not null, primary key |
### titles
| Name | Type | |
| -- | :--: | -- |
| videoID | TEXT | not null |
| title | TEXT | not null |
| original | INTEGER | default 0 |
| userID | TEXT | not null
| service | TEXT | not null |
| hashedVideoID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| UUID | TEXT | not null, primary key
| index | field |
| -- | :--: |
| titles_timeSubmitted | timeSubmitted |
| titles_userID_timeSubmitted | videoID, service, userID, timeSubmitted |
| titles_videoID | videoID, service |
| titles_hashedVideoID_2 | service, hashedVideoID, timeSubmitted |
### titleVotes
| Name | Type | |
| -- | :--: | -- |
| UUID | TEXT | not null, primary key |
| votes | INTEGER | not null, default 0 |
| locked | INTEGER | not null, default 0 |
| shadowHidden | INTEGER | not null, default 0 |
| verification | INTEGER | default 0 |
| downvotes | INTEGER | default 0 |
| removed | INTEGER | default 0 |
| constraint | field |
| -- | :--: |
| foreign key | UUID references "titles"("UUID")
| index | field |
| -- | :--: |
| titleVotes_votes | UUID, votes
### thumbnails
| Name | Type | |
| -- | :--: | -- |
| original | INTEGER | default 0 |
| userID | TEXT | not null |
| service | TEXT | not null |
| hashedVideoID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| UUID | TEXT | not null, primary key |
| index | field |
| -- | :--: |
| thumbnails_timeSubmitted | timeSubmitted |
| thumbnails_votes_timeSubmitted | videoID, service, userID, timeSubmitted |
| thumbnails_videoID | videoID, service |
| thumbnails_hashedVideoID_2 | service, hashedVideoID, timeSubmitted |
### thumbnailTimestamps
| index | field |
| -- | :--: |
| UUID | TEXT | not null, primary key
| timestamp | INTEGER | not null, default 0
| constraint | field |
| -- | :--: |
| foreign key | UUID references "thumbnails"("UUID")
### thumbnailVotes
| Name | Type | |
| -- | :--: | -- |
| UUID | TEXT | not null, primary key |
| votes | INTEGER | not null, default 0 |
| locked | INTEGER |not null, default 0 |
| shadowHidden | INTEGER | not null, default 0 |
| downvotes | INTEGER | default 0 |
| removed | INTEGER | default 0 |
| constraint | field |
| -- | :--: |
| foreign key | UUID references "thumbnails"("UUID")
| index | field |
| -- | :--: |
| thumbnailVotes_votes | UUID, votes
# Private
[votes](#votes)
[categoryVotes](#categoryVotes)
[sponsorTimes](#sponsorTimes)
[config](#config)
[ratings](#ratings)
[tempVipLog](#tempVipLog)
[userNameLogs](#userNameLogs)
- [votes](#votes)
- [categoryVotes](#categoryVotes)
- [sponsorTimes](#sponsorTimes)
- [config](#config)
- [ratings](#ratings)
- [tempVipLog](#tempVipLog)
- [userNameLogs](#userNameLogs)
### votes
@@ -209,6 +340,7 @@
| hashedIP | TEXT | not null |
| type | INTEGER | not null |
| originalVoteType | INTEGER | not null | # Since type was reused to also specify the number of votes removed when less than 0, this is being used for the actual type
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
@@ -223,10 +355,11 @@
| hashedIP | TEXT | not null |
| category | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
| categoryVotes_UUID | UUID, userID, hasedIP, category |
| categoryVotes_UUID | UUID, userID, hashedIP, category |
### sponsorTimes
@@ -236,17 +369,17 @@
| hashedIP | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| service | TEXT | not null, default 'YouTube' |
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
| sponsorTimes_hashedIP | hashedIP |
| privateDB_sponsorTimes_videoID_v2 | videoID, service |
| privateDB_sponsorTimes_v4 | videoID, service, timeSubmitted |
### config
| Name | Type | |
| -- | :--: | -- |
| key | TEXT | not null |
| key | TEXT | not null, primary key |
| value | TEXT | not null |
### ratings
@@ -259,6 +392,7 @@
| type | INTEGER | not null |
| timeSubmitted | INTEGER | not null |
| hashedIP | TEXT | not null |
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
@@ -271,6 +405,7 @@
| targetUserID | TEXT | not null |
| enabled | BOOLEAN | not null |
| updatedAt | INTEGER | not null |
| id | SERIAL | primary key |
### userNameLogs
@@ -281,3 +416,4 @@
| oldUserName | TEXT | not null |
| updatedByAdmin | BOOLEAN | not null |
| updatedAt | INTEGER | not null |
| id | SERIAL | primary key |


@@ -9,7 +9,7 @@ WORKDIR /usr/src/app
RUN apk add --no-cache git postgresql-client
COPY --from=builder ./node_modules ./node_modules
COPY --from=builder ./dist ./dist
COPY ./.git ./.git
COPY ./.git/ ./.git
COPY entrypoint.sh .
COPY databases/*.sql databases/
EXPOSE 8080


@@ -56,7 +56,6 @@
]
}
],
"hoursAfterWarningExpires": 24,
"rateLimit": {
"vote": {
"windowMs": 900000,


@@ -25,8 +25,6 @@
"webhooks": [],
"categoryList": ["sponsor", "intro", "outro", "interaction", "selfpromo", "preview", "music_offtopic", "poi_highlight"], // List of supported categories any other category will be rejected
"getTopUsersCacheTimeMinutes": 5, // cacheTime for getTopUsers result in minutes
"maxNumberOfActiveWarnings": 3, // Users with this number of warnings will be blocked until warnings expire
"hoursAfterWarningExpire": 24,
"rateLimit": {
"vote": {
"windowMs": 900000, // 15 minutes


@@ -8,6 +8,6 @@ RUN apk add --no-cache postgresql-client restic
COPY --from=builder --chmod=755 /scripts /usr/src/app/
RUN echo '30 * * * * /usr/src/app/backup.sh' >> /etc/crontabs/root
RUN echo '10 0 * * 1 /usr/src/app/forget.sh' >> /etc/crontabs/root
RUN echo '10 0 * * */2 /usr/src/app/forget.sh' >> /etc/crontabs/root
CMD crond -l 2 -f


@@ -1 +1 @@
restic forget --prune --keep-last 48 --keep-daily 7 --keep-weekly 8
restic forget --prune --keep-hourly 24 --keep-daily 7 --keep-weekly 8


@@ -0,0 +1,4 @@
FROM nginx as app
EXPOSE 80
COPY nginx.conf /etc/nginx/nginx.conf
COPY default.conf /etc/nginx/conf.d/default.conf


@@ -0,0 +1,9 @@
server {
listen 80;
listen [::]:80;
server_name localhost;
location / {
return 503;
}
}


@@ -0,0 +1,19 @@
user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log notice;
pid /var/run/nginx.pid;
events {
worker_connections 4096;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
access_log off;
error_log /dev/null crit;
include /etc/nginx/conf.d/*.conf;
}


@@ -1,6 +1,6 @@
FROM ghcr.io/ajayyy/sb-server:latest
EXPOSE 873/tcp
RUN apk add rsync>3.2.4-r0
RUN apk add rsync>3.4.1-r0
RUN mkdir /usr/src/app/database-export
CMD rsync --no-detach --daemon & ./entrypoint.sh
CMD rsync --no-detach --daemon & ./entrypoint.sh


@@ -44,4 +44,15 @@ CREATE TABLE IF NOT EXISTS "thumbnailVotes" (
"type" INTEGER NOT NULL
);
CREATE TABLE IF NOT EXISTS "casualVotes" (
"UUID" SERIAL PRIMARY KEY,
"videoID" TEXT NOT NULL,
"service" TEXT NOT NULL,
"userID" TEXT NOT NULL,
"hashedIP" TEXT NOT NULL,
"category" TEXT NOT NULL,
"type" INTEGER NOT NULL,
"timeSubmitted" INTEGER NOT NULL
);
COMMIT;


@@ -23,4 +23,16 @@ CREATE INDEX IF NOT EXISTS "categoryVotes_UUID"
CREATE INDEX IF NOT EXISTS "ratings_videoID"
ON public."ratings" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, service COLLATE pg_catalog."default" ASC NULLS LAST, "userID" COLLATE pg_catalog."default" ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
-- casualVotes
CREATE INDEX IF NOT EXISTS "casualVotes_videoID"
ON public."casualVotes" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST, "userID" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_userID"
ON public."casualVotes" USING btree
("userID" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;


@@ -84,6 +84,26 @@ CREATE TABLE IF NOT EXISTS "thumbnailVotes" (
FOREIGN KEY("UUID") REFERENCES "thumbnails"("UUID")
);
CREATE TABLE IF NOT EXISTS "casualVotes" (
"UUID" TEXT PRIMARY KEY,
"videoID" TEXT NOT NULL,
"service" TEXT NOT NULL,
"hashedVideoID" TEXT NOT NULL,
"category" TEXT NOT NULL,
"upvotes" INTEGER NOT NULL default 0,
"downvotes" INTEGER NOT NULL default 0,
"timeSubmitted" INTEGER NOT NULL
);
CREATE TABLE IF NOT EXISTS "casualVoteTitles" (
"videoID" TEXT NOT NULL,
"service" TEXT NOT NULL,
"id" INTEGER NOT NULL,
"hashedVideoID" TEXT NOT NULL,
"title" TEXT NOT NULL,
PRIMARY KEY("videoID", "service", "id")
);
CREATE EXTENSION IF NOT EXISTS pgcrypto; --!sqlite-ignore
CREATE EXTENSION IF NOT EXISTS pg_trgm; --!sqlite-ignore


@@ -134,9 +134,9 @@ CREATE INDEX IF NOT EXISTS "titles_videoID"
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "titles_hashedVideoID"
CREATE INDEX IF NOT EXISTS "titles_hashedVideoID_2"
ON public."titles" USING btree
("hashedVideoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
(service COLLATE pg_catalog."default" ASC NULLS LAST, "hashedVideoID" text_pattern_ops ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
-- titleVotes
@@ -163,9 +163,9 @@ CREATE INDEX IF NOT EXISTS "thumbnails_videoID"
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "thumbnails_hashedVideoID"
CREATE INDEX IF NOT EXISTS "thumbnails_hashedVideoID_2"
ON public."thumbnails" USING btree
("hashedVideoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
(service COLLATE pg_catalog."default" ASC NULLS LAST, "hashedVideoID" text_pattern_ops ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
-- thumbnailVotes
@@ -173,4 +173,26 @@ CREATE INDEX IF NOT EXISTS "thumbnails_hashedVideoID"
CREATE INDEX IF NOT EXISTS "thumbnailVotes_votes"
ON public."thumbnailVotes" USING btree
("UUID" COLLATE pg_catalog."default" ASC NULLS LAST, "votes" DESC NULLS LAST)
TABLESPACE pg_default;
-- casualVotes
CREATE INDEX IF NOT EXISTS "casualVotes_timeSubmitted"
ON public."casualVotes" USING btree
("timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_userID_timeSubmitted"
ON public."casualVotes" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST, "timeSubmitted" DESC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_videoID"
ON public."casualVotes" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_hashedVideoID_2"
ON public."casualVotes" USING btree
(service COLLATE pg_catalog."default" ASC NULLS LAST, "hashedVideoID" text_pattern_ops ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;


@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" DROP COLUMN "type";
UPDATE "config" SET value = 12 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" ADD "titleID" INTEGER default 0;
UPDATE "config" SET value = 13 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,11 @@
BEGIN TRANSACTION;
ALTER TABLE "titleVotes" ADD "downvotes" INTEGER default 0;
ALTER TABLE "titleVotes" ADD "removed" INTEGER default 0;
ALTER TABLE "thumbnailVotes" ADD "downvotes" INTEGER default 0;
ALTER TABLE "thumbnailVotes" ADD "removed" INTEGER default 0;
UPDATE "config" SET value = 39 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,8 @@
BEGIN TRANSACTION;
DROP INDEX IF EXISTS "titles_hashedVideoID";
DROP INDEX IF EXISTS "thumbnails_hashedVideoID";
UPDATE "config" SET value = 40 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,8 @@
BEGIN TRANSACTION;
ALTER TABLE "titles" ADD "casualMode" INTEGER default 0;
ALTER TABLE "thumbnails" ADD "casualMode" INTEGER default 0;
UPDATE "config" SET value = 41 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" DROP COLUMN "downvotes";
UPDATE "config" SET value = 42 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" ADD "titleID" INTEGER default 0;
UPDATE "config" SET value = 43 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,8 @@
BEGIN TRANSACTION;
ALTER TABLE "titles" ADD "userAgent" TEXT NOT NULL default '';
ALTER TABLE "thumbnails" ADD "userAgent" TEXT NOT NULL default '';
UPDATE "config" SET value = 44 WHERE key = 'version';
COMMIT;


@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "warnings" ADD "disableTime" INTEGER NULL;
UPDATE "config" SET value = 45 WHERE key = 'version';
COMMIT;
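The migration files above all follow the same shape: a transaction that alters the schema, then bumps the stored `version` row in `config`. A minimal version-gated runner for that pattern (a hypothetical sketch, not the project's actual upgrade code) could look like:

```typescript
// Assumed minimal database interface for the sketch.
interface Db {
    get(sql: string): { value: string } | undefined;
    exec(sql: string): void;
}

// Target version -> SQL that performs the upgrade (one entry shown).
const migrations: Record<number, string> = {
    45: `BEGIN TRANSACTION;
         ALTER TABLE "warnings" ADD "disableTime" INTEGER NULL;
         UPDATE "config" SET value = 45 WHERE key = 'version';
         COMMIT;`,
};

// Apply each pending migration in order until none remain.
function upgrade(db: Db): number {
    let version = Number(db.get(`SELECT value FROM "config" WHERE key = 'version'`)?.value ?? 0);
    while (migrations[version + 1]) {
        db.exec(migrations[version + 1]);
        version++;
    }
    return version;
}
```

Each migration stays idempotent with respect to the version counter: a database already at version 45 simply runs nothing.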


@@ -1,6 +1,6 @@
BEGIN TRANSACTION;
/* Add new voting field */
/* Add 'locked' field */
CREATE TABLE "sqlb_temp_table_6" (
"videoID" TEXT NOT NULL,
"startTime" REAL NOT NULL,


@@ -1,6 +1,6 @@
BEGIN TRANSACTION;
/* Add Service field */
/* Add 'videoDuration' field */
CREATE TABLE "sqlb_temp_table_8" (
"videoID" TEXT NOT NULL,
"startTime" REAL NOT NULL,


@@ -1,6 +1,6 @@
BEGIN TRANSACTION;
/* Add Service field */
/* Change 'videoDuration' field from INTEGER to REAL */
CREATE TABLE "sqlb_temp_table_9" (
"videoID" TEXT NOT NULL,
"startTime" REAL NOT NULL,

package-lock.json generated

File diff suppressed because it is too large

@@ -19,17 +19,19 @@
"author": "Ajay Ramachandran",
"license": "AGPL-3.0-only",
"dependencies": {
"axios": "^1.1.3",
"better-sqlite3": "^8.0.1",
"axios": "^1.12.1",
"better-sqlite3": "^11.2.1",
"cron": "^2.1.0",
"express": "^4.18.2",
"express": "^4.21.2",
"express-promise-router": "^4.1.1",
"express-rate-limit": "^6.7.0",
"form-data": "^4.0.0",
"form-data": "^4.0.4",
"lodash": "^4.17.21",
"lru-cache": "^10.2.0",
"lz4-napi": "^2.2.0",
"pg": "^8.8.0",
"rate-limit-redis": "^3.0.1",
"redis": "^4.5.0",
"redis": "^4.6.13",
"seedrandom": "^3.0.5"
},
"devDependencies": {
@@ -47,8 +49,8 @@
"@typescript-eslint/parser": "^5.44.0",
"axios-mock-adapter": "^1.21.2",
"eslint": "^8.28.0",
"mocha": "^10.1.0",
"nodemon": "^2.0.20",
"mocha": "^10.8.2",
"nodemon": "^3.1.9",
"nyc": "^15.1.0",
"sinon": "^14.0.2",
"ts-mock-imports": "^1.3.8",


@@ -56,6 +56,12 @@ import { hostHeader } from "./middleware/hostHeader";
import { getBrandingStats } from "./routes/getBrandingStats";
import { getTopBrandingUsers } from "./routes/getTopBrandingUsers";
import { getFeatureFlag } from "./routes/getFeatureFlag";
import { getReady } from "./routes/getReady";
import { getMetrics } from "./routes/getMetrics";
import { getSegmentID } from "./routes/getSegmentID";
import { postCasual } from "./routes/postCasual";
import { getConfigEndpoint } from "./routes/getConfig";
import { setConfig } from "./routes/setConfig";
export function createServer(callback: () => void): Server {
// Create a service (the app object is just a callback).
@@ -81,13 +87,15 @@ export function createServer(callback: () => void): Server {
// Set production mode
app.set("env", config.mode || "production");
setupRoutes(router);
const server = app.listen(config.port, callback);
return app.listen(config.port, callback);
setupRoutes(router, server);
return server;
}
/* eslint-disable @typescript-eslint/no-misused-promises */
function setupRoutes(router: Router) {
function setupRoutes(router: Router, server: Server) {
// Rate limit endpoint lists
const voteEndpoints: RequestHandler[] = [voteOnSponsorTime];
const viewEndpoints: RequestHandler[] = [viewedVideoSponsorTime];
@@ -118,6 +126,8 @@ function setupRoutes(router: Router) {
router.get("/api/viewedVideoSponsorTime", ...viewEndpoints);
router.post("/api/viewedVideoSponsorTime", ...viewEndpoints);
router.get("/api/segmentID", getSegmentID);
// To set your username for the stats view
router.post("/api/setUsername", setUsername);
@@ -200,8 +210,11 @@ function setupRoutes(router: Router) {
router.get("/api/chapterNames", getChapterNames);
// get status
router.get("/api/status/:value", getStatus);
router.get("/api/status", getStatus);
router.get("/api/status/:value", (req, res) => getStatus(req, res, server));
router.get("/api/status", (req, res) => getStatus(req, res, server));
router.get("/metrics", (req, res) => getMetrics(req, res, server));
router.get("/api/ready", (req, res) => getReady(req, res, server));
router.get("/api/youtubeApiProxy", youtubeApiProxy);
// get user category stats
@@ -224,6 +237,11 @@ function setupRoutes(router: Router) {
router.get("/api/branding/:prefix", getBrandingByHashEndpoint);
router.post("/api/branding", postBranding);
router.get("/api/config", getConfigEndpoint);
router.post("/api/config", setConfig);
router.post("/api/casual", postCasual);
/* istanbul ignore next */
if (config.postgres?.enabled) {
router.get("/database", (req, res) => dumpDatabase(req, res, true));

View File

@@ -1,7 +1,6 @@
import fs from "fs";
import { SBSConfig } from "./types/config.model";
import packageJson from "../package.json";
import { isNumber } from "lodash";
const isTestMode = process.env.npm_lifecycle_script === packageJson.scripts.test;
const configFile = process.env.TEST_POSTGRES ? "ci.json"
@@ -20,7 +19,8 @@ addDefaults(config, {
privateDBSchema: "./databases/_private.db.sql",
readOnly: false,
webhooks: [],
categoryList: ["sponsor", "selfpromo", "exclusive_access", "interaction", "intro", "outro", "preview", "music_offtopic", "filler", "poi_highlight", "chapter"],
categoryList: ["sponsor", "selfpromo", "exclusive_access", "interaction", "intro", "outro", "preview", "hook", "music_offtopic", "filler", "poi_highlight", "chapter"],
casualCategoryList: ["funny", "creative", "clever", "descriptive", "other"],
categorySupport: {
sponsor: ["skip", "mute", "full"],
selfpromo: ["skip", "mute", "full"],
@@ -29,6 +29,7 @@ addDefaults(config, {
intro: ["skip", "mute"],
outro: ["skip", "mute"],
preview: ["skip", "mute"],
hook: ["skip", "mute"],
filler: ["skip", "mute"],
music_offtopic: ["skip"],
poi_highlight: ["poi"],
@@ -36,8 +37,6 @@ addDefaults(config, {
},
deArrowTypes: ["title", "thumbnail"],
maxTitleLength: 110,
maxNumberOfActiveWarnings: 1,
hoursAfterWarningExpires: 16300000,
adminUserID: "",
discordCompletelyIncorrectReportWebhookURL: null,
discordFirstTimeSubmissionsWebhookURL: null,
@@ -46,6 +45,9 @@ addDefaults(config, {
discordReportChannelWebhookURL: null,
discordMaliciousReportWebhookURL: null,
discordDeArrowLockedWebhookURL: null,
discordDeArrowWarnedWebhookURL: null,
discordNewUserWebhookURL: null,
discordRejectedNewUserWebhookURL: null,
minReputationToSubmitChapter: 0,
minReputationToSubmitFiller: 0,
getTopUsersCacheTimeMinutes: 240,
@@ -67,6 +69,7 @@ addDefaults(config, {
message: "OK",
}
},
requestValidatorRules: [],
userCounterURL: null,
userCounterRatio: 10,
newLeafURLs: null,
@@ -83,7 +86,8 @@ addDefaults(config, {
maxTries: 3,
maxActiveRequests: 0,
timeout: 60000,
highLoadThreshold: 10
highLoadThreshold: 10,
redisTimeoutThreshold: 1000
},
postgresReadOnly: {
enabled: false,
@@ -147,6 +151,13 @@ addDefaults(config, {
},
{
name: "thumbnailVotes"
},
{
name: "casualVotes",
order: "timeSubmitted"
},
{
name: "casualVoteTitles"
}]
},
diskCacheURL: null,
@@ -165,7 +176,11 @@ addDefaults(config, {
commandsQueueMaxLength: 3000,
stopWritingAfterResponseTime: 50,
responseTimePause: 1000,
disableHashCache: false
maxReadResponseTime: 500,
disableHashCache: false,
clientCacheSize: 2000,
useCompression: false,
dragonflyMode: false
},
redisRead: {
enabled: false,
@@ -188,7 +203,18 @@ addDefaults(config, {
},
tokenSeed: "",
minUserIDLength: 30,
deArrowPaywall: false
deArrowPaywall: false,
useCacheForSegmentGroups: false,
maxConnections: 100,
maxResponseTime: 1000,
maxResponseTimeWhileLoadingCache: 2000,
etagExpiry: 5000,
youTubeKeys: {
visitorData: null,
poToken: null,
floatieUrl: null,
floatieAuth: null
}
});
loadFromEnv(config);
migrate(config);
@@ -236,15 +262,17 @@ function loadFromEnv(config: SBSConfig, prefix = "") {
loadFromEnv(data, fullKey);
} else if (process.env[fullKey]) {
const value = process.env[fullKey];
if (isNumber(value)) {
if (value !== "" && !isNaN(value as unknown as number)) {
config[key] = parseFloat(value);
} else if (value.toLowerCase() === "true" || value.toLowerCase() === "false") {
config[key] = value === "true";
} else if (key === "newLeafURLs") {
config[key] = [value];
} else if (key === "requestValidatorRules") {
config[key] = JSON.parse(value) ?? [];
} else {
config[key] = value;
}
}
}
}
}
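The hunk above replaces lodash's `isNumber(value)` with an explicit `!isNaN` check: environment variables are always strings, so `isNumber` (which tests `typeof value === "number"`) could never return true and numeric config values were silently kept as strings. A minimal sketch of the corrected coercion logic (function name hypothetical, extracted for illustration):

```typescript
// Sketch of the env-var coercion from loadFromEnv above (standalone, simplified).
function coerceEnvValue(value: string): unknown {
    if (value !== "" && !isNaN(value as unknown as number)) {
        // numeric strings ("42", "0.5") become numbers
        return parseFloat(value);
    } else if (value.toLowerCase() === "true" || value.toLowerCase() === "false") {
        // boolean flags, case-insensitive
        return value === "true";
    }
    // everything else stays a plain string
    return value;
}

console.log(coerceEnvValue("42"));    // 42
console.log(coerceEnvValue("True")); // true
console.log(coerceEnvValue(""));     // ""
```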

View File

@@ -3,7 +3,6 @@ import { CronJob } from "cron";
import { config as serverConfig } from "../config";
import { Logger } from "../utils/logger";
import { db } from "../databases/databases";
import { DBSegment } from "../types/segments.model";
const jobConfig = serverConfig?.crons?.downvoteSegmentArchive;
@@ -14,18 +13,18 @@ export const archiveDownvoteSegment = async (dayLimit: number, voteLimit: number
Logger.info(`DownvoteSegmentArchiveJob starts at ${timeNow}`);
try {
// insert into archive sponsorTime
await db.prepare(
"run",
`INSERT INTO "archivedSponsorTimes"
SELECT *
FROM "sponsorTimes"
WHERE "votes" < ? AND (? - "timeSubmitted") > ?`,
[
voteLimit,
timeNow,
threshold
]
) as DBSegment[];
await db.prepare(
"run",
`INSERT INTO "archivedSponsorTimes"
SELECT *
FROM "sponsorTimes"
WHERE "votes" < ? AND (? - "timeSubmitted") > ?`,
[
voteLimit,
timeNow,
threshold
]
);
} catch (err) {
Logger.error("Exception when inserting segment into archivedSponsorTimes");
@@ -35,15 +34,15 @@ export const archiveDownvoteSegment = async (dayLimit: number, voteLimit: number
// remove from sponsorTime
try {
await db.prepare(
"run",
'DELETE FROM "sponsorTimes" WHERE "votes" < ? AND (? - "timeSubmitted") > ?',
[
voteLimit,
timeNow,
threshold
]
) as DBSegment[];
await db.prepare(
"run",
'DELETE FROM "sponsorTimes" WHERE "votes" < ? AND (? - "timeSubmitted") > ?',
[
voteLimit,
timeNow,
threshold
]
);
} catch (err) {
Logger.error("Exception when deleting segment from sponsorTimes");

View File

@@ -6,9 +6,14 @@ export interface QueryOption {
export interface IDatabase {
init(): Promise<void>;
prepare(type: QueryType, query: string, params?: any[], options?: QueryOption): Promise<any | any[] | void>;
prepare(type: "run", query: string, params?: any[], options?: QueryOption): Promise<void>;
prepare(type: "get", query: string, params?: any[], options?: QueryOption): Promise<any>;
prepare(type: "all", query: string, params?: any[], options?: QueryOption): Promise<any[]>;
prepare(type: QueryType, query: string, params?: any[], options?: QueryOption): Promise<any>;
highLoad(): boolean;
shouldUseRedisTimeout(): boolean;
}
export type QueryType = "get" | "all" | "run";
export type QueryType = "get" | "all" | "run";

View File

@@ -109,7 +109,7 @@ export class Postgres implements IDatabase {
}
}
async prepare(type: QueryType, query: string, params?: any[], options: QueryOption = {}): Promise<any[]> {
async prepare(type: QueryType, query: string, params?: any[], options: QueryOption = {}): Promise<any> {
// Convert query to use numbered parameters
let count = 1;
for (let char = 0; char < query.length; char++) {
@@ -283,4 +283,8 @@ export class Postgres implements IDatabase {
highLoad() {
return this.activePostgresRequests > this.config.postgres.highLoadThreshold;
}
shouldUseRedisTimeout() {
return this.activePostgresRequests < this.config.postgres.redisTimeoutThreshold;
}
}

View File

@@ -13,7 +13,7 @@ export class Sqlite implements IDatabase {
}
// eslint-disable-next-line require-await
async prepare(type: QueryType, query: string, params: any[] = []): Promise<any[]> {
async prepare(type: QueryType, query: string, params: any[] = []): Promise<any> {
// Logger.debug(`prepare (sqlite): type: ${type}, query: ${query}, params: ${params}`);
const preparedQuery = this.db.prepare(Sqlite.processQuery(query));
@@ -102,12 +102,18 @@ export class Sqlite implements IDatabase {
}
private static processUpgradeQuery(query: string): string {
return query.replace(/^.*--!sqlite-ignore/gm, "");
return query
.replace(/SERIAL PRIMARY KEY/gi, "INTEGER PRIMARY KEY AUTOINCREMENT")
.replace(/^.*--!sqlite-ignore/gm, "");
}
highLoad() {
return false;
}
shouldUseRedisTimeout() {
return false;
}
}
export interface SqliteConfig {
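The `processUpgradeQuery` change above lets Postgres-flavoured migration files run on SQLite by rewriting `SERIAL PRIMARY KEY` into SQLite's equivalent before stripping `--!sqlite-ignore` lines. A self-contained sketch of that translation:

```typescript
// Sketch of the upgrade-query translation in Sqlite.processUpgradeQuery above.
function processUpgradeQuery(query: string): string {
    return query
        // Postgres auto-increment column -> SQLite equivalent
        .replace(/SERIAL PRIMARY KEY/gi, "INTEGER PRIMARY KEY AUTOINCREMENT")
        // drop any line marked as Postgres-only
        .replace(/^.*--!sqlite-ignore/gm, "");
}

const pgQuery = 'CREATE TABLE "casualVoteTitles" ("id" SERIAL PRIMARY KEY);';
console.log(processUpgradeQuery(pgQuery));
// CREATE TABLE "casualVoteTitles" ("id" INTEGER PRIMARY KEY AUTOINCREMENT);
```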

View File

@@ -3,6 +3,6 @@ import { NextFunction, Request, Response } from "express";
export function corsMiddleware(req: Request, res: Response, next: NextFunction): void {
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Methods", "GET, POST, OPTIONS, DELETE");
res.header("Access-Control-Allow-Headers", "Content-Type, If-None-Match");
res.header("Access-Control-Allow-Headers", "Content-Type, If-None-Match, x-client-name");
next();
}

View File

@@ -1,10 +1,10 @@
import { NextFunction, Request, Response } from "express";
import { VideoID, VideoIDHash, Service } from "../types/segments.model";
import { QueryCacher } from "../utils/queryCacher";
import { skipSegmentsHashKey, skipSegmentsKey, videoLabelsHashKey, videoLabelsKey } from "../utils/redisKeys";
import { brandingHashKey, brandingKey, skipSegmentsHashKey, skipSegmentsKey, skipSegmentsLargerHashKey, videoLabelsHashKey, videoLabelsKey, videoLabelsLargerHashKey } from "../utils/redisKeys";
type hashType = "skipSegments" | "skipSegmentsHash" | "videoLabel" | "videoLabelHash";
type ETag = `${hashType};${VideoIDHash};${Service};${number}`;
type hashType = "skipSegments" | "skipSegmentsHash" | "skipSegmentsLargerHash" | "videoLabel" | "videoLabelHash" | "videoLabelsLargerHash" | "branding" | "brandingHash";
type ETag = `"${hashType};${VideoIDHash};${Service};${number}"`;
type hashKey = string | VideoID | VideoIDHash;
export function cacheMiddlware(req: Request, res: Response, next: NextFunction): void {
@@ -12,13 +12,13 @@ export function cacheMiddlware(req: Request, res: Response, next: NextFunction):
// if weak etag, do not handle
if (!reqEtag || reqEtag.startsWith("W/")) return next();
// split into components
const [hashType, hashKey, service, lastModified] = reqEtag.split(";");
const [hashType, hashKey, service, lastModified] = reqEtag.replace(/^"|"$/g, "").split(";");
// fetch last-modified
getLastModified(hashType as hashType, hashKey as VideoIDHash, service as Service)
.then(redisLastModified => {
if (redisLastModified <= new Date(Number(lastModified) + 1000)) {
// match cache, generate etag
const etag = `${hashType};${hashKey};${service};${redisLastModified.getTime()}` as ETag;
const etag = `"${hashType};${hashKey};${service};${redisLastModified.getTime()}"` as ETag;
res.status(304).set("etag", etag).send();
}
else next();
@@ -30,15 +30,19 @@ function getLastModified(hashType: hashType, hashKey: hashKey, service: Service)
let redisKey: string | null;
if (hashType === "skipSegments") redisKey = skipSegmentsKey(hashKey as VideoID, service);
else if (hashType === "skipSegmentsHash") redisKey = skipSegmentsHashKey(hashKey as VideoIDHash, service);
else if (hashType === "skipSegmentsLargerHash") redisKey = skipSegmentsLargerHashKey(hashKey as VideoIDHash, service);
else if (hashType === "videoLabel") redisKey = videoLabelsKey(hashKey as VideoID, service);
else if (hashType === "videoLabelHash") redisKey = videoLabelsHashKey(hashKey as VideoIDHash, service);
else if (hashType === "videoLabelsLargerHash") redisKey = videoLabelsLargerHashKey(hashKey as VideoIDHash, service);
else if (hashType === "branding") redisKey = brandingKey(hashKey as VideoID, service);
else if (hashType === "brandingHash") redisKey = brandingHashKey(hashKey as VideoIDHash, service);
else return Promise.reject();
return QueryCacher.getKeyLastModified(redisKey);
}
export async function getEtag(hashType: hashType, hashKey: hashKey, service: Service): Promise<ETag> {
const lastModified = await getLastModified(hashType, hashKey, service);
return `${hashType};${hashKey};${service};${lastModified.getTime()}` as ETag;
return `"${hashType};${hashKey};${service};${lastModified.getTime()}"` as ETag;
}
/* example usage
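The diff above wraps ETags in double quotes, which is the entity-tag syntax HTTP requires (RFC 7232), so the middleware now strips the quotes before splitting the tag into its components. A sketch of the build/parse round-trip implied by those changes (helper names hypothetical):

```typescript
// Sketch of the quoted-ETag format used by the cache middleware above.
type ETagParts = { hashType: string; hashKey: string; service: string; lastModified: number };

function buildEtag(p: ETagParts): string {
    // strong entity-tag: components joined with ";" inside double quotes
    return `"${p.hashType};${p.hashKey};${p.service};${p.lastModified}"`;
}

function parseEtag(reqEtag: string): ETagParts | null {
    // weak etags (W/"...") are not handled, as in the middleware
    if (!reqEtag || reqEtag.startsWith("W/")) return null;
    const [hashType, hashKey, service, lastModified] = reqEtag.replace(/^"|"$/g, "").split(";");
    if (!hashType || !hashKey || !service || !lastModified) return null;
    return { hashType, hashKey, service, lastModified: Number(lastModified) };
}

const tag = buildEtag({ hashType: "skipSegments", hashKey: "abcd", service: "YouTube", lastModified: 1700000000000 });
console.log(parseEtag(tag));
```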

View File

@@ -96,10 +96,12 @@ function removeOutdatedDumps(exportPath: string): Promise<void> {
for (const tableName in tableFiles) {
const files = tableFiles[tableName].sort((a, b) => b.timestamp - a.timestamp);
for (let i = 2; i < files.length; i++) {
// remove old file
await unlink(files[i].file).catch((error: any) => {
Logger.error(`[dumpDatabase] Garbage collection failed ${error}`);
});
if (!latestDumpFiles.some((file) => file.fileName === files[i].file.match(/[^/]+$/)[0])) {
// remove old file
await unlink(files[i].file).catch((error: any) => {
Logger.error(`[dumpDatabase] Garbage collection failed ${error}`);
});
}
}
}
resolve();
@@ -234,11 +236,11 @@ async function queueDump(): Promise<void> {
const fileName = `${table.name}_${startTime}.csv`;
const file = `${appExportPath}/${fileName}`;
await new Promise<string>((resolve) => {
await new Promise<string>((resolve, reject) => {
exec(`psql -c "\\copy (SELECT * FROM \\"${table.name}\\"${table.order ? ` ORDER BY \\"${table.order}\\"` : ``})`
+ ` TO '${file}' WITH (FORMAT CSV, HEADER true);"`, credentials, (error, stdout, stderr) => {
if (error) {
Logger.error(`[dumpDatabase] Failed to dump ${table.name} to ${file} due to ${stderr}`);
reject(`[dumpDatabase] Failed to dump ${table.name} to ${file} due to ${stderr}`);
}
resolve(error ? stderr : stdout);
@@ -253,10 +255,10 @@ async function queueDump(): Promise<void> {
latestDumpFiles = [...dumpFiles];
lastUpdate = startTime;
updateQueued = false;
} catch(e) {
Logger.error(e as string);
} finally {
updateQueued = false;
updateRunning = false;
}
}

View File

@@ -3,7 +3,7 @@ import { isEmpty } from "lodash";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
import { Postgres } from "../databases/Postgres";
import { BrandingDBSubmission, BrandingDBSubmissionData, BrandingHashDBResult, BrandingResult, BrandingSegmentDBResult, BrandingSegmentHashDBResult, ThumbnailDBResult, ThumbnailResult, TitleDBResult, TitleResult } from "../types/branding.model";
import { BrandingDBSubmission, BrandingDBSubmissionData, BrandingHashDBResult, BrandingResult, BrandingSegmentDBResult, BrandingSegmentHashDBResult, CasualVoteDBResult, CasualVoteHashDBResult, ThumbnailDBResult, ThumbnailResult, TitleDBResult, TitleResult } from "../types/branding.model";
import { HashedIP, IPAddress, Service, VideoID, VideoIDHash, Visibility } from "../types/segments.model";
import { shuffleArray } from "../utils/array";
import { getHashCache } from "../utils/getHashCache";
@@ -15,27 +15,28 @@ import { promiseOrTimeout } from "../utils/promise";
import { QueryCacher } from "../utils/queryCacher";
import { brandingHashKey, brandingIPKey, brandingKey } from "../utils/redisKeys";
import * as SeedRandom from "seedrandom";
import { getEtag } from "../middleware/etag";
enum BrandingSubmissionType {
Title = "title",
Thumbnail = "thumbnail"
}
export async function getVideoBranding(res: Response, videoID: VideoID, service: Service, ip: IPAddress, returnUserID: boolean): Promise<BrandingResult> {
export async function getVideoBranding(res: Response, videoID: VideoID, service: Service, ip: IPAddress, returnUserID: boolean, fetchAll: boolean): Promise<BrandingResult> {
const getTitles = () => db.prepare(
"all",
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID", "titleVotes"."verification", "titles"."userID"
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."downvotes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID", "titleVotes"."verification", "titles"."userID"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."videoID" = ? AND "titles"."service" = ? AND "titleVotes"."votes" > -1`,
WHERE "titles"."videoID" = ? AND "titles"."service" = ? AND "titleVotes"."votes" > -1 AND "titleVotes"."votes" - "titleVotes"."downvotes" > -2 AND "titleVotes"."removed" = 0`,
[videoID, service],
{ useReplica: true }
) as Promise<TitleDBResult[]>;
const getThumbnails = () => db.prepare(
"all",
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID", "thumbnails"."userID"
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."downvotes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID", "thumbnails"."userID"
FROM "thumbnails" LEFT JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" LEFT JOIN "thumbnailTimestamps" ON "thumbnails"."UUID" = "thumbnailTimestamps"."UUID"
WHERE "thumbnails"."videoID" = ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" > -2
WHERE "thumbnails"."videoID" = ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" - "thumbnailVotes"."downvotes" > -2 AND "thumbnailVotes"."removed" = 0
ORDER BY "thumbnails"."timeSubmitted" ASC`,
[videoID, service],
{ useReplica: true }
@@ -44,24 +45,37 @@ export async function getVideoBranding(res: Response, videoID: VideoID, service:
const getSegments = () => db.prepare(
"all",
`SELECT "startTime", "endTime", "category", "videoDuration" FROM "sponsorTimes"
WHERE "votes" > -2 AND "shadowHidden" = 0 AND "hidden" = 0 AND "actionType" = 'skip' AND "videoID" = ? AND "service" = ?`,
WHERE "votes" > -2 AND "shadowHidden" = 0 AND "hidden" = 0 AND "actionType" = 'skip' AND "videoID" = ? AND "service" = ?
ORDER BY "timeSubmitted" ASC`,
[videoID, service],
{ useReplica: true }
) as Promise<BrandingSegmentDBResult[]>;
const getCasualVotes = () => db.prepare(
"all",
`SELECT "casualVotes"."category", "casualVotes"."upvotes", "casualVoteTitles"."title"
FROM "casualVotes" LEFT JOIN "casualVoteTitles" ON "casualVotes"."videoID" = "casualVoteTitles"."videoID" AND "casualVotes"."service" = "casualVoteTitles"."service" AND "casualVotes"."titleID" = "casualVoteTitles"."id"
WHERE "casualVotes"."videoID" = ? AND "casualVotes"."service" = ?
ORDER BY "casualVotes"."timeSubmitted" ASC`,
[videoID, service],
{ useReplica: true }
) as Promise<CasualVoteDBResult[]>;
const getBranding = async () => {
const titles = getTitles();
const thumbnails = getThumbnails();
const segments = getSegments();
const casualVotes = getCasualVotes();
for (const title of await titles) {
title.title = title.title.replace("<", "");
title.title = title.title.replaceAll("<", "");
}
return {
titles: await titles,
thumbnails: await thumbnails,
segments: await segments
segments: await segments,
casualVotes: await casualVotes
};
};
@@ -83,24 +97,25 @@ export async function getVideoBranding(res: Response, videoID: VideoID, service:
currentIP: null as Promise<HashedIP> | null
};
return filterAndSortBranding(videoID, returnUserID, branding.titles, branding.thumbnails, branding.segments, ip, cache);
return filterAndSortBranding(videoID, returnUserID, fetchAll, branding.titles,
branding.thumbnails, branding.segments, branding.casualVotes, ip, cache);
}
export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, service: Service, ip: IPAddress, returnUserID: boolean): Promise<Record<VideoID, BrandingResult>> {
export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, service: Service, ip: IPAddress, returnUserID: boolean, fetchAll: boolean): Promise<Record<VideoID, BrandingResult>> {
const getTitles = () => db.prepare(
"all",
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID", "titleVotes"."verification"
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."downvotes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID", "titleVotes"."verification"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."hashedVideoID" LIKE ? AND "titles"."service" = ? AND "titleVotes"."votes" > -2`,
WHERE "titles"."hashedVideoID" LIKE ? AND "titles"."service" = ? AND "titleVotes"."votes" > -1 AND "titleVotes"."votes" - "titleVotes"."downvotes" > -2 AND "titleVotes"."removed" = 0`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
) as Promise<TitleDBResult[]>;
const getThumbnails = () => db.prepare(
"all",
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID"
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."downvotes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID"
FROM "thumbnails" LEFT JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" LEFT JOIN "thumbnailTimestamps" ON "thumbnails"."UUID" = "thumbnailTimestamps"."UUID"
WHERE "thumbnails"."hashedVideoID" LIKE ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" > -2
WHERE "thumbnails"."hashedVideoID" LIKE ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" - "thumbnailVotes"."downvotes" > -2 AND "thumbnailVotes"."removed" = 0
ORDER BY "thumbnails"."timeSubmitted" ASC`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
@@ -109,17 +124,29 @@ export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, servi
const getSegments = () => db.prepare(
"all",
`SELECT "videoID", "startTime", "endTime", "category", "videoDuration" FROM "sponsorTimes"
WHERE "votes" > -2 AND "shadowHidden" = 0 AND "hidden" = 0 AND "actionType" = 'skip' AND "hashedVideoID" LIKE ? AND "service" = ?`,
WHERE "votes" > -2 AND "shadowHidden" = 0 AND "hidden" = 0 AND "actionType" = 'skip' AND "hashedVideoID" LIKE ? AND "service" = ?
ORDER BY "timeSubmitted" ASC`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
) as Promise<BrandingSegmentHashDBResult[]>;
const getCasualVotes = () => db.prepare(
"all",
`SELECT "casualVotes"."videoID", "casualVotes"."category", "casualVotes"."upvotes", "casualVoteTitles"."title"
FROM "casualVotes" LEFT JOIN "casualVoteTitles" ON "casualVotes"."videoID" = "casualVoteTitles"."videoID" AND "casualVotes"."service" = "casualVoteTitles"."service" AND "casualVotes"."titleID" = "casualVoteTitles"."id"
WHERE "casualVotes"."hashedVideoID" LIKE ? AND "casualVotes"."service" = ?
ORDER BY "casualVotes"."timeSubmitted" ASC`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
) as Promise<CasualVoteHashDBResult[]>;
const branding = await QueryCacher.get(async () => {
// Make sure they are all started in parallel
const branding = {
titles: getTitles(),
thumbnails: getThumbnails(),
segments: getSegments()
segments: getSegments(),
casualVotes: getCasualVotes()
};
const dbResult: Record<VideoID, BrandingHashDBResult> = {};
@@ -127,26 +154,32 @@ export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, servi
dbResult[submission.videoID] = dbResult[submission.videoID] || {
titles: [],
thumbnails: [],
segments: []
segments: [],
casualVotes: []
};
};
(await branding.titles).map((title) => {
title.title = title.title.replace("<", "");
(await branding.titles).forEach((title) => {
title.title = title.title.replaceAll("<", "");
initResult(title);
dbResult[title.videoID].titles.push(title);
});
(await branding.thumbnails).map((thumbnail) => {
(await branding.thumbnails).forEach((thumbnail) => {
initResult(thumbnail);
dbResult[thumbnail.videoID].thumbnails.push(thumbnail);
});
(await branding.segments).map((segment) => {
(await branding.segments).forEach((segment) => {
initResult(segment);
dbResult[segment.videoID].segments.push(segment);
});
(await branding.casualVotes).forEach((casualVote) => {
initResult(casualVote);
dbResult[casualVote.videoID].casualVotes.push(casualVote);
});
return dbResult;
}, brandingHashKey(videoHashPrefix, service));
@@ -158,15 +191,15 @@ export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, servi
const processedResult: Record<VideoID, BrandingResult> = {};
await Promise.all(Object.keys(branding).map(async (key) => {
const castedKey = key as VideoID;
processedResult[castedKey] = await filterAndSortBranding(castedKey, returnUserID, branding[castedKey].titles,
branding[castedKey].thumbnails, branding[castedKey].segments, ip, cache);
processedResult[castedKey] = await filterAndSortBranding(castedKey, returnUserID, fetchAll, branding[castedKey].titles,
branding[castedKey].thumbnails, branding[castedKey].segments, branding[castedKey].casualVotes, ip, cache);
}));
return processedResult;
}
async function filterAndSortBranding(videoID: VideoID, returnUserID: boolean, dbTitles: TitleDBResult[],
dbThumbnails: ThumbnailDBResult[], dbSegments: BrandingSegmentDBResult[],
async function filterAndSortBranding(videoID: VideoID, returnUserID: boolean, fetchAll: boolean, dbTitles: TitleDBResult[],
dbThumbnails: ThumbnailDBResult[], dbSegments: BrandingSegmentDBResult[], dbCasualVotes: CasualVoteDBResult[],
ip: IPAddress, cache: { currentIP: Promise<HashedIP> | null }): Promise<BrandingResult> {
const shouldKeepTitles = shouldKeepSubmission(dbTitles, BrandingSubmissionType.Title, ip, cache);
@@ -176,11 +209,12 @@ async function filterAndSortBranding(videoID: VideoID, returnUserID: boolean, db
.map((r) => ({
title: r.title,
original: r.original === 1,
votes: r.votes + r.verification,
votes: r.votes + r.verification - r.downvotes,
locked: r.locked === 1,
UUID: r.UUID,
userID: returnUserID ? r.userID : undefined
}))
.filter((a) => fetchAll || a.votes >= 0 || a.locked)
.sort((a, b) => b.votes - a.votes)
.sort((a, b) => +b.locked - +a.locked) as TitleResult[];
@@ -191,17 +225,28 @@ async function filterAndSortBranding(videoID: VideoID, returnUserID: boolean, db
.map((r) => ({
timestamp: r.timestamp,
original: r.original === 1,
votes: r.votes,
votes: r.votes - r.downvotes,
locked: r.locked === 1,
UUID: r.UUID,
userID: returnUserID ? r.userID : undefined
})) as ThumbnailResult[];
}))
.filter((a) => (fetchAll && !a.original) || a.votes >= 1 || (a.votes >= 0 && !a.original) || a.locked) as ThumbnailResult[];
const casualDownvotes = dbCasualVotes.filter((r) => r.category === "downvote")[0];
const casualVotes = dbCasualVotes.filter((r) => r.category !== "downvote").map((r) => ({
id: r.category,
count: r.upvotes - (casualDownvotes?.upvotes ?? 0),
title: r.title
})).filter((a) => a.count > 0);
const videoDuration = dbSegments.filter(s => s.videoDuration !== 0)[0]?.videoDuration ?? null;
return {
titles,
thumbnails,
randomTime: findRandomTime(videoID, dbSegments),
videoDuration: dbSegments[0]?.videoDuration ?? null
casualVotes,
randomTime: findRandomTime(videoID, dbSegments, videoDuration),
videoDuration: videoDuration,
};
}
@@ -209,7 +254,7 @@ async function shouldKeepSubmission(submissions: BrandingDBSubmission[], type: B
cache: { currentIP: Promise<HashedIP> | null }): Promise<(_: unknown, index: number) => boolean> {
const shouldKeep = await Promise.all(submissions.map(async (s) => {
if (s.shadowHidden != Visibility.HIDDEN) return true;
if (s.shadowHidden === Visibility.VISIBLE) return true;
const table = type === BrandingSubmissionType.Title ? "titleVotes" : "thumbnailVotes";
const fetchData = () => privateDB.prepare("get", `SELECT "hashedIP" FROM "${table}" WHERE "UUID" = ?`,
[s.UUID], { useReplica: true }) as Promise<{ hashedIP: HashedIP }>;
@@ -218,9 +263,11 @@ async function shouldKeepSubmission(submissions: BrandingDBSubmission[], type: B
if (cache.currentIP === null) cache.currentIP = getHashCache((ip + config.globalSalt) as IPAddress);
const hashedIP = await cache.currentIP;
return submitterIP.hashedIP === hashedIP;
return submitterIP?.hashedIP === hashedIP;
} catch (e) {
// give up on shadow hide for now
Logger.error(`getBranding: Error while trying to find IP: ${e}`);
return false;
}
}));
@@ -228,7 +275,7 @@ async function shouldKeepSubmission(submissions: BrandingDBSubmission[], type: B
return (_, index) => shouldKeep[index];
}
export function findRandomTime(videoID: VideoID, segments: BrandingSegmentDBResult[]): number {
export function findRandomTime(videoID: VideoID, segments: BrandingSegmentDBResult[], videoDuration: number): number {
let randomTime = SeedRandom.alea(videoID)();
// Don't allow random times past 90% of the video if no endcard
@@ -238,7 +285,7 @@ export function findRandomTime(videoID: VideoID, segments: BrandingSegmentDBResu
if (segments.length === 0) return randomTime;
const videoDuration = segments[0].videoDuration || Math.max(...segments.map((s) => s.endTime));
videoDuration ||= Math.max(...segments.map((s) => s.endTime)); // use highest end time as a fallback here
// There are segments, treat this as a relative time in the chopped up video
const sorted = segments.sort((a, b) => a.startTime - b.startTime);
@@ -280,6 +327,7 @@ export async function getBranding(req: Request, res: Response) {
const videoID: VideoID = req.query.videoID as VideoID;
const service: Service = getService(req.query.service as string);
const returnUserID = req.query.returnUserID === "true";
const fetchAll = req.query.fetchAll === "true";
if (!videoID) {
return res.status(400).send("Missing parameter: videoID");
@@ -287,9 +335,13 @@ export async function getBranding(req: Request, res: Response) {
const ip = getIP(req);
try {
const result = await getVideoBranding(res, videoID, service, ip, returnUserID);
const result = await getVideoBranding(res, videoID, service, ip, returnUserID, fetchAll);
const status = result.titles.length > 0 || result.thumbnails.length > 0 ? 200 : 404;
await getEtag("branding", (videoID as string), service)
.then(etag => res.set("ETag", etag))
.catch(() => null);
const status = result.titles.length > 0 || result.thumbnails.length > 0 || result.casualVotes.length > 0 ? 200 : 404;
return res.status(status).json(result);
} catch (e) {
Logger.error(e as string);
@@ -307,9 +359,14 @@ export async function getBrandingByHashEndpoint(req: Request, res: Response) {
const service: Service = getService(req.query.service as string);
const ip = getIP(req);
const returnUserID = req.query.returnUserID === "true";
const fetchAll = req.query.fetchAll === "true";
try {
-const result = await getVideoBrandingByHash(hashPrefix, service, ip, returnUserID);
const result = await getVideoBrandingByHash(hashPrefix, service, ip, returnUserID, fetchAll);
await getEtag("brandingHash", (hashPrefix as string), service)
.then(etag => res.set("ETag", etag))
.catch(() => null);
const status = !isEmpty(result) ? 200 : 404;
return res.status(status).json(result);
@@ -317,4 +374,4 @@ export async function getBrandingByHashEndpoint(req: Request, res: Response) {
Logger.error(e as string);
return res.status(500).send([]);
}
}
}


@@ -4,10 +4,10 @@ import { db } from "../databases/databases";
import { Request, Response } from "express";
import axios from "axios";
import { Logger } from "../utils/logger";
-import { getCWSUsers } from "../utils/getCWSUsers";
import { getCWSUsers, getChromeUsers } from "../utils/getCWSUsers";
// A cache of the number of chrome web store users
-let chromeUsersCache = 0;
let chromeUsersCache = 30000;
let firefoxUsersCache = 0;
interface DBStatsData {
@@ -64,7 +64,7 @@ async function getStats(): Promise<DBStatsData> {
function updateExtensionUsers() {
const mozillaAddonsUrl = "https://addons.mozilla.org/api/v3/addons/addon/dearrow/";
-const chromeExtensionUrl = "https://chrome.google.com/webstore/detail/enamippconapkdmgfgjchkhakpfinmaj";
const chromeExtensionUrl = "https://chromewebstore.google.com/detail/dearrow-better-titles-and/enamippconapkdmgfgjchkhakpfinmaj";
const chromeExtId = "enamippconapkdmgfgjchkhakpfinmaj";
axios.get(mozillaAddonsUrl)
@@ -79,27 +79,4 @@ function updateExtensionUsers() {
getChromeUsers(chromeExtensionUrl)
.then(res => chromeUsersCache = res)
);
}
-/* istanbul ignore next */
-function getChromeUsers(chromeExtensionUrl: string): Promise<number> {
-return axios.get(chromeExtensionUrl)
-.then(res => {
-const body = res.data;
-// 2021-01-05
-// [...]<span><meta itemprop="interactionCount" content="UserDownloads:100.000+"/><meta itemprop="opera[...]
-const matchingString = '"UserDownloads:';
-const matchingStringLen = matchingString.length;
-const userDownloadsStartIndex = body.indexOf(matchingString);
-/* istanbul ignore else */
-if (userDownloadsStartIndex >= 0) {
-const closingQuoteIndex = body.indexOf('"', userDownloadsStartIndex + matchingStringLen);
-const userDownloadsStr = body.substr(userDownloadsStartIndex + matchingStringLen, closingQuoteIndex - userDownloadsStartIndex).replace(",", "").replace(".", "");
-return parseInt(userDownloadsStr);
-}
-})
-.catch(/* istanbul ignore next */ () => {
-Logger.debug(`Failing to connect to ${chromeExtensionUrl}`);
-return 0;
-});
-}
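The scraper removed above located the `UserDownloads:` token by hand with `indexOf`/`substr`. The same extraction can be sketched with a regex (a hypothetical equivalent, not the new `getCWSUsers` util); note that a global character class strips every thousands separator, whereas chained single `.replace(",", "")` calls only strip the first occurrence.

```typescript
// Extract a count like `UserDownloads:100.000+` from store-page HTML.
function parseUserDownloads(body: string): number | null {
    const match = body.match(/"UserDownloads:([\d.,]+)\+?"/);
    return match ? parseInt(match[1].replace(/[.,]/g, ""), 10) : null;
}
```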


@@ -27,10 +27,11 @@ export async function getChapterNames(req: Request, res: Response): Promise<Resp
FROM "videoInfo"
WHERE "channelID" = ?
) AND "description" != ''
AND similarity("description", ?) >= 0.1
GROUP BY "description"
ORDER BY SUM("votes"), similarity("description", ?) DESC
LIMIT 5;`
-, [channelID, description]) as { description: string }[];
, [channelID, description, description]) as { description: string }[];
if (descriptions?.length > 0) {
return res.status(200).json(descriptions.map(d => ({
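The query above leans on Postgres's `similarity()` from the pg_trgm extension, which scores two strings by their shared three-character groups. A toy TypeScript version conveys the idea (pg_trgm additionally pads and normalizes words, so real scores will differ):

```typescript
// Collect the distinct three-character substrings of a string.
function trigrams(s: string): Set<string> {
    const result = new Set<string>();
    for (let i = 0; i + 3 <= s.length; i++) result.add(s.substring(i, i + 3));
    return result;
}

// Jaccard-style score: shared trigrams over the union of trigrams.
function similarity(a: string, b: string): number {
    const ta = trigrams(a);
    const tb = trigrams(b);
    let shared = 0;
    for (const t of ta) if (tb.has(t)) shared++;
    const union = ta.size + tb.size - shared;
    return union === 0 ? 0 : shared / union;
}
```

With this scoring, the `>= 0.1` filter in the query drops descriptions that barely resemble the submitted one before ranking the rest.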

src/routes/getConfig.ts (new file, 35 lines)

@@ -0,0 +1,35 @@
import { getHashCache } from "../utils/getHashCache";
import { Request, Response } from "express";
import { isUserVIP } from "../utils/isUserVIP";
import { UserID } from "../types/user.model";
import { Logger } from "../utils/logger";
import { getServerConfig } from "../utils/serverConfig";
export async function getConfigEndpoint(req: Request, res: Response): Promise<Response> {
const userID = req.query.userID as string;
const key = req.query.key as string;
if (!userID || !key) {
// invalid request
return res.sendStatus(400);
}
// hash the userID
const hashedUserID = await getHashCache(userID as UserID);
const isVIP = (await isUserVIP(hashedUserID));
if (!isVIP) {
// not authorized
return res.sendStatus(403);
}
try {
return res.status(200).json({
value: await getServerConfig(key)
});
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
}
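`getConfigEndpoint` never compares the raw userID; it hashes it first via `getHashCache` before the VIP check. A minimal sketch with a single SHA-256 pass (the real helper applies the hash repeatedly and caches the result, so the actual digests differ):

```typescript
import { createHash } from "crypto";

// Hash the private userID before any database lookup.
function hashUserID(userID: string): string {
    return createHash("sha256").update(userID).digest("hex");
}
```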

src/routes/getMetrics.ts (new file, 106 lines)

@@ -0,0 +1,106 @@
import { db, privateDB } from "../databases/databases";
import { Request, Response } from "express";
import os from "os";
import redis, { getRedisStats } from "../utils/redis";
import { Postgres } from "../databases/Postgres";
import { Server } from "http";
export async function getMetrics(req: Request, res: Response, server: Server): Promise<Response> {
const redisStats = getRedisStats();
return res.type("text").send([
`# HELP sb_uptime Uptime of this instance`,
`# TYPE sb_uptime counter`,
`sb_uptime ${process.uptime()}`,
`# HELP sb_db_version The version of the database`,
`# TYPE sb_db_version counter`,
`sb_db_version ${await db.prepare("get", "SELECT key, value FROM config where key = ?", ["version"]).then(e => e.value).catch(() => -1)}`,
`# HELP sb_start_time The time this instance was started`,
`# TYPE sb_start_time gauge`,
`sb_start_time ${Date.now()}`,
`# HELP sb_loadavg_5 The 5 minute load average of the system`,
`# TYPE sb_loadavg_5 gauge`,
`sb_loadavg_5 ${os.loadavg()[0]}`,
`# HELP sb_loadavg_15 The 15 minute load average of the system`,
`# TYPE sb_loadavg_15 gauge`,
`sb_loadavg_15 ${os.loadavg()[1]}`,
`# HELP sb_connections The number of connections to this instance`,
`# TYPE sb_connections gauge`,
`sb_connections ${await new Promise((resolve) => server.getConnections((_, count) => resolve(count)) as any)}`,
`# HELP sb_status_requests The number of status requests made to this instance`,
`# TYPE sb_status_requests gauge`,
`sb_status_requests ${await redis.increment("statusRequest").then(e => e[0]).catch(() => -1)}`,
`# HELP sb_postgres_active_requests The number of active requests to the postgres database`,
`# TYPE sb_postgres_active_requests gauge`,
`sb_postgres_active_requests ${(db as Postgres)?.getStats?.()?.activeRequests ?? -1}`,
`# HELP sb_postgres_avg_read_time The average read time of the postgres database`,
`# TYPE sb_postgres_avg_read_time gauge`,
`sb_postgres_avg_read_time ${(db as Postgres)?.getStats?.()?.avgReadTime ?? -1}`,
`# HELP sb_postgres_avg_write_time The average write time of the postgres database`,
`# TYPE sb_postgres_avg_write_time gauge`,
`sb_postgres_avg_write_time ${(db as Postgres)?.getStats?.()?.avgWriteTime ?? -1}`,
`# HELP sb_postgres_avg_failed_time The average failed time of the postgres database`,
`# TYPE sb_postgres_avg_failed_time gauge`,
`sb_postgres_avg_failed_time ${(db as Postgres)?.getStats?.()?.avgFailedTime ?? -1}`,
`# HELP sb_postgres_pool_total The total number of connections in the postgres pool`,
`# TYPE sb_postgres_pool_total gauge`,
`sb_postgres_pool_total ${(db as Postgres)?.getStats?.()?.pool?.total ?? -1}`,
`# HELP sb_postgres_pool_idle The number of idle connections in the postgres pool`,
`# TYPE sb_postgres_pool_idle gauge`,
`sb_postgres_pool_idle ${(db as Postgres)?.getStats?.()?.pool?.idle ?? -1}`,
`# HELP sb_postgres_pool_waiting The number of connections waiting in the postgres pool`,
`# TYPE sb_postgres_pool_waiting gauge`,
`sb_postgres_pool_waiting ${(db as Postgres)?.getStats?.()?.pool?.waiting ?? -1}`,
`# HELP sb_postgres_private_active_requests The number of active requests to the private postgres database`,
`# TYPE sb_postgres_private_active_requests gauge`,
`sb_postgres_private_active_requests ${(privateDB as Postgres)?.getStats?.()?.activeRequests ?? -1}`,
`# HELP sb_postgres_private_avg_read_time The average read time of the private postgres database`,
`# TYPE sb_postgres_private_avg_read_time gauge`,
`sb_postgres_private_avg_read_time ${(privateDB as Postgres)?.getStats?.()?.avgReadTime ?? -1}`,
`# HELP sb_postgres_private_avg_write_time The average write time of the private postgres database`,
`# TYPE sb_postgres_private_avg_write_time gauge`,
`sb_postgres_private_avg_write_time ${(privateDB as Postgres)?.getStats?.()?.avgWriteTime ?? -1}`,
`# HELP sb_postgres_private_avg_failed_time The average failed time of the private postgres database`,
`# TYPE sb_postgres_private_avg_failed_time gauge`,
`sb_postgres_private_avg_failed_time ${(privateDB as Postgres)?.getStats?.()?.avgFailedTime ?? -1}`,
`# HELP sb_postgres_private_pool_total The total number of connections in the private postgres pool`,
`# TYPE sb_postgres_private_pool_total gauge`,
`sb_postgres_private_pool_total ${(privateDB as Postgres)?.getStats?.()?.pool?.total ?? -1}`,
`# HELP sb_postgres_private_pool_idle The number of idle connections in the private postgres pool`,
`# TYPE sb_postgres_private_pool_idle gauge`,
`sb_postgres_private_pool_idle ${(privateDB as Postgres)?.getStats?.()?.pool?.idle ?? -1}`,
`# HELP sb_postgres_private_pool_waiting The number of connections waiting in the private postgres pool`,
`# TYPE sb_postgres_private_pool_waiting gauge`,
`sb_postgres_private_pool_waiting ${(privateDB as Postgres)?.getStats?.()?.pool?.waiting ?? -1}`,
`# HELP sb_redis_active_requests The number of active requests to redis`,
`# TYPE sb_redis_active_requests gauge`,
`sb_redis_active_requests ${redisStats.activeRequests}`,
`# HELP sb_redis_write_requests The number of write requests to redis`,
`# TYPE sb_redis_write_requests gauge`,
`sb_redis_write_requests ${redisStats.writeRequests}`,
`# HELP sb_redis_avg_read_time The average read time of redis`,
`# TYPE sb_redis_avg_read_time gauge`,
`sb_redis_avg_read_time ${redisStats?.avgReadTime}`,
`# HELP sb_redis_avg_write_time The average write time of redis`,
`# TYPE sb_redis_avg_write_time gauge`,
`sb_redis_avg_write_time ${redisStats.avgWriteTime}`,
`# HELP sb_redis_memory_cache_hits The cache hit ratio in redis`,
`# TYPE sb_redis_memory_cache_hits gauge`,
`sb_redis_memory_cache_hits ${redisStats.memoryCacheHits}`,
`# HELP sb_redis_memory_cache_total_hits The cache hit ratio in redis including uncached items`,
`# TYPE sb_redis_memory_cache_total_hits gauge`,
`sb_redis_memory_cache_total_hits ${redisStats.memoryCacheTotalHits}`,
`# HELP sb_redis_memory_cache_length The length of the memory cache in redis`,
`# TYPE sb_redis_memory_cache_length gauge`,
`sb_redis_memory_cache_length ${redisStats.memoryCacheLength}`,
`# HELP sb_redis_memory_cache_size The size of the memory cache in redis`,
`# TYPE sb_redis_memory_cache_size gauge`,
`sb_redis_memory_cache_size ${redisStats.memoryCacheSize}`,
`# HELP sb_redis_last_invalidation The time of the last successful invalidation in redis`,
`# TYPE sb_redis_last_invalidation gauge`,
`sb_redis_last_invalidation ${redisStats.lastInvalidation}`,
`# HELP sb_redis_last_invalidation_message The time of the last invalidation message in redis`,
`# TYPE sb_redis_last_invalidation_message gauge`,
`sb_redis_last_invalidation_message ${redisStats.lastInvalidationMessage}`,
].join("\n"));
}
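Each metric above is three lines of the Prometheus text exposition format: a `# HELP` comment, a `# TYPE` comment, then the sample itself. A small builder shows the shape (illustrative only; the route writes the lines by hand):

```typescript
// Render one metric in Prometheus text exposition format.
function promMetric(name: string, help: string, type: "counter" | "gauge", value: number): string {
    return [
        `# HELP ${name} ${help}`,
        `# TYPE ${name} ${type}`,
        `${name} ${value}`,
    ].join("\n");
}
```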

src/routes/getReady.ts (new file, 26 lines)

@@ -0,0 +1,26 @@
import { Request, Response } from "express";
import { Server } from "http";
import { config } from "../config";
import { getRedisStats } from "../utils/redis";
import { Postgres } from "../databases/Postgres";
import { db } from "../databases/databases";
export async function getReady(req: Request, res: Response, server: Server): Promise<Response> {
const connections = await new Promise((resolve) => server.getConnections((_, count) => resolve(count))) as number;
const redisStats = getRedisStats();
const postgresStats = (db as Postgres).getStats?.();
if (!connections
|| (connections < config.maxConnections
&& (!config.redis || redisStats.activeRequests < config.redis.maxConnections * 0.8)
&& (!config.redis || redisStats.activeRequests < 1 || redisStats.avgReadTime < config.maxResponseTime
|| (redisStats.memoryCacheSize < config.redis.clientCacheSize * 0.8 && redisStats.avgReadTime < config.maxResponseTimeWhileLoadingCache))
&& (!config.postgres || postgresStats.activeRequests < config.postgres.maxActiveRequests * 0.8)
&& (!config.postgres || postgresStats.avgReadTime < config.maxResponseTime
|| (redisStats.memoryCacheSize < config.redis.clientCacheSize * 0.8 && postgresStats.avgReadTime < config.maxResponseTimeWhileLoadingCache)))) {
return res.sendStatus(200);
} else {
return res.sendStatus(500);
}
}
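The condition in `getReady` can be read as: an instance with no connections is always ready; otherwise every backend needs headroom before the load balancer gets a 200. A simplified predicate in that spirit (the field and threshold names here are illustrative stand-ins, not the real config keys, and the cache-loading escape hatches are omitted):

```typescript
interface ReadyStats {
    connections: number;
    redisActiveRequests: number;
    postgresAvgReadTime: number;
}

interface ReadyLimits {
    maxConnections: number;
    maxRedisConnections: number;
    maxResponseTime: number;
}

// Ready when idle, or when connections and backend load are under limits.
function isReady(stats: ReadyStats, limits: ReadyLimits): boolean {
    if (!stats.connections) return true; // nothing in flight yet
    return stats.connections < limits.maxConnections
        && stats.redisActiveRequests < limits.maxRedisConnections * 0.8
        && stats.postgresAvgReadTime < limits.maxResponseTime;
}
```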


@@ -0,0 +1,22 @@
import { db } from "../databases/databases";
import { Request, Response } from "express";
import { getService } from "../utils/getService";
export async function getSegmentID(req: Request, res: Response): Promise<Response> {
const partialUUID = req.query?.UUID;
const videoID = req.query?.videoID;
const service = getService(req.query?.service as string);
if (!partialUUID || !videoID) {
//invalid request
return res.sendStatus(400);
}
const data = await db.prepare("get", `SELECT "UUID" from "sponsorTimes" WHERE "UUID" LIKE ? AND "videoID" = ? AND "service" = ?`, [`${partialUUID}%`, videoID, service]);
if (data) {
return res.status(200).send(data.UUID);
} else {
return res.sendStatus(404);
}
}


@@ -2,7 +2,7 @@ import { Request, Response } from "express";
import { partition } from "lodash";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
-import { skipSegmentsHashKey, skipSegmentsKey, skipSegmentGroupsKey, shadowHiddenIPKey } from "../utils/redisKeys";
import { skipSegmentsHashKey, skipSegmentsKey, skipSegmentGroupsKey, shadowHiddenIPKey, skipSegmentsLargerHashKey } from "../utils/redisKeys";
import { SBRecord } from "../types/lib.model";
import { ActionType, Category, DBSegment, HashedIP, IPAddress, OverlappingSegmentGroup, Segment, SegmentCache, SegmentUUID, Service, VideoData, VideoID, VideoIDHash, Visibility, VotableObject } from "../types/segments.model";
import { getHashCache } from "../utils/getHashCache";
@@ -14,6 +14,9 @@ import { getService } from "../utils/getService";
import { promiseOrTimeout } from "../utils/promise";
import { parseSkipSegments } from "../utils/parseSkipSegments";
import { getEtag } from "../middleware/etag";
import { shuffleArray } from "../utils/array";
import { Postgres } from "../databases/Postgres";
import { getRedisStats } from "../utils/redis";
async function prepareCategorySegments(req: Request, videoID: VideoID, service: Service, segments: DBSegment[], cache: SegmentCache = { shadowHiddenSegmentIPs: {} }, useCache: boolean): Promise<Segment[]> {
const shouldFilter: boolean[] = await Promise.all(segments.map(async (segment) => {
@@ -21,7 +24,9 @@ async function prepareCategorySegments(req: Request, videoID: VideoID, service:
return true; //required - always send
}
-if (segment.hidden || segment.votes < -1) {
if (segment.hidden
|| segment.votes < -1
|| segment.shadowHidden === Visibility.MORE_HIDDEN) {
return false; //too untrustworthy, just ignore it
}
@@ -41,20 +46,41 @@ async function prepareCategorySegments(req: Request, videoID: VideoID, service:
const fetchData = () => privateDB.prepare("all", 'SELECT "hashedIP" FROM "sponsorTimes" WHERE "videoID" = ? AND "timeSubmitted" = ? AND "service" = ?',
[videoID, segment.timeSubmitted, service], { useReplica: true }) as Promise<{ hashedIP: HashedIP }[]>;
try {
-cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted] = await promiseOrTimeout(QueryCacher.get(fetchData, shadowHiddenIPKey(videoID, segment.timeSubmitted, service)), 150);
if (db.highLoad() || privateDB.highLoad()) {
Logger.error("High load, not handling shadowhide");
if (db instanceof Postgres && privateDB instanceof Postgres) {
Logger.error(`Postgres stats: ${JSON.stringify(db.getStats())}`);
Logger.error(`Postgres private stats: ${JSON.stringify(privateDB.getStats())}`);
}
Logger.error(`Redis stats: ${JSON.stringify(getRedisStats())}`);
return false;
}
cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted] = promiseOrTimeout(QueryCacher.get(fetchData, shadowHiddenIPKey(videoID, segment.timeSubmitted, service)), 150);
} catch (e) {
// give up on shadowhide for now
cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted] = null;
}
}
-const ipList = cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted];
let ipList = [];
try {
ipList = await cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted];
} catch (e) {
Logger.error(`skipSegments: Error while trying to find IP: ${e}`);
if (db instanceof Postgres && privateDB instanceof Postgres) {
Logger.error(`Postgres stats: ${JSON.stringify(db.getStats())}`);
Logger.error(`Postgres private stats: ${JSON.stringify(privateDB.getStats())}`);
}
return false;
}
if (ipList?.length > 0 && cache.userHashedIP === undefined) {
cache.userHashedIP = await cache.userHashedIPPromise;
}
//if this isn't their ip, don't send it to them
-const shouldShadowHide = cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted]?.some(
const shouldShadowHide = ipList?.some(
(shadowHiddenSegment) => shadowHiddenSegment.hashedIP === cache.userHashedIP) ?? false;
if (shouldShadowHide) useCache = false;
@@ -124,7 +150,7 @@ async function getSegmentsByVideoID(req: Request, videoID: VideoID, categories:
}
async function getSegmentsByHash(req: Request, hashedVideoIDPrefix: VideoIDHash, categories: Category[],
-actionTypes: ActionType[], requiredSegments: SegmentUUID[], service: Service): Promise<SBRecord<VideoID, VideoData>> {
actionTypes: ActionType[], trimUUIDs: number, requiredSegments: SegmentUUID[], service: Service): Promise<SBRecord<VideoID, VideoData>> {
const cache: SegmentCache = { shadowHiddenSegmentIPs: {} };
const segments: SBRecord<VideoID, VideoData> = {};
@@ -156,13 +182,32 @@ async function getSegmentsByHash(req: Request, hashedVideoIDPrefix: VideoIDHash,
};
const canUseCache = requiredSegments.length === 0;
-data.segments = (await prepareCategorySegments(req, videoID as VideoID, service, videoData.segments, cache, canUseCache))
-.filter((segment: Segment) => categories.includes(segment?.category) && actionTypes.includes(segment?.actionType))
const filteredSegments = (await prepareCategorySegments(req, videoID as VideoID, service, videoData.segments, cache, canUseCache))
.filter((segment: Segment) => categories.includes(segment?.category) && actionTypes.includes(segment?.actionType));
// Make sure no hash duplicates exist
if (trimUUIDs) {
const seen = new Set<string>();
for (const segment of filteredSegments) {
const shortUUID = segment.UUID.substring(0, trimUUIDs);
if (seen.has(shortUUID)) {
// Duplicate found, disable trimming
trimUUIDs = undefined;
break;
}
seen.add(shortUUID);
}
seen.clear();
}
data.segments = filteredSegments
.map((segment) => ({
category: segment.category,
actionType: segment.actionType,
segment: segment.segment,
-UUID: segment.UUID,
UUID: trimUUIDs ? segment.UUID.substring(0, trimUUIDs) as SegmentUUID : segment.UUID,
videoDuration: segment.videoDuration,
locked: segment.locked,
votes: segment.votes,
@@ -183,7 +228,7 @@ async function getSegmentsByHash(req: Request, hashedVideoIDPrefix: VideoIDHash,
return segments;
} catch (err) /* istanbul ignore next */ {
-Logger.error(err as string);
Logger.error(`get segments by hash error: ${err}`);
return null;
}
}
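The trimming rule introduced above can be isolated into a small helper: shorten every UUID to the requested prefix length unless any two prefixes collide, in which case fall back to the full UUIDs. A standalone sketch of that rule (not the route's code, which works on segment objects):

```typescript
// Trim each UUID to `length` characters, but only when every trimmed
// prefix stays unique; on any collision, return the full UUIDs untouched.
function trimIfUnique(uuids: string[], length: number): string[] {
    const seen = new Set<string>();
    for (const uuid of uuids) {
        const short = uuid.substring(0, length);
        if (seen.has(short)) return uuids; // duplicate found, disable trimming
        seen.add(short);
    }
    return uuids.map((uuid) => uuid.substring(0, length));
}
```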
@@ -200,6 +245,8 @@ async function getSegmentsFromDBByHash(hashedVideoIDPrefix: VideoIDHash, service
if (hashedVideoIDPrefix.length === 4) {
return await QueryCacher.get(fetchFromDB, skipSegmentsHashKey(hashedVideoIDPrefix, service));
} else if (hashedVideoIDPrefix.length === 5) {
return await QueryCacher.get(fetchFromDB, skipSegmentsLargerHashKey(hashedVideoIDPrefix, service));
}
return await fetchFromDB();
@@ -218,11 +265,11 @@ async function getSegmentsFromDBByVideoID(videoID: VideoID, service: Service): P
return await QueryCacher.get(fetchFromDB, skipSegmentsKey(videoID, service));
}
-// Gets a weighted random choice from the choices array based on their `votes` property.
// Gets the best choice from the choices array based on their `votes` property.
// amountOfChoices specifies the maximum amount of choices to return, 1 or more.
// Choices are unique
// If a predicate is given, it will only filter choices following it, and will leave the rest in the list
-function getWeightedRandomChoice<T extends VotableObject>(choices: T[], amountOfChoices: number, filterLocked = false, predicate?: (choice: T) => void): T[] {
function getBestChoice<T extends VotableObject>(choices: T[], amountOfChoices: number, filterLocked = false, predicate?: (choice: T) => void): T[] {
//trivial case: no need to go through the whole process
if (amountOfChoices >= choices.length) {
return choices;
@@ -245,39 +292,22 @@ function getWeightedRandomChoice<T extends VotableObject>(choices: T[], amountOf
}
//assign a weight to each choice
-let totalWeight = 0;
-const choicesWithWeights: TWithWeight[] = filteredChoices.map(choice => {
-const boost = Math.min(choice.reputation, 4);
-//The 3 makes -2 the minimum votes before being ignored completely
-//this can be changed if this system increases in popularity.
-const repFactor = choice.votes > 0 ? Math.max(1, choice.reputation + 1) : 1;
-const weight = Math.exp(choice.votes * repFactor + 3 + boost);
-totalWeight += Math.max(weight, 0);
const choicesWithWeights: TWithWeight[] = shuffleArray(filteredChoices.map(choice => {
const boost = choice.reputation;
const weight = choice.votes + boost;
return { ...choice, weight };
-});
})).sort((a, b) => b.weight - a.weight);
// Nothing to filter for
if (amountOfChoices >= choicesWithWeights.length) {
return [...forceIncludedChoices, ...filteredChoices];
}
-//iterate and find amountOfChoices choices
// Pick the top options
const chosen = [...forceIncludedChoices];
-while (amountOfChoices-- > 0) {
-//weighted random draw of one element of choices
-const randomNumber = Math.random() * totalWeight;
-let stackWeight = choicesWithWeights[0].weight;
-let i = 0;
-while (stackWeight < randomNumber) {
-stackWeight += choicesWithWeights[++i].weight;
-}
-//add it to the chosen ones and remove it from the choices before the next iteration
for (let i = 0; i < amountOfChoices; i++) {
chosen.push(choicesWithWeights[i]);
-totalWeight -= choicesWithWeights[i].weight;
-choicesWithWeights.splice(i, 1);
}
return chosen;
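The new `getBestChoice` shuffles first and then sorts by weight, so choices with equal scores are tie-broken randomly while the highest-scoring ones always win. A condensed sketch of that approach (it omits `forceIncludedChoices`, the lock filter, and the repo's `shuffleArray` util, using an inline Fisher–Yates shuffle instead):

```typescript
type Votable = { votes: number; reputation: number };

// Fisher–Yates shuffle, standing in for the repo's shuffleArray util.
function shuffle<T>(items: T[]): T[] {
    const result = [...items];
    for (let i = result.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1));
        [result[i], result[j]] = [result[j], result[i]];
    }
    return result;
}

// Shuffle to randomize ties, then take the `count` highest-weighted choices.
function bestChoices<T extends Votable>(choices: T[], count: number): T[] {
    return shuffle(choices)
        .map((choice) => ({ choice, weight: choice.votes + choice.reputation }))
        .sort((a, b) => b.weight - a.weight)
        .slice(0, count)
        .map((entry) => entry.choice);
}
```

Because distinct weights sort deterministically, only equal-weight choices are affected by the shuffle.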
@@ -286,20 +316,20 @@ function getWeightedRandomChoice<T extends VotableObject>(choices: T[], amountOf
async function chooseSegments(videoID: VideoID, service: Service, segments: DBSegment[], useCache: boolean): Promise<DBSegment[]> {
const fetchData = async () => await buildSegmentGroups(segments);
-const groups = useCache
const groups = useCache && config.useCacheForSegmentGroups
? await QueryCacher.get(fetchData, skipSegmentGroupsKey(videoID, service))
: await fetchData();
// Filter for only 1 item for POI categories and Full video
-let chosenGroups = getWeightedRandomChoice(groups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Full);
-chosenGroups = getWeightedRandomChoice(chosenGroups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Poi);
-return chosenGroups.map(//randomly choose 1 good segment per group and return them
-group => getWeightedRandomChoice(group.segments, 1)[0]
let chosenGroups = getBestChoice(groups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Full);
chosenGroups = getBestChoice(chosenGroups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Poi);
return chosenGroups.map(// choose 1 good segment per group and return them
group => getBestChoice(group.segments, 1)[0]
);
}
//This function will find segments that are contained inside of eachother, called similar segments
-//Only one similar time will be returned, randomly generated based on the sqrt of votes.
//Only one similar time will be returned, based on its score
//This allows new less voted items to still sometimes appear to give them a chance at getting votes.
//Segments with less than -1 votes are already ignored before this function is called
async function buildSegmentGroups(segments: DBSegment[]): Promise<OverlappingSegmentGroup[]> {
@@ -413,7 +443,7 @@ async function getSkipSegments(req: Request, res: Response): Promise<Response> {
await getEtag("skipSegments", (videoID as string), service)
.then(etag => res.set("ETag", etag))
-.catch(() => null);
.catch(() => ({}));
return res.send(segments);
}


@@ -17,13 +17,14 @@ export async function getSkipSegmentsByHash(req: Request, res: Response): Promis
if (parseResult.errors.length > 0) {
return res.status(400).send(parseResult.errors);
}
-const { categories, actionTypes, requiredSegments, service } = parseResult;
const { categories, actionTypes, trimUUIDs, requiredSegments, service } = parseResult;
// Get all video id's that match hash prefix
-const segments = await getSegmentsByHash(req, hashPrefix, categories, actionTypes, requiredSegments, service);
const segments = await getSegmentsByHash(req, hashPrefix, categories, actionTypes, trimUUIDs, requiredSegments, service);
try {
-await getEtag("skipSegmentsHash", hashPrefix, service)
const hashKey = hashPrefix.length === 4 ? "skipSegmentsHash" : "skipSegmentsLargerHash";
await getEtag(hashKey, hashPrefix, service)
.then(etag => res.set("ETag", etag))
.catch(/* istanbul ignore next */ () => null);
const output = Object.entries(segments).map(([videoID, data]) => ({


@@ -5,15 +5,16 @@ import os from "os";
import redis, { getRedisStats } from "../utils/redis";
import { promiseOrTimeout } from "../utils/promise";
import { Postgres } from "../databases/Postgres";
import { Server } from "http";
-export async function getStatus(req: Request, res: Response): Promise<Response> {
export async function getStatus(req: Request, res: Response, server: Server): Promise<Response> {
const startTime = Date.now();
let value = req.params.value as string[] | string;
value = Array.isArray(value) ? value[0] : value;
let processTime, redisProcessTime = -1;
try {
const dbStartTime = Date.now();
-const dbVersion = await promiseOrTimeout(db.prepare("get", "SELECT key, value FROM config where key = ?", ["version"]), 5000)
const dbVersion = await promiseOrTimeout(db.prepare("get", "SELECT key, value FROM config where key = ?", ["version"]), 1000)
.then(e => {
processTime = Date.now() - dbStartTime;
return e.value;
@@ -24,12 +25,12 @@ export async function getStatus(req: Request, res: Response): Promise<Response>
});
let statusRequests: unknown = 0;
const redisStartTime = Date.now();
-const numberRequests = await promiseOrTimeout(redis.increment("statusRequest"), 5000)
const numberRequests = await promiseOrTimeout(redis.increment("statusRequest"), 1000)
.then(e => {
redisProcessTime = Date.now() - redisStartTime;
return e;
}).catch(e => /* istanbul ignore next */ {
-Logger.error(`status: redis increment timed out ${e}`);
Logger.error(`status: redis increment timed out ${e}\nload: ${os.loadavg().slice(1)} with ${JSON.stringify(getRedisStats())}\n${JSON.stringify((db as Postgres)?.getStats?.())}`);
return [-1];
});
statusRequests = numberRequests?.[0];
@@ -42,6 +43,7 @@ export async function getStatus(req: Request, res: Response): Promise<Response>
processTime,
redisProcessTime,
loadavg: os.loadavg().slice(1), // only return 5 & 15 minute load average
connections: await new Promise((resolve) => server.getConnections((_, count) => resolve(count))),
statusRequests,
hostname: os.hostname(),
postgresStats: (db as Postgres)?.getStats?.(),


@@ -3,7 +3,7 @@ import { config } from "../config";
import { Request, Response } from "express";
import axios from "axios";
import { Logger } from "../utils/logger";
-import { getCWSUsers } from "../utils/getCWSUsers";
import { getCWSUsers, getChromeUsers } from "../utils/getCWSUsers";
// A cache of the number of chrome web store users
let chromeUsersCache = 0;
@@ -97,29 +97,4 @@ function updateExtensionUsers() {
getChromeUsers(chromeExtensionUrl)
.then(res => chromeUsersCache = res)
);
}
-/* istanbul ignore next */
-function getChromeUsers(chromeExtensionUrl: string): Promise<number> {
-return axios.get(chromeExtensionUrl)
-.then(res => {
-const body = res.data;
-// 2021-01-05
-// [...]<span><meta itemprop="interactionCount" content="UserDownloads:100.000+"/><meta itemprop="opera[...]
-const matchingString = '"UserDownloads:';
-const matchingStringLen = matchingString.length;
-const userDownloadsStartIndex = body.indexOf(matchingString);
-/* istanbul ignore else */
-if (userDownloadsStartIndex >= 0) {
-const closingQuoteIndex = body.indexOf('"', userDownloadsStartIndex + matchingStringLen);
-const userDownloadsStr = body.substr(userDownloadsStartIndex + matchingStringLen, closingQuoteIndex - userDownloadsStartIndex).replace(",", "").replace(".", "");
-return parseInt(userDownloadsStr);
-} else {
-lastUserCountCheck = 0;
-}
-})
-.catch(/* istanbul ignore next */ () => {
-Logger.debug(`Failing to connect to ${chromeExtensionUrl}`);
-return 0;
-});
-}


@@ -1,4 +1,4 @@
-import { db } from "../databases/databases";
import { db, privateDB } from "../databases/databases";
import { getHashCache } from "../utils/getHashCache";
import { isUserVIP } from "../utils/isUserVIP";
import { Request, Response } from "express";
@@ -144,6 +144,16 @@ async function getThumbnailSubmissionCount(userID: HashedUserID): Promise<number
}
}
async function getCasualSubmissionCount(userID: HashedUserID): Promise<number> {
try {
const row = await privateDB.prepare("get", `SELECT COUNT(DISTINCT "videoID") as "casualSubmissionCount" FROM "casualVotes" WHERE "userID" = ?`, [userID], { useReplica: true });
return row?.casualSubmissionCount ?? 0;
} catch (err) /* istanbul ignore next */ {
return null;
}
}
type cases = Record<string, any>
const executeIfFunction = (f: any) =>
@@ -173,6 +183,7 @@ const dbGetValue = (userID: HashedUserID, property: string): Promise<string|Segm
freeChaptersAccess: () => true,
titleSubmissionCount: () => getTitleSubmissionCount(userID),
thumbnailSubmissionCount: () => getThumbnailSubmissionCount(userID),
casualSubmissionCount: () => getCasualSubmissionCount(userID),
})("")(property);
};
@@ -183,7 +194,7 @@ async function getUserInfo(req: Request, res: Response): Promise<Response> {
"viewCount", "ignoredViewCount", "warnings", "warningReason", "reputation",
"vip", "lastSegmentID"];
const allProperties: string[] = [...defaultProperties, "banned", "permissions", "freeChaptersAccess",
-"ignoredSegmentCount", "titleSubmissionCount", "thumbnailSubmissionCount", "deArrowWarningReason"];
"ignoredSegmentCount", "titleSubmissionCount", "thumbnailSubmissionCount", "casualSubmissionCount", "deArrowWarningReason"];
let paramValues: string[] = req.query.values
? JSON.parse(req.query.values as string)
: req.query.value


@@ -1,27 +1,28 @@
import { Request, Response } from "express";
import { db } from "../databases/databases";
-import { videoLabelsHashKey, videoLabelsKey } from "../utils/redisKeys";
import { videoLabelsHashKey, videoLabelsKey, videoLabelsLargerHashKey } from "../utils/redisKeys";
import { SBRecord } from "../types/lib.model";
-import { DBSegment, Segment, Service, VideoData, VideoID, VideoIDHash } from "../types/segments.model";
import { ActionType, Category, DBSegment, Service, VideoID, VideoIDHash } from "../types/segments.model";
import { Logger } from "../utils/logger";
import { QueryCacher } from "../utils/queryCacher";
import { getService } from "../utils/getService";
-function transformDBSegments(segments: DBSegment[]): Segment[] {
interface FullVideoSegment {
category: Category;
}
interface FullVideoSegmentVideoData {
segments: FullVideoSegment[];
hasStartSegment: boolean;
}
function transformDBSegments(segments: DBSegment[]): FullVideoSegment[] {
return segments.map((chosenSegment) => ({
-category: chosenSegment.category,
-actionType: chosenSegment.actionType,
-segment: [chosenSegment.startTime, chosenSegment.endTime],
-UUID: chosenSegment.UUID,
-locked: chosenSegment.locked,
-votes: chosenSegment.votes,
-videoDuration: chosenSegment.videoDuration,
-userID: chosenSegment.userID,
-description: chosenSegment.description
category: chosenSegment.category
}));
}
-async function getLabelsByVideoID(videoID: VideoID, service: Service): Promise<Segment[]> {
async function getLabelsByVideoID(videoID: VideoID, service: Service): Promise<FullVideoSegmentVideoData> {
try {
const segments: DBSegment[] = await getSegmentsFromDBByVideoID(videoID, service);
return chooseSegment(segments);
@@ -33,8 +34,8 @@ async function getLabelsByVideoID(videoID: VideoID, service: Service): Promise<S
}
}
async function getLabelsByHash(hashedVideoIDPrefix: VideoIDHash, service: Service): Promise<SBRecord<VideoID, VideoData>> {
const segments: SBRecord<VideoID, VideoData> = {};
async function getLabelsByHash(hashedVideoIDPrefix: VideoIDHash, service: Service, checkHasStartSegment: boolean): Promise<SBRecord<VideoID, FullVideoSegmentVideoData>> {
const segments: SBRecord<VideoID, FullVideoSegmentVideoData> = {};
try {
type SegmentWithHashPerVideoID = SBRecord<VideoID, { hash: VideoIDHash, segments: DBSegment[] }>;
@@ -53,11 +54,13 @@ async function getLabelsByHash(hashedVideoIDPrefix: VideoIDHash, service: Servic
}, {});
for (const [videoID, videoData] of Object.entries(segmentPerVideoID)) {
const data: VideoData = {
segments: chooseSegment(videoData.segments),
const result = chooseSegment(videoData.segments);
const data: FullVideoSegmentVideoData = {
segments: result.segments,
hasStartSegment: checkHasStartSegment ? result.hasStartSegment : undefined
};
if (data.segments.length > 0) {
if (data.segments.length > 0 || (data.hasStartSegment && checkHasStartSegment)) {
segments[videoID] = data;
}
}
@@ -74,12 +77,14 @@ async function getSegmentsFromDBByHash(hashedVideoIDPrefix: VideoIDHash, service
.prepare(
"all",
`SELECT "startTime", "endTime", "videoID", "votes", "locked", "UUID", "userID", "category", "actionType", "hashedVideoID", "description" FROM "sponsorTimes"
WHERE "hashedVideoID" LIKE ? AND "service" = ? AND "actionType" = 'full' AND "hidden" = 0 AND "shadowHidden" = 0`,
WHERE "hashedVideoID" LIKE ? AND "service" = ? AND "hidden" = 0 AND "shadowHidden" = 0`,
[`${hashedVideoIDPrefix}%`, service]
) as Promise<DBSegment[]>;
if (hashedVideoIDPrefix.length === 3) {
return await QueryCacher.get(fetchFromDB, videoLabelsHashKey(hashedVideoIDPrefix, service));
} else if (hashedVideoIDPrefix.length === 4) {
return await QueryCacher.get(fetchFromDB, videoLabelsLargerHashKey(hashedVideoIDPrefix, service));
}
return await fetchFromDB();
@@ -90,22 +95,34 @@ async function getSegmentsFromDBByVideoID(videoID: VideoID, service: Service): P
.prepare(
"all",
`SELECT "startTime", "endTime", "votes", "locked", "UUID", "userID", "category", "actionType", "description" FROM "sponsorTimes"
WHERE "videoID" = ? AND "service" = ? AND "actionType" = 'full' AND "hidden" = 0 AND "shadowHidden" = 0`,
WHERE "videoID" = ? AND "service" = ? AND "hidden" = 0 AND "shadowHidden" = 0`,
[videoID, service]
) as Promise<DBSegment[]>;
return await QueryCacher.get(fetchFromDB, videoLabelsKey(videoID, service));
}
function chooseSegment<T extends DBSegment>(choices: T[]): Segment[] {
function chooseSegment<T extends DBSegment>(choices: T[]): FullVideoSegmentVideoData {
// filter out -2 segments
choices = choices.filter((segment) => segment.votes > -2);
const hasStartSegment = !!choices.some((segment) => segment.startTime < 5
&& (segment.actionType === ActionType.Skip || segment.actionType === ActionType.Mute));
choices = choices.filter((segment) => segment.actionType === ActionType.Full);
const results = [];
// trivial decisions
if (choices.length === 0) {
return [];
return {
segments: [],
hasStartSegment
};
} else if (choices.length === 1) {
return transformDBSegments(choices);
return {
segments: transformDBSegments(choices),
hasStartSegment
};
}
// if locked, only choose from locked
const locked = choices.filter((segment) => segment.locked);
@@ -114,7 +131,10 @@ function chooseSegment<T extends DBSegment>(choices: T[]): Segment[] {
}
// no need to filter, just one label
if (choices.length === 1) {
return transformDBSegments(choices);
return {
segments: transformDBSegments(choices),
hasStartSegment
};
}
// sponsor > exclusive > selfpromo
const findCategory = (category: string) => choices.find((segment) => segment.category === category);
@@ -122,25 +142,36 @@ function chooseSegment<T extends DBSegment>(choices: T[]): Segment[] {
const categoryResult = findCategory("sponsor") ?? findCategory("exclusive_access") ?? findCategory("selfpromo");
if (categoryResult) results.push(categoryResult);
return transformDBSegments(results);
return {
segments: transformDBSegments(results),
hasStartSegment
};
}
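The start-segment detection added to `chooseSegment` above can be sketched as a standalone predicate. The types and helper name below are illustrative, not the server's actual exports:

```typescript
// Hypothetical standalone version of the hasStartSegment check from
// chooseSegment: after dropping segments at -2 votes or below, a video
// "has a start segment" when any remaining skip or mute segment begins
// within the first five seconds.
type SegmentActionType = "skip" | "mute" | "full";

interface CandidateSegment {
    startTime: number;
    actionType: SegmentActionType;
    votes: number;
}

function hasStartSegment(segments: CandidateSegment[]): boolean {
    return segments
        .filter((s) => s.votes > -2) // same -2 vote filter as chooseSegment
        .some((s) => s.startTime < 5
            && (s.actionType === "skip" || s.actionType === "mute"));
}
```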
async function handleGetLabel(req: Request, res: Response): Promise<Segment[] | false> {
async function handleGetLabel(req: Request, res: Response): Promise<FullVideoSegmentVideoData | FullVideoSegment[] | false> {
const videoID = req.query.videoID as VideoID;
if (!videoID) {
res.status(400).send("videoID not specified");
return false;
}
const hasStartSegment = req.query.hasStartSegment === "true";
const service = getService(req.query.service, req.body.service);
const segments = await getLabelsByVideoID(videoID, service);
const segmentData = await getLabelsByVideoID(videoID, service);
const segments = segmentData.segments;
if (!segments || segments.length === 0) {
res.sendStatus(404);
return false;
}
return segments;
if (hasStartSegment) {
return segmentData;
} else {
return segments;
}
}
async function endpoint(req: Request, res: Response): Promise<Response> {


@@ -11,16 +11,19 @@ export async function getVideoLabelsByHash(req: Request, res: Response): Promise
}
hashPrefix = hashPrefix.toLowerCase() as VideoIDHash;
const checkHasStartSegment = req.query.hasStartSegment === "true";
const service: Service = getService(req.query.service, req.body.service);
// Get all video id's that match hash prefix
const segments = await getLabelsByHash(hashPrefix, service);
const segments = await getLabelsByHash(hashPrefix, service, checkHasStartSegment);
if (!segments) return res.status(404).json([]);
const output = Object.entries(segments).map(([videoID, data]) => ({
videoID,
segments: data.segments,
hasStartSegment: data.hasStartSegment
}));
return res.status(output.length === 0 ? 404 : 200).json(output);
}
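The inclusion rule in the hash endpoint above (return a video when it has labels, or when the caller asked about start segments and one exists) can be sketched in isolation. The shape mirrors the diff; the helper itself is hypothetical:

```typescript
// Mirrors the check `data.segments.length > 0 || (data.hasStartSegment &&
// checkHasStartSegment)` from getLabelsByHash; names are illustrative.
interface LabelVideoData {
    segments: { category: string }[];
    hasStartSegment: boolean;
}

function shouldIncludeVideo(data: LabelVideoData, checkHasStartSegment: boolean): boolean {
    return data.segments.length > 0
        || (checkHasStartSegment && data.hasStartSegment);
}
```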


@@ -2,7 +2,7 @@ import { Request, Response } from "express";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
import { BrandingSubmission, BrandingUUID, TimeThumbnailSubmission } from "../types/branding.model";
import { BrandingSubmission, BrandingUUID, TimeThumbnailSubmission, TitleSubmission } from "../types/branding.model";
import { HashedIP, IPAddress, VideoID } from "../types/segments.model";
import { Feature, HashedUserID } from "../types/user.model";
import { getHashCache } from "../utils/getHashCache";
@@ -18,12 +18,20 @@ import { checkBanStatus } from "../utils/checkBan";
import axios from "axios";
import { getMaxResThumbnail } from "../utils/youtubeApi";
import { getVideoDetails } from "../utils/getVideoDetails";
import { canSubmitDeArrow } from "../utils/permissions";
import { parseUserAgent } from "../utils/userAgent";
import { isRequestInvalid } from "../utils/requestValidator";
enum BrandingType {
Title,
Thumbnail
}
enum BrandingVoteType {
Upvote = 1,
Downvote = 2
}
interface ExistingVote {
UUID: BrandingUUID;
type: number;
@@ -31,8 +39,9 @@ interface ExistingVote {
}
export async function postBranding(req: Request, res: Response) {
const { videoID, userID, title, thumbnail } = req.body as BrandingSubmission;
const { videoID, userID, title, thumbnail, autoLock, downvote, videoDuration, wasWarned, casualMode } = req.body as BrandingSubmission;
const service = getService(req.body.service);
const userAgent = req.body.userAgent ?? parseUserAgent(req.get("user-agent")) ?? "";
if (!videoID || !userID || userID.length < 30 || !service
|| ((!title || !title.title)
@@ -45,10 +54,53 @@ export async function postBranding(req: Request, res: Response) {
try {
const hashedUserID = await getHashCache(userID);
const isVip = await isUserVIP(hashedUserID);
const shouldLock = isVip && autoLock !== false;
const hashedVideoID = await getHashCache(videoID, 1);
const hashedIP = await getHashCache(getIP(req) + config.globalSalt as IPAddress);
const isBanned = await checkBanStatus(hashedUserID, hashedIP);
const matchedRule = isRequestInvalid({
userAgent,
userAgentHeader: req.headers["user-agent"],
videoDuration,
videoID,
userID,
service,
dearrow: {
title,
thumbnail,
downvote,
},
endpoint: "dearrow-postBranding",
});
if (matchedRule !== null) {
sendNewUserWebhook(config.discordRejectedNewUserWebhookURL, hashedUserID, videoID, userAgent, req, videoDuration, title, `Caught by rule: ${matchedRule}`);
Logger.warn(`Dearrow submission rejected by request validator: ${hashedUserID} ${videoID} ${videoDuration} ${userAgent} ${req.headers["user-agent"]} ${title.title} ${thumbnail.timestamp}`);
res.status(200).send("OK");
return;
}
// treat banned users as existing users who "can submit" for the purposes of these checks
// this avoids their titles being logged and keeps them from taking up "new user" slots with every submission
const permission = isBanned ? {
canSubmit: true,
newUser: false,
reason: "",
} : await canSubmitDeArrow(hashedUserID);
if (!permission.canSubmit) {
Logger.warn(`New user trying to submit dearrow: ${hashedUserID} ${videoID} ${videoDuration} ${Object.keys(req.body)} ${userAgent} ${title?.title} ${req.headers["user-agent"]}`);
res.status(403).send(permission.reason);
return;
} else if (permission.newUser) {
sendNewUserWebhook(config.discordNewUserWebhookURL, hashedUserID, videoID, userAgent, req, videoDuration, title, undefined);
}
if (videoDuration && thumbnail && await checkForWrongVideoDuration(videoID, videoDuration)) {
res.status(403).send("YouTube is currently testing a new anti-adblock technique called server-side ad-injection. This causes skips and submissions to be offset by the duration of the ad. It seems that you are affected by this A/B test, so until a fix is developed, we cannot accept submissions from your device due to them potentially being inaccurate.");
return;
}
const lock = await acquireLock(`postBranding:${videoID}.${hashedUserID}`);
if (!lock.status) {
res.status(429).send("Vote already in progress");
@@ -56,7 +108,7 @@ export async function postBranding(req: Request, res: Response) {
}
const now = Date.now();
const voteType = 1;
const voteType: BrandingVoteType = downvote ? BrandingVoteType.Downvote : BrandingVoteType.Upvote;
if (title && !isVip && title.title.length > config.maxTitleLength) {
lock.unlock();
@@ -64,73 +116,100 @@ export async function postBranding(req: Request, res: Response) {
return;
}
let errorCode = 0;
await Promise.all([(async () => {
if (title) {
// ignore original submissions from banned users - hiding those would cause issues
if (title.original && isBanned) return;
const existingUUID = (await db.prepare("get", `SELECT "UUID" from "titles" where "videoID" = ? AND "title" = ?`, [videoID, title.title]))?.UUID;
const existingIsLocked = !!existingUUID && (await db.prepare("get", `SELECT "locked" from "titleVotes" where "UUID" = ?`, [existingUUID]))?.locked;
if (existingUUID != undefined && isBanned) return; // ignore votes on existing details from banned users
if (downvote && existingIsLocked && !isVip) {
if (!isBanned) sendWebhooks(videoID, existingUUID, voteType, wasWarned, shouldLock).catch((e) => Logger.error(e));
errorCode = 403;
return;
}
const UUID = existingUUID || crypto.randomUUID();
const existingVote = await handleExistingVotes(BrandingType.Title, videoID, hashedUserID, UUID, hashedIP, voteType);
await handleExistingVotes(BrandingType.Title, videoID, hashedUserID, UUID, hashedIP, voteType);
if (existingUUID) {
await updateVoteTotals(BrandingType.Title, existingVote, UUID, isVip);
await updateVoteTotals(BrandingType.Title, UUID, hashedUserID, shouldLock, !!downvote);
} else {
await db.prepare("run", `INSERT INTO "titles" ("videoID", "title", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID") VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, title.title, title.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID]);
if (downvote) {
throw new Error("Title submission doesn't exist");
}
await db.prepare("run", `INSERT INTO "titles" ("videoID", "title", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID", "casualMode", "userAgent") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, title.title, title.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID, casualMode ? 1 : 0, userAgent]);
const verificationValue = await getVerificationValue(hashedUserID, isVip);
await db.prepare("run", `INSERT INTO "titleVotes" ("UUID", "votes", "locked", "shadowHidden", "verification") VALUES (?, 0, ?, ?, ?);`,
[UUID, isVip ? 1 : 0, isBanned ? 1 : 0, verificationValue]);
[UUID, shouldLock ? 1 : 0, isBanned ? 1 : 0, verificationValue]);
await verifyOldSubmissions(hashedUserID, verificationValue);
}
if (isVip) {
if (isVip && !downvote && shouldLock) {
// unlock all other titles
await db.prepare("run", `UPDATE "titleVotes" as tv SET "locked" = 0 FROM "titles" t WHERE tv."UUID" = t."UUID" AND tv."UUID" != ? AND t."videoID" = ?`, [UUID, videoID]);
}
sendWebhooks(videoID, UUID).catch((e) => Logger.error(e));
if (!isBanned) sendWebhooks(videoID, UUID, voteType, wasWarned, shouldLock).catch((e) => Logger.error(e));
}
})(), (async () => {
if (thumbnail) {
// ignore original submissions from banned users - hiding those would cause issues
if (thumbnail.original && isBanned) return;
if (thumbnail.original && (isBanned || !await canSubmitOriginal(hashedUserID, isVip))) return;
const existingUUID = thumbnail.original
? (await db.prepare("get", `SELECT "UUID" from "thumbnails" where "videoID" = ? AND "original" = 1`, [videoID]))?.UUID
: (await db.prepare("get", `SELECT "thumbnails"."UUID" from "thumbnailTimestamps" JOIN "thumbnails" ON "thumbnails"."UUID" = "thumbnailTimestamps"."UUID"
WHERE "thumbnailTimestamps"."timestamp" = ? AND "thumbnails"."videoID" = ?`, [(thumbnail as TimeThumbnailSubmission).timestamp, videoID]))?.UUID;
const existingIsLocked = !!existingUUID && (await db.prepare("get", `SELECT "locked" from "thumbnailVotes" where "UUID" = ?`, [existingUUID]))?.locked;
if (existingUUID != undefined && isBanned) return; // ignore votes on existing details from banned users
if (downvote && existingIsLocked && !isVip) {
errorCode = 403;
return;
}
const UUID = existingUUID || crypto.randomUUID();
const existingVote = await handleExistingVotes(BrandingType.Thumbnail, videoID, hashedUserID, UUID, hashedIP, voteType);
await handleExistingVotes(BrandingType.Thumbnail, videoID, hashedUserID, UUID, hashedIP, voteType);
if (existingUUID) {
await updateVoteTotals(BrandingType.Thumbnail, existingVote, UUID, isVip);
await updateVoteTotals(BrandingType.Thumbnail, UUID, hashedUserID, shouldLock, !!downvote);
} else {
await db.prepare("run", `INSERT INTO "thumbnails" ("videoID", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID") VALUES (?, ?, ?, ?, ?, ?, ?)`,
[videoID, thumbnail.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID]);
if (downvote) {
throw new Error("Thumbnail submission doesn't exist");
}
await db.prepare("run", `INSERT INTO "thumbnails" ("videoID", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID", "casualMode", "userAgent") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, thumbnail.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID, casualMode ? 1 : 0, userAgent]);
await db.prepare("run", `INSERT INTO "thumbnailVotes" ("UUID", "votes", "locked", "shadowHidden") VALUES (?, 0, ?, ?)`,
[UUID, isVip ? 1 : 0, isBanned ? 1 : 0]);
[UUID, shouldLock ? 1 : 0, isBanned ? 1 : 0]);
if (!thumbnail.original) {
await db.prepare("run", `INSERT INTO "thumbnailTimestamps" ("UUID", "timestamp") VALUES (?, ?)`,
[UUID, (thumbnail as TimeThumbnailSubmission).timestamp]);
}
}
if (isVip) {
// unlock all other thumbnails
await db.prepare("run", `UPDATE "thumbnailVotes" as tv SET "locked" = 0 FROM "thumbnails" t WHERE tv."UUID" = t."UUID" AND tv."UUID" != ? AND t."videoID" = ?`, [UUID, videoID]);
}
if (isVip && !downvote && shouldLock) {
// unlock all other thumbnails
await db.prepare("run", `UPDATE "thumbnailVotes" as tv SET "locked" = 0 FROM "thumbnails" t WHERE tv."UUID" = t."UUID" AND tv."UUID" != ? AND t."videoID" = ?`, [UUID, videoID]);
}
}
})()]);
QueryCacher.clearBrandingCache({ videoID, hashedVideoID, service });
res.status(200).send("OK");
if (errorCode) {
res.status(errorCode).send();
} else {
res.status(200).send("OK");
}
lock.unlock();
} catch (e) {
Logger.error(e as string);
@@ -138,51 +217,120 @@ export async function postBranding(req: Request, res: Response) {
}
}
function sendNewUserWebhook(webhookUrl: string, hashedUserID: HashedUserID, videoID: VideoID, userAgent: any, req: Request, videoDuration: number, title: TitleSubmission, footerText: string | undefined) {
if (!webhookUrl) return;
axios.post(webhookUrl, {
"embeds": [{
"title": hashedUserID,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**User Agent**: ${userAgent}\
\n**Sent User Agent**: ${req.body.userAgent}\
\n**Real User Agent**: ${req.headers["user-agent"]}\
\n**Video Duration**: ${videoDuration}\
\n**Title**: ${title?.title}`,
"color": 1184701,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
"footer": footerText === undefined ? null : {
"text": footerText,
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
/**
* Finds an existing vote; if one exists and points at a different submission, undoes it and repoints it to the new submission.
* If no existing vote, it adds one.
*/
async function handleExistingVotes(type: BrandingType, videoID: VideoID,
hashedUserID: HashedUserID, UUID: BrandingUUID, hashedIP: HashedIP, voteType: number): Promise<ExistingVote> {
hashedUserID: HashedUserID, UUID: BrandingUUID, hashedIP: HashedIP, voteType: BrandingVoteType) {
const table = type === BrandingType.Title ? `"titleVotes"` : `"thumbnailVotes"`;
const idsDealtWith: BrandingUUID[] = [];
const existingVote = await privateDB.prepare("get", `SELECT "id", "UUID", "type" from ${table} where "videoID" = ? AND "userID" = ?`, [videoID, hashedUserID]);
if (existingVote && existingVote.UUID !== UUID) {
if (existingVote.type === 1) {
await db.prepare("run", `UPDATE ${table} SET "votes" = "votes" - 1 WHERE "UUID" = ?`, [existingVote.UUID]);
// Either votes of the same type, or on the same submission (undo a downvote)
const existingVotes = await privateDB.prepare("all", `SELECT "id", "UUID", "type" from ${table} where "videoID" = ? AND "userID" = ? AND ("type" = ? OR "UUID" = ?)`, [videoID, hashedUserID, voteType, UUID]) as ExistingVote[];
if (existingVotes.length > 0) {
// Only one upvote per video
for (const existingVote of existingVotes) {
// For downvotes, only undo for this specific submission (multiple downvotes on one submission not allowed)
if (voteType === BrandingVoteType.Downvote && existingVote.UUID !== UUID) continue;
switch (existingVote.type) {
case BrandingVoteType.Upvote:
// Old case where there are duplicate rows in private db
if (!idsDealtWith.includes(existingVote.UUID)) {
idsDealtWith.push(existingVote.UUID);
await db.prepare("run", `UPDATE ${table} SET "votes" = "votes" - 1 WHERE "UUID" = ?`, [existingVote.UUID]);
}
await privateDB.prepare("run", `DELETE FROM ${table} WHERE "id" = ?`, [existingVote.id]);
break;
case BrandingVoteType.Downvote: {
await db.prepare("run", `UPDATE ${table} SET "downvotes" = "downvotes" - 1 WHERE "UUID" = ?`, [existingVote.UUID]);
await privateDB.prepare("run", `DELETE FROM ${table} WHERE "id" = ?`, [existingVote.id]);
break;
}
}
}
await privateDB.prepare("run", `UPDATE ${table} SET "type" = ?, "UUID" = ? WHERE "id" = ?`, [voteType, UUID, existingVote.id]);
} else if (!existingVote) {
await privateDB.prepare("run", `INSERT INTO ${table} ("videoID", "UUID", "userID", "hashedIP", "type") VALUES (?, ?, ?, ?, ?)`,
[videoID, UUID, hashedUserID, hashedIP, voteType]);
}
return existingVote;
await privateDB.prepare("run", `INSERT INTO ${table} ("videoID", "UUID", "userID", "hashedIP", "type") VALUES (?, ?, ?, ?, ?)`,
[videoID, UUID, hashedUserID, hashedIP, voteType]);
}
/**
* Only called if an existing vote exists.
* Will update public vote totals and locked status.
*/
async function updateVoteTotals(type: BrandingType, existingVote: ExistingVote, UUID: BrandingUUID, isVip: boolean): Promise<void> {
async function updateVoteTotals(type: BrandingType, UUID: BrandingUUID, userID: HashedUserID, shouldLock: boolean, downvote: boolean): Promise<void> {
const table = type === BrandingType.Title ? `"titleVotes"` : `"thumbnailVotes"`;
const table2 = type === BrandingType.Title ? `"titles"` : `"thumbnails"`;
// Don't upvote if we vote on the same submission
if (!existingVote || existingVote.UUID !== UUID) {
if (downvote) {
// Only downvote if it is not their submission
const isUsersSubmission = (await db.prepare("get", `SELECT "userID" FROM ${table2} WHERE "UUID" = ?`, [UUID]))?.userID === userID;
if (!isUsersSubmission) {
await db.prepare("run", `UPDATE ${table} SET "downvotes" = "downvotes" + 1 WHERE "UUID" = ?`, [UUID]);
}
} else {
await db.prepare("run", `UPDATE ${table} SET "votes" = "votes" + 1 WHERE "UUID" = ?`, [UUID]);
if (type === BrandingType.Title) {
const votedSubmitterUserID = (await db.prepare("get", `SELECT "userID" FROM ${table2} WHERE "UUID" = ?`, [UUID]))?.userID;
if (votedSubmitterUserID) {
await verifyOldSubmissions(votedSubmitterUserID, await getVerificationValue(votedSubmitterUserID, await isUserVIP(votedSubmitterUserID)));
}
}
}
if (isVip) {
await db.prepare("run", `UPDATE ${table} SET "locked" = 1 WHERE "UUID" = ?`, [UUID]);
if (shouldLock) {
if (downvote) {
await db.prepare("run", `UPDATE ${table} SET "removed" = 1 WHERE "UUID" = ?`, [UUID]);
} else {
await db.prepare("run", `UPDATE ${table} SET "locked" = 1, "removed" = 0 WHERE "UUID" = ?`, [UUID]);
}
}
}
export async function getVerificationValue(hashedUserID: HashedUserID, isVip: boolean): Promise<number> {
const voteSum = await db.prepare("get", `SELECT SUM("maxVotes") as "voteSum" FROM (SELECT MAX("votes") as "maxVotes" from "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "titles"."userID" = ? GROUP BY "titles"."videoID") t`, [hashedUserID]);
const sbSubmissions = () => db.prepare("get", `SELECT COUNT(*) as count FROM "sponsorTimes" WHERE "userID" = ? AND "votes" > 0 LIMIT 3`, [hashedUserID]);
if (voteSum.voteSum >= 1 || isVip || (await sbSubmissions()).count > 2 || await hasFeature(hashedUserID, Feature.DeArrowTitleSubmitter)) {
if (voteSum.voteSum >= 1 || isVip || await hasFeature(hashedUserID, Feature.DeArrowTitleSubmitter)) {
return 0;
} else {
return -1;
@@ -207,14 +355,70 @@ export async function verifyOldSubmissions(hashedUserID: HashedUserID, verificat
}
}
async function sendWebhooks(videoID: VideoID, UUID: BrandingUUID) {
const lockedSubmission = await db.prepare("get", `SELECT "titleVotes"."votes", "titles"."title", "titles"."userID" FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "titles"."videoID" = ? AND "titles"."UUID" != ? AND "titleVotes"."locked" = 1`, [videoID, UUID]);
async function canSubmitOriginal(hashedUserID: HashedUserID, isVip: boolean): Promise<boolean> {
const upvotedThumbs = (await db.prepare("get", `SELECT count(*) as "upvotedThumbs" FROM "thumbnails" JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" WHERE "thumbnailVotes"."votes" > 0 AND "thumbnails"."original" = 0 AND "thumbnails"."userID" = ?`, [hashedUserID])).upvotedThumbs;
const customThumbs = (await db.prepare("get", `SELECT count(*) as "customThumbs" FROM "thumbnails" JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" WHERE "thumbnailVotes"."votes" >= 0 AND "thumbnails"."original" = 0 AND "thumbnails"."userID" = ?`, [hashedUserID])).customThumbs;
const originalThumbs = (await db.prepare("get", `SELECT count(*) as "originalThumbs" FROM "thumbnails" JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" WHERE "thumbnailVotes"."votes" >= 0 AND "thumbnails"."original" = 1 AND "thumbnails"."userID" = ?`, [hashedUserID])).originalThumbs;
if (lockedSubmission) {
const currentSubmission = await db.prepare("get", `SELECT "titleVotes"."votes", "titles"."title" FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "titles"."UUID" = ?`, [UUID]);
return isVip || (upvotedThumbs > 1 && customThumbs > 1 && originalThumbs / customThumbs < 0.4);
}
async function sendWebhooks(videoID: VideoID, UUID: BrandingUUID, voteType: BrandingVoteType, wasWarned: boolean, vipAction: boolean) {
const currentSubmission = await db.prepare(
"get",
`SELECT
"titles"."title",
"titleVotes"."locked",
"titles"."userID",
"titleVotes"."votes"-"titleVotes"."downvotes"+"titleVotes"."verification" AS "score"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."UUID" = ?`,
[UUID]);
if (wasWarned && voteType === BrandingVoteType.Upvote) {
const data = await getVideoDetails(videoID);
axios.post(config.discordDeArrowWarnedWebhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**Submitted title:** ${currentSubmission.title}\
\n\n**Submitted by:** ${currentSubmission.userID}`,
"color": 10813440,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
// Unlocked title getting more upvotes than the locked one
if (voteType === BrandingVoteType.Upvote) {
const lockedSubmission = await db.prepare(
"get",
`SELECT
"titles"."title",
"titles"."userID",
"titleVotes"."votes"-"titleVotes"."downvotes"+"titleVotes"."verification" AS "score"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."videoID" = ?
AND "titles"."UUID" != ?
AND "titleVotes"."locked" = 1`,
[videoID, UUID]);
// Time to warn that there may be an issue
if (currentSubmission.votes - lockedSubmission.votes > 2) {
if (lockedSubmission && currentSubmission.score - lockedSubmission.score > 2) {
const usernameRow = await db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ?`, [lockedSubmission.userID]);
const data = await getVideoDetails(videoID);
@@ -222,7 +426,7 @@ async function sendWebhooks(videoID: VideoID, UUID: BrandingUUID) {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**${lockedSubmission.votes}** Votes vs **${currentSubmission.votes}**\
"description": `**${lockedSubmission.score}** score vs **${currentSubmission.score}**\
\n\n**Locked title:** ${lockedSubmission.title}\
\n**New title:** ${currentSubmission.title}\
\n\n**Submitted by:** ${usernameRow?.userName ?? ""}\n${lockedSubmission.userID}`,
@@ -246,4 +450,43 @@ async function sendWebhooks(videoID: VideoID, UUID: BrandingUUID) {
});
}
}
}
// Downvotes on locked title
if (voteType === BrandingVoteType.Downvote && currentSubmission.locked === 1) {
const usernameRow = await db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ?`, [currentSubmission.userID]);
const data = await getVideoDetails(videoID);
axios.post(config.discordDeArrowLockedWebhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `Locked title ${vipAction ? "was removed by a VIP" : `with **${currentSubmission.score}** score received a downvote`}\
\n\n**Locked title:** ${currentSubmission.title}\
\n**Submitted by:** ${usernameRow?.userName ?? ""}\n${currentSubmission.userID}`,
"color": 10813440,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
}
async function checkForWrongVideoDuration(videoID: VideoID, duration: number): Promise<boolean> {
const apiVideoDetails = await getVideoDetails(videoID, true);
const apiDuration = apiVideoDetails?.duration;
return apiDuration && apiDuration > 2 && duration && duration > 2 && Math.abs(apiDuration - duration) > 3;
}
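The webhook comparisons above rank titles by `votes - downvotes + verification` (the SQL alias `score`) and warn when an unlocked title pulls more than 2 points ahead of a locked one. A minimal sketch of that arithmetic, with hypothetical helper names:

```typescript
// Score formula from the webhook queries: votes minus downvotes plus
// verification. shouldWarnOvertake mirrors the
// `currentSubmission.score - lockedSubmission.score > 2` check.
interface TitleVoteRow {
    votes: number;
    downvotes: number;
    verification: number;
}

function titleScore(row: TitleVoteRow): number {
    return row.votes - row.downvotes + row.verification;
}

function shouldWarnOvertake(current: TitleVoteRow, locked: TitleVoteRow): boolean {
    return titleScore(current) - titleScore(locked) > 2;
}
```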

src/routes/postCasual.ts Normal file

@@ -0,0 +1,153 @@
import { Request, Response } from "express";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
import { BrandingUUID, CasualCategory, CasualVoteSubmission } from "../types/branding.model";
import { HashedIP, IPAddress, Service, VideoID } from "../types/segments.model";
import { HashedUserID } from "../types/user.model";
import { getHashCache } from "../utils/getHashCache";
import { getIP } from "../utils/getIP";
import { getService } from "../utils/getService";
import { Logger } from "../utils/logger";
import crypto from "crypto";
import { QueryCacher } from "../utils/queryCacher";
import { acquireLock } from "../utils/redisLock";
import { checkBanStatus } from "../utils/checkBan";
import { canSubmitDeArrow } from "../utils/permissions";
import { isRequestInvalid } from "../utils/requestValidator";
import { parseUserAgent } from "../utils/userAgent";
interface ExistingVote {
UUID: BrandingUUID;
type: number;
}
export async function postCasual(req: Request, res: Response) {
const { videoID, userID, downvote } = req.body as CasualVoteSubmission;
const userAgent = req.body.userAgent ?? parseUserAgent(req.get("user-agent")) ?? "";
let categories = req.body.categories as CasualCategory[];
const title = (req.body.title as string)?.toLowerCase();
const service = getService(req.body.service);
if (downvote) {
categories = ["downvote" as CasualCategory];
} else if (!categories.every((c) => config.casualCategoryList.includes(c))) {
return res.status(400).send("Invalid category");
}
if (!videoID || !userID || userID.length < 30 || !service || !categories || !Array.isArray(categories)) {
return res.status(400).send("Bad Request");
}
if (isRequestInvalid({
userID,
videoID,
userAgent,
userAgentHeader: req.headers["user-agent"],
casualCategories: categories,
service,
endpoint: "dearrow-postCasual",
})) {
Logger.warn(`Casual vote rejected by request validator: ${userAgent} ${req.headers["user-agent"]} ${categories} ${service} ${videoID}`);
return res.status(200).send("OK");
}
try {
const hashedUserID = await getHashCache(userID);
const hashedVideoID = await getHashCache(videoID, 1);
const hashedIP = await getHashCache(getIP(req) + config.globalSalt as IPAddress);
const isBanned = await checkBanStatus(hashedUserID, hashedIP);
const permission = await canSubmitDeArrow(hashedUserID);
if (!permission.canSubmit) {
res.status(403).send(permission.reason);
return;
}
const lock = await acquireLock(`postCasual:${videoID}.${hashedUserID}`);
if (!lock.status) {
res.status(429).send("Vote already in progress");
return;
}
if (isBanned) {
return res.status(200).send("OK");
}
let titleID = 0;
if (title) {
// See if title needs to be added
const titles = await db.prepare("all", `SELECT "title", "id" from "casualVoteTitles" WHERE "videoID" = ? AND "service" = ? ORDER BY "id"`, [videoID, service]) as { title: string, id: number }[];
if (titles.length > 0) {
const existingTitle = titles.find((t) => t.title === title);
if (existingTitle) {
titleID = existingTitle.id;
} else {
titleID = titles[titles.length - 1].id + 1;
await db.prepare("run", `INSERT INTO "casualVoteTitles" ("videoID", "service", "hashedVideoID", "id", "title") VALUES (?, ?, ?, ?, ?)`, [videoID, service, hashedVideoID, titleID, title]);
}
} else {
await db.prepare("run", `INSERT INTO "casualVoteTitles" ("videoID", "service", "hashedVideoID", "id", "title") VALUES (?, ?, ?, ?, ?)`, [videoID, service, hashedVideoID, titleID, title]);
}
} else {
const titles = await db.prepare("all", `SELECT "title", "id" from "casualVoteTitles" WHERE "videoID" = ? AND "service" = ? ORDER BY "id"`, [videoID, service]) as { title: string, id: number }[];
if (titles.length > 0) {
titleID = titles[titles.length - 1].id;
}
}
const now = Date.now();
for (const category of categories) {
const existingUUID = (await db.prepare("get", `SELECT "UUID" from "casualVotes" where "videoID" = ? AND "service" = ? AND "titleID" = ? AND "category" = ?`, [videoID, service, titleID, category]))?.UUID;
const UUID = existingUUID || crypto.randomUUID();
const alreadyVotedTheSame = await handleExistingVotes(videoID, service, titleID, hashedUserID, hashedIP, category, downvote, now);
if (existingUUID) {
if (!alreadyVotedTheSame) {
await db.prepare("run", `UPDATE "casualVotes" SET "upvotes" = "upvotes" + 1 WHERE "UUID" = ?`, [UUID]);
}
} else {
await db.prepare("run", `INSERT INTO "casualVotes" ("videoID", "service", "titleID", "hashedVideoID", "timeSubmitted", "UUID", "category", "upvotes") VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, service, titleID, hashedVideoID, now, UUID, category, 1]);
}
}
QueryCacher.clearBrandingCache({ videoID, hashedVideoID, service });
res.status(200).send("OK");
lock.unlock();
} catch (e) {
Logger.error(e as string);
res.status(500).send("Internal Server Error");
}
}
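The title bookkeeping above assigns sequential per-video IDs: an exact title match reuses its existing ID, otherwise the new title is appended with the last ID plus one (or 0 when it is the first title for the video). A minimal in-memory sketch of that allocation rule (the `allocateTitleID` helper is illustrative, not part of the codebase):

```typescript
interface TitleRow { title: string; id: number }

// Mirrors the "casualVoteTitles" logic: rows arrive ordered by id.
// Returns the id to use and whether a new row must be inserted.
function allocateTitleID(rows: TitleRow[], title: string): { id: number; insert: boolean } {
    const existing = rows.find((r) => r.title === title);
    if (existing) return { id: existing.id, insert: false };
    // First title for this video gets id 0, later ones get lastID + 1
    const id = rows.length > 0 ? rows[rows.length - 1].id + 1 : 0;
    return { id, insert: true };
}
```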
async function handleExistingVotes(videoID: VideoID, service: Service, titleID: number,
hashedUserID: HashedUserID, hashedIP: HashedIP, category: CasualCategory, downvote: boolean, now: number): Promise<boolean> {
const existingVote = await privateDB.prepare("get", `SELECT "UUID" from "casualVotes" WHERE "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ? AND "category" = ?`, [videoID, service, titleID, hashedUserID, category]) as ExistingVote;
if (existingVote) {
return true;
} else {
if (downvote) {
// Remove upvotes for all categories on this video
const existingUpvotes = await privateDB.prepare("all", `SELECT "category" from "casualVotes" WHERE "category" != 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ?`, [videoID, service, titleID, hashedUserID]);
for (const existingUpvote of existingUpvotes) {
await db.prepare("run", `UPDATE "casualVotes" SET "upvotes" = "upvotes" - 1 WHERE "videoID" = ? AND "service" = ? AND "titleID" = ? AND "category" = ?`, [videoID, service, titleID, existingUpvote.category]);
await privateDB.prepare("run", `DELETE FROM "casualVotes" WHERE "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ? AND "category" = ?`, [videoID, service, titleID, hashedUserID, existingUpvote.category]);
}
} else {
// Undo a downvote if it exists
const existingDownvote = await privateDB.prepare("get", `SELECT "UUID" from "casualVotes" WHERE "category" = 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ?`, [videoID, service, titleID, hashedUserID]) as ExistingVote;
if (existingDownvote) {
await db.prepare("run", `UPDATE "casualVotes" SET "upvotes" = "upvotes" - 1 WHERE "category" = 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ?`, [videoID, service, titleID]);
await privateDB.prepare("run", `DELETE FROM "casualVotes" WHERE "category" = 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ?`, [videoID, service, titleID, hashedUserID]);
}
}
}
await privateDB.prepare("run", `INSERT INTO "casualVotes" ("videoID", "service", "titleID", "userID", "hashedIP", "category", "timeSubmitted") VALUES (?, ?, ?, ?, ?, ?, ?)`,
[videoID, service, titleID, hashedUserID, hashedIP, category, now]);
return false;
}


@@ -23,7 +23,7 @@ export async function postClearCache(req: Request, res: Response): Promise<Respo
if (invalidFields.length !== 0) {
// invalid request
const fields = invalidFields.reduce((p, c, i) => p + (i !== 0 ? ", " : "") + c, "");
const fields = invalidFields.join(", ");
return res.status(400).send(`No valid ${fields} field(s) provided`);
}


@@ -20,11 +20,12 @@ import { parseUserAgent } from "../utils/userAgent";
import { getService } from "../utils/getService";
import axios from "axios";
import { vote } from "./voteOnSponsorTime";
import { canSubmit } from "../utils/permissions";
import { canSubmit, canSubmitGlobal } from "../utils/permissions";
import { getVideoDetails, videoDetails } from "../utils/getVideoDetails";
import * as youtubeID from "../utils/youtubeID";
import { acquireLock } from "../utils/redisLock";
import { checkBanStatus } from "../utils/checkBan";
import { isRequestInvalid } from "../utils/requestValidator";
type CheckResult = {
pass: boolean,
@@ -129,14 +130,19 @@ async function autoModerateSubmission(apiVideoDetails: videoDetails,
// return false on undefined or 0
if (!duration) return false;
if (apiDuration && apiDuration > 2 && duration && duration > 2 && Math.abs(apiDuration - duration) > 3) {
// YouTube server-side ad injection might be active, reject
return "YouTube is currently testing a new anti-adblock technique called server-side ad-injection. This causes skips and submissions to be offset by the duration of the ad. It seems that you are affected by this A/B test, so until a fix is developed, we cannot accept submissions from your device due to them potentially being inaccurate.";
}
const segments = submission.segments;
// map all times to float array
const allSegmentTimes = segments.filter((s) => s.actionType !== ActionType.Chapter)
.map(segment => [parseFloat(segment.segment[0]), parseFloat(segment.segment[1])]);
// add previous submissions by this user
const allSubmittedByUser = await db.prepare("all", `SELECT "startTime", "endTime" FROM "sponsorTimes" WHERE "userID" = ? AND "videoID" = ? AND "votes" > -1 AND "actionType" != 'chapter' AND "hidden" = 0`
, [submission.userID, submission.videoID]) as { startTime: string, endTime: string }[];
const allSubmittedByUser = await db.prepare("all", `SELECT "startTime", "endTime" FROM "sponsorTimes" WHERE "userID" = ? AND "videoID" = ? AND "service" = ? AND "votes" > -1 AND "actionType" != 'chapter' AND "hidden" = 0`
, [submission.userID, submission.videoID, submission.service]) as { startTime: string, endTime: string }[];
if (allSubmittedByUser) {
//add segments the user has previously submitted
@@ -158,20 +164,15 @@ async function autoModerateSubmission(apiVideoDetails: videoDetails,
}
async function checkUserActiveWarning(userID: HashedUserID): Promise<CheckResult> {
const MILLISECONDS_IN_HOUR = 3600000;
const now = Date.now();
const warnings = (await db.prepare("all",
const warning = await db.prepare("get",
`SELECT "reason"
FROM warnings
WHERE "userID" = ? AND "issueTime" > ? AND enabled = 1 AND type = 0
WHERE "userID" = ? AND enabled = 1 AND type = 0
ORDER BY "issueTime" DESC`,
[
userID,
Math.floor(now - (config.hoursAfterWarningExpires * MILLISECONDS_IN_HOUR))
],
) as {reason: string}[]).sort((a, b) => (b?.reason?.length ?? 0) - (a?.reason?.length ?? 0));
[userID],
) as {reason: string};
if (warnings?.length >= config.maxNumberOfActiveWarnings) {
if (warning != null) {
const defaultMessage = "Submission rejected due to a tip from a moderator. This means that we noticed you were making some common mistakes"
+ " that are not malicious, and we just want to clarify the rules. "
+ "Could you please send a message in discord.gg/SponsorBlock or matrix.to/#/#sponsor:ajay.app so we can further help you? "
@@ -179,7 +180,7 @@ async function checkUserActiveWarning(userID: HashedUserID): Promise<CheckResult
return {
pass: false,
errorMessage: defaultMessage + (warnings[0]?.reason?.length > 0 ? `\n\nTip message: '${warnings[0].reason}'` : ""),
errorMessage: defaultMessage + (warning.reason?.length > 0 ? `\n\nTip message: '${warning.reason}'` : ""),
errorCode: 403
};
}
@@ -194,12 +195,17 @@ async function checkInvalidFields(videoID: VideoID, userID: UserID, hashedUserID
if (typeof videoID !== "string" || videoID?.length == 0) {
invalidFields.push("videoID");
}
if (service === Service.YouTube && config.mode !== "test") {
const sanitizedVideoID = youtubeID.validate(videoID) ? videoID : youtubeID.sanitize(videoID);
if (!youtubeID.validate(sanitizedVideoID)) {
invalidFields.push("videoID");
errors.push("YouTube videoID could not be extracted");
if (service === Service.YouTube) {
if (config.mode !== "test") {
const sanitizedVideoID = youtubeID.validate(videoID) ? videoID : youtubeID.sanitize(videoID);
if (!youtubeID.validate(sanitizedVideoID)) {
invalidFields.push("videoID");
errors.push("YouTube videoID could not be extracted");
}
}
} else if (service !== Service.Spotify) {
invalidFields.push("service");
errors.push("Service is not supported");
}
const minLength = config.minUserIDLength;
if (typeof userID !== "string" || userID?.length < minLength) {
@@ -237,11 +243,11 @@ async function checkInvalidFields(videoID: VideoID, userID: UserID, hashedUserID
if (invalidFields.length !== 0) {
// invalid request
const formattedFields = invalidFields.reduce((p, c, i) => p + (i !== 0 ? ", " : "") + c, "");
const formattedErrors = errors.reduce((p, c, i) => p + (i !== 0 ? ". " : " ") + c, "");
const formattedFields = invalidFields.join(", ");
const formattedErrors = errors.join(". ");
return {
pass: false,
errorMessage: `No valid ${formattedFields}.${formattedErrors}`,
errorMessage: `No valid ${formattedFields}. ${formattedErrors}`,
errorCode: 400
};
}
@@ -279,7 +285,7 @@ async function checkEachSegmentValid(rawIP: IPAddress, paramUserID: UserID, user
errorMessage:
`Users have voted that all the segments required for this video have already been submitted for the following category: ` +
`'${segments[i].category}'\n` +
`${lockedCategoryList[lockIndex].reason?.length !== 0 ? `\nReason: '${lockedCategoryList[lockIndex].reason}\n'` : ""}` +
`${lockedCategoryList[lockIndex].reason?.length !== 0 ? `\nReason: '${lockedCategoryList[lockIndex].reason}'\n` : ""}` +
`You may need to refresh if you don't see the segments.\n` +
`${(segments[i].category === "sponsor" ? "\nMaybe the segment you are submitting is a different category that you have not enabled and is not a sponsor. " +
"Categories that aren't sponsor, such as self-promotion can be enabled in the options.\n" : "")}` +
@@ -324,23 +330,26 @@ async function checkEachSegmentValid(rawIP: IPAddress, paramUserID: UserID, user
const duplicateCheck2Row = await db.prepare("get", `SELECT "UUID" FROM "sponsorTimes" WHERE "startTime" = ?
and "endTime" = ? and "category" = ? and "actionType" = ? and "description" = ? and "videoID" = ? and "service" = ?`, [startTime, endTime, segments[i].category, segments[i].actionType, segments[i].description, videoID, service]);
if (duplicateCheck2Row) {
segments[i].ignoreSegment = true;
if (segments[i].actionType === ActionType.Full) {
// Forward as vote
await vote(rawIP, duplicateCheck2Row.UUID, paramUserID, 1);
segments[i].ignoreSegment = true;
continue;
} else {
return { pass: false, errorMessage: "Segment has already been submitted before.", errorCode: 409 };
}
}
}
if (segments.every((s) => s.ignoreSegment && s.actionType !== ActionType.Full)) {
return { pass: false, errorMessage: "Segment has already been submitted before.", errorCode: 409 };
}
return CHECK_PASS;
}
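The duplicate branch above distinguishes action types: an exact duplicate of a full-video segment is converted into an upvote on the existing row and skipped, while any other exact duplicate is rejected with 409. A sketch of that decision as a small discriminated union (the `onDuplicate` helper is hypothetical):

```typescript
type Decision =
    | { action: "accept" }
    | { action: "forward-vote" }
    | { action: "reject"; status: number };

// Mirrors the duplicate check: full-video duplicates become votes on the
// existing submission, other exact duplicates are a 409 conflict.
function onDuplicate(isDuplicate: boolean, actionType: string): Decision {
    if (!isDuplicate) return { action: "accept" };
    return actionType === "full"
        ? { action: "forward-vote" }
        : { action: "reject", status: 409 };
}
```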
async function checkByAutoModerator(videoID: VideoID, userID: HashedUserID, segments: IncomingSegment[], service: Service, apiVideoDetails: videoDetails, videoDuration: number): Promise<CheckResult> {
// Auto moderator check
if (service == Service.YouTube) {
if (service == Service.YouTube && apiVideoDetails) {
const autoModerateResult = await autoModerateSubmission(apiVideoDetails, { videoID, userID, segments, service, videoDuration });
if (autoModerateResult) {
return {
@@ -356,6 +365,15 @@ async function checkByAutoModerator(videoID: VideoID, userID: HashedUserID, segm
async function updateDataIfVideoDurationChange(videoID: VideoID, service: Service, videoDuration: VideoDuration, videoDurationParam: VideoDuration) {
let lockedCategoryList = await db.prepare("all", 'SELECT category, "actionType", reason from "lockCategories" where "videoID" = ? AND "service" = ?', [videoID, service]);
if (service === Service.Spotify) {
// Don't handle changed durations
return {
videoDuration,
apiVideoDetails: null,
lockedCategoryList
};
}
const previousSubmissions = await db.prepare("all",
`SELECT "videoDuration", "UUID"
FROM "sponsorTimes"
@@ -385,9 +403,12 @@ async function updateDataIfVideoDurationChange(videoID: VideoID, service: Servic
// Only treat as difference if both the api duration and submitted duration have changed
if (videoDurationChanged(videoDuration) && (!videoDurationParam || videoDurationChanged(videoDurationParam))) {
// Hide all previous submissions
for (const submission of previousSubmissions) {
await db.prepare("run", `UPDATE "sponsorTimes" SET "hidden" = 1 WHERE "UUID" = ?`, [submission.UUID]);
}
await db.prepare("run", `UPDATE "sponsorTimes" SET "hidden" = 1
WHERE "videoID" = ? AND "service" = ? AND "videoDuration" != ?
AND "hidden" = 0 AND "shadowHidden" = 0 AND
"actionType" != 'full' AND "votes" > -2`,
[videoID, service, videoDuration]);
lockedCategoryList = [];
deleteLockCategories(videoID, null, null, service).catch((e) => Logger.error(`deleting lock categories: ${e}`));
}
@@ -498,6 +519,22 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
}
const userID: HashedUserID = await getHashCache(paramUserID);
const matchedRule = isRequestInvalid({
userAgent,
userAgentHeader: req.headers["user-agent"],
videoDuration,
videoID,
userID: paramUserID,
service,
segments,
endpoint: "sponsorblock-postSkipSegments"
});
if (matchedRule !== null) {
sendNewUserWebhook(config.discordRejectedNewUserWebhookURL, userID, videoID, userAgent, req, videoDurationParam, matchedRule);
Logger.warn(`Sponsorblock submission rejected by request validator: ${userID} ${videoID} ${videoDurationParam} ${userAgent} ${req.headers["user-agent"]}`);
return res.status(200).send("OK");
}
const invalidCheckResult = await checkInvalidFields(videoID, paramUserID, userID, segments, videoDurationParam, userAgent, service);
if (!invalidCheckResult.pass) {
return res.status(invalidCheckResult.errorCode).send(invalidCheckResult.errorMessage);
@@ -534,11 +571,20 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
if (!(isVIP || isTempVIP)) {
const autoModerateCheckResult = await checkByAutoModerator(videoID, userID, segments, service, apiVideoDetails, videoDurationParam);
if (!autoModerateCheckResult.pass) {
lock.unlock();
return res.status(autoModerateCheckResult.errorCode).send(autoModerateCheckResult.errorMessage);
}
}
const permission = await canSubmitGlobal(userID);
if (!permission.canSubmit) {
lock.unlock();
Logger.warn(`New user trying to submit: ${userID} ${videoID} ${Object.keys(segments?.[0] ?? {})} ${Object.keys(req.query)} ${videoDurationParam} ${userAgent} ${req.headers["user-agent"]}`);
return res.status(403).send(permission.reason);
} else if (permission.newUser) {
sendNewUserWebhook(config.discordNewUserWebhookURL, userID, videoID, userAgent, req, videoDurationParam, undefined);
}
// Will be filled when submitting
const UUIDs = [];
const newSegments = [];
@@ -585,10 +631,12 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
//add to private db as well
await privateDB.prepare("run", `INSERT INTO "sponsorTimes" VALUES(?, ?, ?, ?)`, [videoID, hashedIP, timeSubmitted, service]);
await db.prepare("run", `INSERT INTO "videoInfo" ("videoID", "channelID", "title", "published")
SELECT ?, ?, ?, ?
WHERE NOT EXISTS (SELECT 1 FROM "videoInfo" WHERE "videoID" = ?)`, [
videoID, apiVideoDetails?.authorId || "", apiVideoDetails?.title || "", apiVideoDetails?.published || 0, videoID]);
if (service === Service.YouTube) {
await db.prepare("run", `INSERT INTO "videoInfo" ("videoID", "channelID", "title", "published")
SELECT ?, ?, ?, ?
WHERE NOT EXISTS (SELECT 1 FROM "videoInfo" WHERE "videoID" = ?)`, [
videoID, apiVideoDetails?.authorId || "", apiVideoDetails?.title || "", apiVideoDetails?.published || 0, videoID]);
}
// Clear redis cache for this video
QueryCacher.clearSegmentCache({
@@ -625,6 +673,40 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
}
}
function sendNewUserWebhook(webhookUrl: string, userID: HashedUserID, videoID: any, userAgent: any, req: Request, videoDurationParam: VideoDuration, ruleName: string | undefined) {
if (!webhookUrl) return;
axios.post(webhookUrl, {
"embeds": [{
"title": userID,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**User Agent**: ${userAgent}\
\n**Sent User Agent**: ${req.query.userAgent ?? req.body.userAgent}\
\n**Real User Agent**: ${req.headers["user-agent"]}\
\n**Video Duration**: ${videoDurationParam}`,
"color": 10813440,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
"footer": {
"text": ruleName === undefined ? "Caught by permission check" : `Caught by rule '${ruleName}'`,
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
// Takes an array of arrays:
// ex)
// [


@@ -4,7 +4,6 @@ import { db } from "../databases/databases";
import { isUserVIP } from "../utils/isUserVIP";
import { getHashCache } from "../utils/getHashCache";
import { HashedUserID, UserID } from "../types/user.model";
import { config } from "../config";
import { generateWarningDiscord, warningData, dispatchEvent } from "../utils/webhookUtils";
import { WarningType } from "../types/warning.model";
@@ -16,12 +15,7 @@ type warningEntry = {
reason: string
}
function checkExpiredWarning(warning: warningEntry): boolean {
const MILLISECONDS_IN_HOUR = 3600000;
const now = Date.now();
const expiry = Math.floor(now - (config.hoursAfterWarningExpires * MILLISECONDS_IN_HOUR));
return warning.issueTime > expiry && !warning.enabled;
}
const MAX_EDIT_DELAY = 900000; // 15 mins
const getUsername = (userID: HashedUserID) => db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ?`, [userID], { useReplica: true });
@@ -44,30 +38,30 @@ export async function postWarning(req: Request, res: Response): Promise<Response
try {
if (enabled) {
const previousWarning = await db.prepare("get", 'SELECT * FROM "warnings" WHERE "userID" = ? AND "issuerUserID" = ? AND "type" = ?', [userID, issuerUserID, type]) as warningEntry;
if (!reason) {
return res.status(400).json({ "message": "Missing warning reason" });
}
const previousWarning = await db.prepare("get", 'SELECT * FROM "warnings" WHERE "userID" = ? AND "type" = ? AND "enabled" = 1', [userID, type]) as warningEntry;
if (!previousWarning) {
if (!reason) {
return res.status(400).json({ "message": "Missing warning reason" });
}
await db.prepare(
"run",
'INSERT INTO "warnings" ("userID", "issueTime", "issuerUserID", "enabled", "reason", "type") VALUES (?, ?, ?, 1, ?, ?)',
[userID, issueTime, issuerUserID, reason, type]
);
resultStatus = "issued to";
// check if warning is still within issue time and warning is not enabled
} else if (checkExpiredWarning(previousWarning) ) {
// allow a warning to be edited by the same vip within 15 mins of issuing
} else if (issuerUserID === previousWarning.issuerUserID && (Date.now() - MAX_EDIT_DELAY) < previousWarning.issueTime) {
await db.prepare(
"run", 'UPDATE "warnings" SET "enabled" = 1, "reason" = ? WHERE "userID" = ? AND "issueTime" = ?',
"run", 'UPDATE "warnings" SET "reason" = ? WHERE "userID" = ? AND "issueTime" = ?',
[reason, userID, previousWarning.issueTime]
);
resultStatus = "re-enabled for";
resultStatus = "edited for";
} else {
return res.sendStatus(409);
}
} else {
await db.prepare("run", 'UPDATE "warnings" SET "enabled" = 0 WHERE "userID" = ? AND "type" = ?', [userID, type]);
await db.prepare("run", 'UPDATE "warnings" SET "enabled" = 0, "disableTime" = ? WHERE "userID" = ? AND "type" = ? AND "enabled" = 1', [issueTime, userID, type]);
resultStatus = "removed from";
}

src/routes/setConfig.ts (new file, 48 lines added)

@@ -0,0 +1,48 @@
import { getHashCache } from "../utils/getHashCache";
import { db } from "../databases/databases";
import { Request, Response } from "express";
import { isUserVIP } from "../utils/isUserVIP";
import { UserID } from "../types/user.model";
import { Logger } from "../utils/logger";
interface SetConfigRequest extends Request {
body: {
userID: UserID;
key: string;
value: string;
}
}
const allowedConfigs = [
"old-submitter-block-date",
"max-users-per-minute",
"max-users-per-minute-dearrow"
];
export async function setConfig(req: SetConfigRequest, res: Response): Promise<Response> {
const { body: { userID, key, value } } = req;
if (!userID || !allowedConfigs.includes(key)) {
// invalid request
return res.sendStatus(400);
}
// hash the userID
const hashedUserID = await getHashCache(userID as UserID);
const isVIP = (await isUserVIP(hashedUserID));
if (!isVIP) {
// not authorized
return res.sendStatus(403);
}
try {
await db.prepare("run", `INSERT INTO "config" ("key", "value") VALUES(?, ?) ON CONFLICT ("key") DO UPDATE SET "value" = ?`, [key, value, value]);
return res.sendStatus(200);
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
}
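The `INSERT … ON CONFLICT ("key") DO UPDATE` in setConfig.ts is a standard upsert: the first write for a key inserts, later writes overwrite. Combined with the allowlist and VIP guards, the endpoint's behavior can be sketched against an in-memory store (the `setConfigStatus` helper and its Map-backed store are illustrative, not the real database layer):

```typescript
const allowedConfigs = [
    "old-submitter-block-date",
    "max-users-per-minute",
    "max-users-per-minute-dearrow"
];

// Mirrors setConfig's guard order: 400 for a key outside the allowlist,
// 403 for non-VIP callers, 200 on a successful upsert.
function setConfigStatus(store: Map<string, string>, isVIP: boolean, key: string, value: string): number {
    if (!allowedConfigs.includes(key)) return 400;
    if (!isVIP) return 403;
    store.set(key, value); // INSERT ... ON CONFLICT ("key") DO UPDATE SET "value" = ?
    return 200;
}
```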


@@ -5,8 +5,9 @@ import { getHashCache } from "../utils/getHashCache";
import { Request, Response } from "express";
import { isUserBanned } from "../utils/checkBan";
import { HashedUserID } from "../types/user.model";
import { isRequestInvalid } from "../utils/requestValidator";
function logUserNameChange(userID: string, newUserName: string, oldUserName: string, updatedByAdmin: boolean): Promise<Response> {
function logUserNameChange(userID: string, newUserName: string, oldUserName: string, updatedByAdmin: boolean): Promise<void> {
return privateDB.prepare("run",
`INSERT INTO "userNameLogs"("userID", "newUserName", "oldUserName", "updatedByAdmin", "updatedAt") VALUES(?, ?, ?, ?, ?)`,
[userID, newUserName, oldUserName, + updatedByAdmin, new Date().getTime()]
@@ -15,7 +16,7 @@ function logUserNameChange(userID: string, newUserName: string, oldUserName: str
export async function setUsername(req: Request, res: Response): Promise<Response> {
const userIDInput = req.query.userID as string;
const adminUserIDInput = req.query.adminUserID as string;
const adminUserIDInput = req.query.adminUserID as string | undefined;
let userName = req.query.username as string;
let hashedUserID: HashedUserID;
@@ -29,16 +30,22 @@ export async function setUsername(req: Request, res: Response): Promise<Response
return res.sendStatus(200);
}
const timings = [Date.now()];
// remove unicode control characters from username (example: \n, \r, \t etc.)
// source: https://en.wikipedia.org/wiki/Control_character#In_Unicode
// eslint-disable-next-line no-control-regex
userName = userName.replace(/[\u0000-\u001F\u007F-\u009F]/g, "");
try {
timings.push(Date.now());
if (isRequestInvalid({
userAgentHeader: req.headers["user-agent"],
userID: adminUserIDInput ?? userIDInput,
newUsername: userName,
endpoint: "setUsername",
})) {
Logger.warn(`Username change rejected by request validator: ${userName} ${req.headers["user-agent"]}`);
return res.sendStatus(200);
}
try {
if (adminUserIDInput != undefined) {
//this is the admin controlling the other user's account, don't hash the controlling account's ID
hashedUserID = userIDInput as HashedUserID;
@@ -55,15 +62,11 @@ export async function setUsername(req: Request, res: Response): Promise<Response
//hash the userID
hashedUserID = await getHashCache(userIDInput) as HashedUserID;
timings.push(Date.now());
const row = await db.prepare("get", `SELECT count(*) as "userCount" FROM "userNames" WHERE "userID" = ? AND "locked" = 1`, [hashedUserID]);
if (row.userCount > 0) {
return res.sendStatus(200);
}
timings.push(Date.now());
if (await isUserBanned(hashedUserID)) {
return res.sendStatus(200);
}
@@ -80,8 +83,6 @@ export async function setUsername(req: Request, res: Response): Promise<Response
const locked = adminUserIDInput === undefined ? 0 : 1;
let oldUserName = "";
timings.push(Date.now());
if (row?.userName !== undefined) {
//already exists, update this row
oldUserName = row.userName;
@@ -90,19 +91,16 @@ export async function setUsername(req: Request, res: Response): Promise<Response
} else {
await db.prepare("run", `UPDATE "userNames" SET "userName" = ?, "locked" = ? WHERE "userID" = ?`, [userName, locked, hashedUserID]);
}
} else if (userName === hashedUserID) {
return res.sendStatus(200);
} else {
//add to the db
await db.prepare("run", `INSERT INTO "userNames"("userID", "userName", "locked") VALUES(?, ?, ?)`, [hashedUserID, userName, locked]);
}
timings.push(Date.now());
await logUserNameChange(hashedUserID, userName, oldUserName, adminUserIDInput !== undefined);
timings.push(Date.now());
return res.status(200).send(timings.join(", "));
return res.sendStatus(200);
} catch (err) /* istanbul ignore next */ {
Logger.error(err as string);
return res.sendStatus(500);


@@ -1,8 +1,8 @@
import { db, privateDB } from "../databases/databases";
import { db } from "../databases/databases";
import { getHashCache } from "../utils/getHashCache";
import { Request, Response } from "express";
import { config } from "../config";
import { Category, DeArrowType, HashedIP, Service, VideoID, VideoIDHash } from "../types/segments.model";
import { Category, DeArrowType, Service, VideoID, VideoIDHash } from "../types/segments.model";
import { UserID } from "../types/user.model";
import { QueryCacher } from "../utils/queryCacher";
import { isUserVIP } from "../utils/isUserVIP";
@@ -11,7 +11,6 @@ import { Logger } from "../utils/logger";
export async function shadowBanUser(req: Request, res: Response): Promise<Response> {
const userID = req.query.userID as UserID;
const hashedIP = req.query.hashedIP as HashedIP;
const adminUserIDInput = req.query.adminUserID as UserID;
const type = Number.parseInt(req.query.type as string ?? "1");
if (isNaN(type)) {
@@ -21,10 +20,6 @@ export async function shadowBanUser(req: Request, res: Response): Promise<Respon
const enabled = req.query.enabled === undefined
? true
: req.query.enabled === "true";
const lookForIPs = req.query.lookForIPs === "true";
const banUsers = req.query.banUsers === undefined
? true
: req.query.banUsers === "true";
//if enabled is false and the old submissions should be made visible again
const unHideOldSubmissions = req.query.unHideOldSubmissions !== "false";
@@ -32,7 +27,7 @@ export async function shadowBanUser(req: Request, res: Response): Promise<Respon
const categories: Category[] = parseCategories(req, config.categoryList as Category[]);
const deArrowTypes: DeArrowType[] = parseDeArrowTypes(req, config.deArrowTypes);
if (adminUserIDInput == undefined || (userID == undefined && hashedIP == undefined || type <= 0)) {
if (adminUserIDInput == undefined || (userID == undefined || type <= 0)) {
//invalid request
return res.sendStatus(400);
}
@@ -46,32 +41,10 @@ export async function shadowBanUser(req: Request, res: Response): Promise<Respon
//not authorized
return res.sendStatus(403);
}
if (userID) {
const result = await banUser(userID, enabled, unHideOldSubmissions, type, categories, deArrowTypes);
if (enabled && lookForIPs) {
const ipLoggingFixedTime = 1675295716000;
const timeSubmitted = (await db.prepare("all", `SELECT "timeSubmitted" FROM "sponsorTimes" WHERE "timeSubmitted" > ? AND "userID" = ?`, [ipLoggingFixedTime, userID])) as { timeSubmitted: number }[];
const ips = (await Promise.all(timeSubmitted.map((s) => {
return privateDB.prepare("all", `SELECT "hashedIP" FROM "sponsorTimes" WHERE "timeSubmitted" = ?`, [s.timeSubmitted]) as Promise<{ hashedIP: HashedIP }[]>;
}))).flat();
await Promise.all([...new Set(ips.map((ip) => ip.hashedIP))].map((ip) => {
return banIP(ip, enabled, unHideOldSubmissions, type, categories, deArrowTypes, true);
}));
}
if (result) {
res.sendStatus(result);
return;
}
} else if (hashedIP) {
const result = await banIP(hashedIP, enabled, unHideOldSubmissions, type, categories, deArrowTypes, banUsers);
if (result) {
res.sendStatus(result);
return;
}
const result = await banUser(userID, enabled, unHideOldSubmissions, type, categories, deArrowTypes);
if (result) {
res.sendStatus(result);
return;
}
return res.sendStatus(200);
} catch (e) {
@@ -115,58 +88,20 @@ export async function banUser(userID: UserID, enabled: boolean, unHideOldSubmiss
// already not shadowbanned
return 400;
}
return 200;
}
export async function banIP(hashedIP: HashedIP, enabled: boolean, unHideOldSubmissions: boolean, type: number,
categories: Category[], deArrowTypes: DeArrowType[], banUsers: boolean): Promise<number> {
//check to see if this user is already shadowbanned
const row = await db.prepare("get", `SELECT count(*) as "userCount" FROM "shadowBannedIPs" WHERE "hashedIP" = ?`, [hashedIP]);
if (enabled) {
if (row.userCount == 0) {
await db.prepare("run", `INSERT INTO "shadowBannedIPs" VALUES(?)`, [hashedIP]);
}
//find all previous submissions and hide them
if (unHideOldSubmissions) {
const users = await unHideSubmissionsByIP(categories, hashedIP, type);
if (banUsers) {
await Promise.all([...users].map((user) => {
return banUser(user, enabled, unHideOldSubmissions, type, categories, deArrowTypes);
}));
}
} else if (row.userCount > 0) {
// Nothing to do, and already added
return 409;
}
} else if (!enabled) {
if (row.userCount > 0) {
//remove them from the shadow ban list
await db.prepare("run", `DELETE FROM "shadowBannedIPs" WHERE "hashedIP" = ?`, [hashedIP]);
}
//find all previous submissions and unhide them
if (unHideOldSubmissions) {
await unHideSubmissionsByIP(categories, hashedIP, 0);
}
}
return 200;
}
async function unHideSubmissionsByUser(categories: string[], deArrowTypes: DeArrowType[],
userID: UserID, type = 1) {
await db.prepare("run", `UPDATE "sponsorTimes" SET "shadowHidden" = '${type}' WHERE "userID" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})
AND NOT EXISTS ( SELECT "videoID", "category" FROM "lockCategories" WHERE
"sponsorTimes"."videoID" = "lockCategories"."videoID" AND "sponsorTimes"."service" = "lockCategories"."service" AND "sponsorTimes"."category" = "lockCategories"."category")`, [userID]);
if (categories.length) {
await db.prepare("run", `UPDATE "sponsorTimes" SET "shadowHidden" = '${type}' WHERE "userID" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})
AND NOT EXISTS ( SELECT "videoID", "category" FROM "lockCategories" WHERE
"sponsorTimes"."videoID" = "lockCategories"."videoID" AND "sponsorTimes"."service" = "lockCategories"."service" AND "sponsorTimes"."category" = "lockCategories"."category")`, [userID]);
}
// clear cache for all old videos
(await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service", "votes", "views" FROM "sponsorTimes" WHERE "userID" = ?`, [userID]))
(await db.prepare("all", `SELECT "category", "videoID", "hashedVideoID", "service", "userID" FROM "sponsorTimes" WHERE "userID" = ?`, [userID]))
.forEach((videoInfo: { category: Category; videoID: VideoID; hashedVideoID: VideoIDHash; service: Service; userID: UserID; }) => {
QueryCacher.clearSegmentCache(videoInfo);
});
@@ -181,7 +116,6 @@ async function unHideSubmissionsByUser(categories: string[], deArrowTypes: DeArr
[userID]);
}
(await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service" FROM "titles" WHERE "userID" = ?`, [userID]))
.forEach((videoInfo: { videoID: VideoID; hashedVideoID: VideoIDHash; service: Service; }) => {
QueryCacher.clearBrandingCache(videoInfo);
@@ -190,24 +124,4 @@ async function unHideSubmissionsByUser(categories: string[], deArrowTypes: DeArr
.forEach((videoInfo: { videoID: VideoID; hashedVideoID: VideoIDHash; service: Service; }) => {
QueryCacher.clearBrandingCache(videoInfo);
});
}
async function unHideSubmissionsByIP(categories: string[], hashedIP: HashedIP, type = 1): Promise<Set<UserID>> {
const submissions = await privateDB.prepare("all", `SELECT "timeSubmitted" FROM "sponsorTimes" WHERE "hashedIP" = ?`, [hashedIP]) as { timeSubmitted: number }[];
const users: Set<UserID> = new Set();
await Promise.all(submissions.map(async (submission) => {
(await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service", "votes", "views", "userID" FROM "sponsorTimes" WHERE "timeSubmitted" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})`, [submission.timeSubmitted]))
.forEach((videoInfo: { category: Category, videoID: VideoID, hashedVideoID: VideoIDHash, service: Service, userID: UserID }) => {
QueryCacher.clearSegmentCache(videoInfo);
users.add(videoInfo.userID);
}
);
await db.prepare("run", `UPDATE "sponsorTimes" SET "shadowHidden" = ${type} WHERE "timeSubmitted" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})
AND NOT EXISTS ( SELECT "videoID", "category" FROM "lockCategories" WHERE
"sponsorTimes"."videoID" = "lockCategories"."videoID" AND "sponsorTimes"."service" = "lockCategories"."service" AND "sponsorTimes"."category" = "lockCategories"."category")`, [submission.timeSubmitted]);
}));
return users;
}
}
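The updates above splice category values straight into the SQL text via template interpolation. A minimal sketch of building driver-style `?` placeholders instead (hypothetical `inClause` helper, not part of this codebase):

```typescript
// Hypothetical helper: build a parameterized IN clause rather than
// interpolating values into the SQL string.
function inClause(column: string, values: string[]): { sql: string; params: string[] } {
    const placeholders = values.map(() => "?").join(",");
    return { sql: `"${column}" in (${placeholders})`, params: values };
}

const clause = inClause("category", ["sponsor", "selfpromo"]);
// clause.sql is the clause text; clause.params travels separately to the driver
```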


@@ -3,14 +3,18 @@ import { Request, Response } from "express";
export async function viewedVideoSponsorTime(req: Request, res: Response): Promise<Response> {
const UUID = req.query?.UUID;
const videoID = req.query?.videoID;
if (!UUID) {
//invalid request
return res.sendStatus(400);
}
//up the view count by one
await db.prepare("run", `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" = ?`, [UUID]);
if (!videoID) {
await db.prepare("run", `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" = ?`, [UUID]);
} else {
await db.prepare("run", `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" LIKE ? AND "videoID" = ?`, [`${UUID}%`, videoID]);
}
return res.sendStatus(200);
}
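The branch above reduces to: exact match when no `videoID` is given, LIKE-prefix match (for trimmed UUIDs) when it is. A sketch of the query selection, with the SQL strings taken from the handler:

```typescript
// Choose the view-count update: exact UUID match, or LIKE-prefix match
// scoped to a videoID when the client sent a trimmed UUID.
function viewUpdateQuery(uuid: string, videoID?: string): { sql: string; params: string[] } {
    if (!videoID) {
        return { sql: `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" = ?`, params: [uuid] };
    }
    return {
        sql: `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" LIKE ? AND "videoID" = ?`,
        params: [`${uuid}%`, videoID],
    };
}
```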


@@ -2,7 +2,7 @@ import { Request, Response } from "express";
import { Logger } from "../utils/logger";
import { isUserVIP } from "../utils/isUserVIP";
import { isUserTempVIP } from "../utils/isUserTempVIP";
import { getMaxResThumbnail, YouTubeAPI } from "../utils/youtubeApi";
import { getMaxResThumbnail } from "../utils/youtubeApi";
import { db, privateDB } from "../databases/databases";
import { dispatchEvent, getVoteAuthor, getVoteAuthorRaw } from "../utils/webhookUtils";
import { getFormattedTime } from "../utils/getFormattedTime";
@@ -128,83 +128,80 @@ async function sendWebhooks(voteData: VoteData) {
webhookURL = config.discordCompletelyIncorrectReportWebhookURL;
}
if (config.newLeafURLs !== null) {
const videoID = submissionInfoRow.videoID;
const { err, data } = await YouTubeAPI.listVideos(videoID);
if (err) return;
const videoID = submissionInfoRow.videoID;
const data = await getVideoDetails(videoID);
const isUpvote = voteData.incrementAmount > 0;
// Send custom webhooks
dispatchEvent(isUpvote ? "vote.up" : "vote.down", {
const isUpvote = voteData.incrementAmount > 0;
// Send custom webhooks
dispatchEvent(isUpvote ? "vote.up" : "vote.down", {
"user": {
"status": getVoteAuthorRaw(userSubmissionCountRow.submissionCount, voteData.isTempVIP, voteData.isVIP, voteData.isOwnSubmission),
},
"video": {
"id": submissionInfoRow.videoID,
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"thumbnail": getMaxResThumbnail(videoID),
},
"submission": {
"UUID": voteData.UUID,
"views": voteData.row.views,
"category": voteData.category,
"startTime": submissionInfoRow.startTime,
"endTime": submissionInfoRow.endTime,
"user": {
"status": getVoteAuthorRaw(userSubmissionCountRow.submissionCount, voteData.isTempVIP, voteData.isVIP, voteData.isOwnSubmission),
},
"video": {
"id": submissionInfoRow.videoID,
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"thumbnail": getMaxResThumbnail(videoID),
},
"submission": {
"UUID": voteData.UUID,
"views": voteData.row.views,
"category": voteData.category,
"startTime": submissionInfoRow.startTime,
"endTime": submissionInfoRow.endTime,
"user": {
"UUID": submissionInfoRow.userID,
"username": submissionInfoRow.userName,
"submissions": {
"total": submissionInfoRow.count,
"ignored": submissionInfoRow.disregarded,
},
"UUID": submissionInfoRow.userID,
"username": submissionInfoRow.userName,
"submissions": {
"total": submissionInfoRow.count,
"ignored": submissionInfoRow.disregarded,
},
},
"votes": {
"before": voteData.row.votes,
"after": (voteData.row.votes + voteData.incrementAmount - voteData.oldIncrementAmount),
},
});
},
"votes": {
"before": voteData.row.votes,
"after": (voteData.row.votes + voteData.incrementAmount - voteData.oldIncrementAmount),
},
});
// Send discord message
if (webhookURL !== null && !isUpvote) {
axios.post(webhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${submissionInfoRow.videoID}&t=${(submissionInfoRow.startTime.toFixed(0) - 2)}s#requiredSegment=${voteData.UUID}`,
"description": `**${voteData.row.votes} Votes Prior | \
${(voteData.row.votes + voteData.incrementAmount - voteData.oldIncrementAmount)} Votes Now | ${voteData.row.views} \
Views**\n\n**Locked**: ${voteData.row.locked}\n\n**Submission ID:** ${voteData.UUID}\
\n**Category:** ${submissionInfoRow.category}\
\n\n**Submitted by:** ${submissionInfoRow.userName}\n${submissionInfoRow.userID}\
\n\n**Total User Submissions:** ${submissionInfoRow.count}\
\n**Ignored User Submissions:** ${submissionInfoRow.disregarded}\
\n\n**Timestamp:** \
${getFormattedTime(submissionInfoRow.startTime)} to ${getFormattedTime(submissionInfoRow.endTime)}`,
"color": 10813440,
"author": {
"name": voteData.finalResponse?.webhookMessage ??
voteData.finalResponse?.finalMessage ??
`${getVoteAuthor(userSubmissionCountRow.submissionCount, voteData.isTempVIP, voteData.isVIP, voteData.isOwnSubmission)}${voteData.row.locked ? " (Locked)" : ""}`,
},
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
// Send discord message
if (webhookURL !== null && !isUpvote) {
axios.post(webhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${submissionInfoRow.videoID}&t=${(submissionInfoRow.startTime.toFixed(0) - 2)}s#requiredSegment=${voteData.UUID}`,
"description": `**${voteData.row.votes} Votes Prior | \
${(voteData.row.votes + voteData.incrementAmount - voteData.oldIncrementAmount)} Votes Now | ${voteData.row.views} \
Views**\n\n**Locked**: ${voteData.row.locked}\n\n**Submission ID:** ${voteData.UUID}\
\n**Category:** ${submissionInfoRow.category}\
\n\n**Submitted by:** ${submissionInfoRow.userName}\n${submissionInfoRow.userID}\
\n\n**Total User Submissions:** ${submissionInfoRow.count}\
\n**Ignored User Submissions:** ${submissionInfoRow.disregarded}\
\n\n**Timestamp:** \
${getFormattedTime(submissionInfoRow.startTime)} to ${getFormattedTime(submissionInfoRow.endTime)}`,
"color": 10813440,
"author": {
"name": voteData.finalResponse?.webhookMessage ??
voteData.finalResponse?.finalMessage ??
`${getVoteAuthor(userSubmissionCountRow.submissionCount, voteData.isTempVIP, voteData.isVIP, voteData.isOwnSubmission)}${voteData.row.locked ? " (Locked)" : ""}`,
},
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
});
}
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
}
}
@@ -306,9 +303,10 @@ export async function voteOnSponsorTime(req: Request, res: Response): Promise<Re
const paramUserID = getUserID(req);
const type = req.query.type !== undefined ? parseInt(req.query.type as string) : undefined;
const category = req.query.category as Category;
const videoID = req.query.videoID as VideoID;
const ip = getIP(req);
const result = await vote(ip, UUID, paramUserID, type, category);
const result = await vote(ip, UUID, paramUserID, type, videoID, category);
const response = res.status(result.status);
if (result.message) {
@@ -320,7 +318,7 @@ export async function voteOnSponsorTime(req: Request, res: Response): Promise<Re
}
}
export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID, type: number, category?: Category): Promise<{ status: number, message?: string, json?: unknown }> {
export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID, type: number, videoID?: VideoID, category?: Category): Promise<{ status: number, message?: string, json?: unknown }> {
// missing key parameters
if (!UUID || !paramUserID || !(type !== undefined || category)) {
return { status: 400 };
@@ -330,6 +328,14 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
return { status: 200 };
}
if (videoID && UUID.length < 60) {
// Get the full UUID
const segmentInfo: DBSegment = await db.prepare("get", `SELECT "UUID" from "sponsorTimes" WHERE "UUID" LIKE ? AND "videoID" = ?`, [`${UUID}%`, videoID]);
if (segmentInfo) {
UUID = segmentInfo.UUID;
}
}
const originalType = type;
//hash the userID
@@ -373,14 +379,12 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
return { status: 400 };
}
const MILLISECONDS_IN_HOUR = 3600000;
const now = Date.now();
const warnings = (await db.prepare("all", `SELECT "reason" FROM warnings WHERE "userID" = ? AND "issueTime" > ? AND enabled = 1 AND type = 0`,
[nonAnonUserID, Math.floor(now - (config.hoursAfterWarningExpires * MILLISECONDS_IN_HOUR))],
const warning = (await db.prepare("get", `SELECT "reason" FROM warnings WHERE "userID" = ? AND enabled = 1 AND type = 0`,
[nonAnonUserID],
));
if (warnings.length >= config.maxNumberOfActiveWarnings) {
const warningReason = warnings[0]?.reason;
if (warning != null) {
const warningReason = warning.reason;
lock.unlock();
return { status: 403, message: "Vote rejected due to a tip from a moderator. This means that we noticed you were making some common mistakes that are not malicious, and we just want to clarify the rules. " +
"Could you please send a message in Discord or Matrix so we can further help you?" +


@@ -3,6 +3,8 @@ import { UserID } from "./user.model";
export type BrandingUUID = string & { readonly __brandingUUID: unique symbol };
export type CasualCategory = ("funny" | "creative" | "clever" | "descriptive" | "other" | "downvote") & { __casualCategoryBrand: unknown };
export interface BrandingDBSubmissionData {
videoID: VideoID,
}
@@ -17,6 +19,7 @@ export interface TitleDBResult extends BrandingDBSubmission {
title: string,
original: number,
votes: number,
downvotes: number,
locked: number,
verification: number,
userID: UserID
@@ -35,6 +38,7 @@ export interface ThumbnailDBResult extends BrandingDBSubmission {
timestamp?: number,
original: number,
votes: number,
downvotes: number,
locked: number,
userID: UserID
}
@@ -48,20 +52,29 @@ export interface ThumbnailResult {
userID?: UserID
}
export interface CasualVote {
id: string,
count: number,
title: string | null
}
export interface BrandingResult {
titles: TitleResult[],
thumbnails: ThumbnailResult[],
casualVotes: CasualVote[],
randomTime: number,
videoDuration: number | null
}
export interface BrandingHashDBResult {
titles: TitleDBResult[],
thumbnails: ThumbnailDBResult[],
segments: BrandingSegmentDBResult[]
titles: TitleDBResult[];
thumbnails: ThumbnailDBResult[];
segments: BrandingSegmentDBResult[];
casualVotes: CasualVoteDBResult[];
}
export interface OriginalThumbnailSubmission {
timestamp?: undefined | null;
original: true;
}
@@ -83,6 +96,20 @@ export interface BrandingSubmission {
videoID: VideoID;
userID: UserID;
service: Service;
autoLock: boolean | undefined;
downvote: boolean | undefined;
videoDuration: number | undefined;
wasWarned: boolean | undefined;
casualMode: boolean | undefined;
}
export interface CasualVoteSubmission {
videoID: VideoID;
userID: UserID;
service: Service;
downvote: boolean | undefined;
categories: CasualCategory[];
title?: string;
}
export interface BrandingSegmentDBResult {
@@ -92,9 +119,22 @@ export interface BrandingSegmentDBResult {
videoDuration: number;
}
export interface CasualVoteDBResult {
category: CasualCategory;
upvotes: number;
downvotes: number;
title?: string;
}
export interface BrandingSegmentHashDBResult extends BrandingDBSubmissionData {
startTime: number;
endTime: number;
category: Category;
videoDuration: number;
}
}
export interface CasualVoteHashDBResult extends BrandingDBSubmissionData {
category: CasualCategory;
upvotes: number;
downvotes: number;
}


@@ -10,7 +10,11 @@ interface RedisConfig extends redis.RedisClientOptions {
maxWriteConnections: number;
stopWritingAfterResponseTime: number;
responseTimePause: number;
maxReadResponseTime: number;
disableHashCache: boolean;
clientCacheSize: number;
useCompression: boolean;
dragonflyMode: boolean;
}
interface RedisReadOnlyConfig extends redis.RedisClientOptions {
@@ -27,6 +31,7 @@ export interface CustomWritePostgresConfig extends CustomPostgresConfig {
maxActiveRequests: number;
timeout: number;
highLoadThreshold: number;
redisTimeoutThreshold: number;
}
export interface CustomPostgresReadOnlyConfig extends CustomPostgresConfig {
@@ -36,6 +41,35 @@ export interface CustomPostgresReadOnlyConfig extends CustomPostgresConfig {
stopRetryThreshold: number;
}
export type ValidatorPattern = string | [string, string];
export interface RequestValidatorRule {
ruleName?: string;
// mostly universal
userAgent?: ValidatorPattern;
userAgentHeader?: ValidatorPattern;
videoDuration?: ValidatorPattern;
videoID?: ValidatorPattern;
userID?: ValidatorPattern;
service?: ValidatorPattern;
endpoint?: ValidatorPattern;
// sb postSkipSegments
startTime?: ValidatorPattern;
endTime?: ValidatorPattern;
category?: ValidatorPattern;
actionType?: ValidatorPattern;
description?: ValidatorPattern;
// dearrow postBranding
title?: ValidatorPattern;
titleOriginal?: boolean;
thumbnailTimestamp?: ValidatorPattern;
thumbnailOriginal?: boolean;
dearrowDownvote?: boolean;
// postCasual
casualCategory?: ValidatorPattern;
// setUsername
newUsername?: ValidatorPattern;
}
export interface SBSConfig {
[index: string]: any
port: number;
@@ -49,8 +83,11 @@ export interface SBSConfig {
discordCompletelyIncorrectReportWebhookURL?: string;
discordMaliciousReportWebhookURL?: string;
discordDeArrowLockedWebhookURL?: string,
discordDeArrowWarnedWebhookURL?: string,
discordNewUserWebhookURL?: string;
neuralBlockURL?: string;
discordNeuralBlockRejectWebhookURL?: string;
discordRejectedNewUserWebhookURL?: string;
minReputationToSubmitChapter: number;
minReputationToSubmitFiller: number;
userCounterURL?: string;
@@ -67,16 +104,16 @@ export interface SBSConfig {
readOnly: boolean;
webhooks: WebhookConfig[];
categoryList: string[];
casualCategoryList: string[];
deArrowTypes: DeArrowType[];
categorySupport: Record<string, string[]>;
maxTitleLength: number;
getTopUsersCacheTimeMinutes: number;
maxNumberOfActiveWarnings: number;
hoursAfterWarningExpires: number;
rateLimit: {
vote: RateLimitConfig;
view: RateLimitConfig;
};
requestValidatorRules: RequestValidatorRule[];
minimumPrefix?: string;
maximumPrefix?: string;
redis?: RedisConfig;
@@ -100,7 +137,18 @@ export interface SBSConfig {
},
tokenSeed: string,
minUserIDLength: number,
deArrowPaywall: boolean
deArrowPaywall: boolean,
useCacheForSegmentGroups: boolean
maxConnections: number;
maxResponseTime: number;
maxResponseTimeWhileLoadingCache: number;
etagExpiry: number;
youTubeKeys: {
visitorData: string | null;
poToken: string | null;
floatieUrl: string | null;
floatieAuth: string | null;
}
}
export interface WebhookConfig {
@@ -150,4 +198,4 @@ export interface CronJobOptions {
export interface DownvoteSegmentArchiveCron {
voteThreshold: number;
timeThresholdInDays: number;
}
}


@@ -22,7 +22,8 @@ export enum ActionType {
// Uncomment as needed
export enum Service {
YouTube = "YouTube",
PeerTube = "PeerTube",
Spotify = "Spotify",
PeerTube = "PeerTube"
// Twitch = 'Twitch',
// Nebula = 'Nebula',
// RSS = 'RSS',
@@ -103,7 +104,7 @@ export interface VideoData {
}
export interface SegmentCache {
shadowHiddenSegmentIPs: SBRecord<VideoID, SBRecord<string, {hashedIP: HashedIP}[]>>,
shadowHiddenSegmentIPs: SBRecord<VideoID, SBRecord<string, Promise<{hashedIP: HashedIP}[] | null>>>,
userHashedIP?: HashedIP
userHashedIPPromise?: Promise<HashedIP>;
}
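The `SegmentCache` change above swaps cached values for cached `Promise`s, so concurrent requests for the same key share one in-flight lookup instead of each hitting the database. A minimal sketch of that pattern (`Promise.resolve` standing in for the real query):

```typescript
// Cache the in-flight Promise, not the resolved value: a second caller
// arriving before the first lookup finishes reuses the same Promise.
const inflight = new Map<string, Promise<number>>();
let lookups = 0;

function fetchShared(key: string): Promise<number> {
    let pending = inflight.get(key);
    if (!pending) {
        lookups++;
        pending = Promise.resolve(key.length); // stand-in for a DB query
        inflight.set(key, pending);
    }
    return pending;
}
```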


@@ -23,9 +23,7 @@ export function createMemoryCache(memoryFn: (...args: any[]) => void, cacheTimeM
}
}
// create new promise
const promise = new Promise((resolve) => {
resolve(memoryFn(...args));
});
const promise = Promise.resolve(memoryFn(...args));
// store promise reference until fulfilled
promiseMemory.set(cacheKey, promise);
return promise.then(result => {


@@ -2,6 +2,7 @@ import axios from "axios";
import { Logger } from "../utils/logger";
export const getCWSUsers = (extID: string): Promise<number | undefined> =>
axios.post(`https://chrome.google.com/webstore/ajax/detail?pv=20210820&id=${extID}`)
.then(res => res.data.split("\n")[2])
.then(data => JSON.parse(data))
@@ -10,4 +11,22 @@ export const getCWSUsers = (extID: string): Promise<number | undefined> =>
.catch((err) => {
Logger.error(`Error getting chrome users - ${err}`);
return 0;
});
});
/* istanbul ignore next */
export function getChromeUsers(chromeExtensionUrl: string): Promise<number> {
return axios.get(chromeExtensionUrl)
.then(res => {
const body = res.data;
// 2024-02-09
// >20,000 users<
const match = body.match(/>([\d,]+) users</)?.[1];
if (match) {
return parseInt(match.replace(/,/g, ""));
}
})
.catch(/* istanbul ignore next */ () => {
Logger.debug(`Failed to connect to ${chromeExtensionUrl}`);
return 0;
});
}
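The regex scrape above can be isolated and tested on its own; the `>20,000 users<` comment documents the markup being matched:

```typescript
// Extract the listed user count from web-store markup like ">20,000 users<".
function parseUserCount(body: string): number | undefined {
    const match = body.match(/>([\d,]+) users</)?.[1];
    return match ? parseInt(match.replace(/,/g, "")) : undefined;
}
```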


@@ -46,7 +46,10 @@ async function newLeafWrapper(videoId: string, ignoreCache: boolean) {
export function getVideoDetails(videoId: string, ignoreCache = false): Promise<videoDetails> {
if (!config.newLeafURLs) {
return getPlayerData(videoId, ignoreCache)
.then(data => convertFromInnerTube(data));
.then(data => convertFromInnerTube(data))
.catch(() => {
return null;
});
}
return Promise.any([
newLeafWrapper(videoId, ignoreCache)


@@ -1,11 +1,12 @@
import axios from "axios";
import axios, { AxiosError } from "axios";
import { Logger } from "./logger";
import { innerTubeVideoDetails } from "../types/innerTubeApi.model";
import DiskCache from "./diskCache";
import { config } from "../config";
const privateResponse = (videoId: string): innerTubeVideoDetails => ({
const privateResponse = (videoId: string, reason: string): innerTubeVideoDetails => ({
videoId,
title: "",
title: reason,
channelId: "",
// exclude video duration
isOwnerViewing: false,
@@ -27,24 +28,58 @@ const privateResponse = (videoId: string): innerTubeVideoDetails => ({
publishDate: ""
});
async function getFromITube (videoID: string): Promise<innerTubeVideoDetails> {
export async function getFromITube (videoID: string): Promise<innerTubeVideoDetails> {
if (config.youTubeKeys.floatieUrl) {
try {
const result = await axios.get(config.youTubeKeys.floatieUrl, {
params: {
videoID,
auth: config.youTubeKeys.floatieAuth
}
});
if (result.status === 200) {
return result.data?.videoDetails ?? privateResponse(videoID, result.data?.playabilityStatus?.reason ?? "Bad response");
} else {
return Promise.reject(`Floatie returned non-200 response: ${result.status}`);
}
} catch (e) {
if (e instanceof AxiosError) {
const result = e.response;
if (result && result.status === 500) {
return privateResponse(videoID, result.data ?? "Bad response");
} else {
return Promise.reject(`Floatie returned non-200 response: ${result?.status}`);
}
}
}
}
// start subrequest
const url = "https://www.youtube.com/youtubei/v1/player";
const data = {
context: {
client: {
clientName: "WEB",
clientVersion: "2.20221215.04.01"
clientVersion: "2.20221215.04.01",
visitorData: config.youTubeKeys.visitorData
}
},
videoId: videoID
videoId: videoID,
serviceIntegrityDimensions: {
poToken: config.youTubeKeys.poToken
}
};
const result = await axios.post(url, data, {
timeout: 3500
timeout: 3500,
headers: {
"X-Goog-Visitor-Id": config.youTubeKeys.visitorData
}
});
/* istanbul ignore else */
if (result.status === 200) {
return result.data?.videoDetails ?? privateResponse(videoID);
return result.data?.videoDetails ?? privateResponse(videoID, result.data?.playabilityStatus?.reason ?? "Bad response");
} else {
return Promise.reject(`Innertube returned non-200 response: ${result.status}`);
}
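The `privateResponse` fallback above returns an otherwise-empty details object whose title carries the playability reason. A reduced sketch of that selection (types are simplified placeholders, not the real `innerTubeVideoDetails`):

```typescript
// Fall back to a stub whose title records why details were unavailable.
interface VideoDetails { videoId: string; title: string }
interface PlayerResponse { videoDetails?: VideoDetails; playabilityStatus?: { reason?: string } }

function detailsOrPrivate(videoId: string, data: PlayerResponse): VideoDetails {
    return data.videoDetails ?? { videoId, title: data.playabilityStatus?.reason ?? "Bad response" };
}
```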


@@ -9,12 +9,14 @@ const errorMessage = (parameter: string) => `${parameter} parameter does not mat
export function parseSkipSegments(req: Request): {
categories: Category[];
actionTypes: ActionType[];
trimUUIDs: number | null;
requiredSegments: SegmentUUID[];
service: Service;
errors: string[];
} {
const categories: Category[] = parseCategories(req, [ "sponsor" as Category ]);
const actionTypes: ActionType[] = parseActionTypes(req, [ActionType.Skip]);
const trimUUIDs: number | null = req.query.trimUUIDs ? (parseInt(req.query.trimUUIDs as string) || null) : null;
const requiredSegments: SegmentUUID[] = parseRequiredSegments(req);
const service: Service = getService(req.query.service, req.body.services);
const errors: string[] = [];
@@ -27,6 +29,7 @@ export function parseSkipSegments(req: Request): {
return {
categories,
actionTypes,
trimUUIDs,
requiredSegments,
service,
errors


@@ -1,17 +1,30 @@
import { config } from "../config";
import { db } from "../databases/databases";
import { db, privateDB } from "../databases/databases";
import { Category } from "../types/segments.model";
import { Feature, HashedUserID } from "../types/user.model";
import { hasFeature } from "./features";
import { isUserVIP } from "./isUserVIP";
import { oneOf } from "./promise";
import redis from "./redis";
import { getReputation } from "./reputation";
import { getServerConfig } from "./serverConfig";
interface OldSubmitterResult {
canSubmit: boolean;
newUser: boolean;
}
interface CanSubmitResult {
canSubmit: boolean;
reason: string;
}
interface CanSubmitGlobalResult {
canSubmit: boolean;
newUser: boolean;
reason: string;
}
async function lowDownvotes(userID: HashedUserID): Promise<boolean> {
const result = await db.prepare("get", `SELECT count(*) as "submissionCount", SUM(CASE WHEN "votes" < 0 AND "views" > 5 THEN 1 ELSE 0 END) AS "downvotedSubmissions" FROM "sponsorTimes" WHERE "userID" = ?`
, [userID], { useReplica: true });
@@ -19,6 +32,66 @@ async function lowDownvotes(userID: HashedUserID): Promise<boolean> {
return result.submissionCount > 5 && result.downvotedSubmissions / result.submissionCount < 0.10;
}
const fiveMinutes = 5 * 60 * 1000;
async function oldSubmitterOrAllowed(userID: HashedUserID): Promise<OldSubmitterResult> {
const submitterThreshold = await getServerConfig("old-submitter-block-date");
const maxUsers = await getServerConfig("max-users-per-minute");
if (!submitterThreshold && !maxUsers) {
return { canSubmit: true, newUser: false };
}
const result = await db.prepare("get", `SELECT count(*) as "submissionCount" FROM "sponsorTimes" WHERE "userID" = ? AND "shadowHidden" = 0 AND "votes" >= 0 AND "timeSubmitted" < ?`
, [userID, parseInt(submitterThreshold) || Infinity], { useReplica: true });
const isOldSubmitter = result.submissionCount >= 1;
if (!isOldSubmitter) {
await redis.zRemRangeByScore("submitters", "-inf", Date.now() - fiveMinutes);
const last5MinUsers = await redis.zCard("submitters");
if (maxUsers && last5MinUsers < parseInt(maxUsers)) {
await redis.zAdd("submitters", { score: Date.now(), value: userID });
return { canSubmit: true, newUser: true };
}
}
return { canSubmit: isOldSubmitter, newUser: false };
}
async function oldDeArrowSubmitterOrAllowed(userID: HashedUserID): Promise<OldSubmitterResult> {
const submitterThreshold = await getServerConfig("old-submitter-block-date");
const maxUsers = await getServerConfig("max-users-per-minute-dearrow");
if (!submitterThreshold && !maxUsers) {
return { canSubmit: true, newUser: false };
}
const result = await db.prepare("get", `SELECT count(*) as "submissionCount" FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "userID" = ? AND "shadowHidden" = 0 AND "votes" >= 0 AND "timeSubmitted" < ?`
, [userID, parseInt(submitterThreshold) || Infinity], { useReplica: true });
const isOldSubmitter = result.submissionCount >= 1;
if (!isOldSubmitter) {
if (!submitterThreshold) {
const bannedResult = await db.prepare("get", `SELECT count(*) as "submissionCount" FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "userID" = ? AND "shadowHidden" != 0`, [userID], { useReplica: true });
if (bannedResult?.submissionCount === 0) {
const voteResult = await privateDB.prepare("get", `SELECT "UUID" from "titleVotes" where "userID" = ?`, [userID], { useReplica: true });
if (voteResult?.UUID) {
// Count at least one vote as an old submitter as well
return { canSubmit: true, newUser: false };
}
}
}
await redis.zRemRangeByScore("submittersDeArrow", "-inf", Date.now() - fiveMinutes);
const last5MinUsers = await redis.zCard("submittersDeArrow");
if (maxUsers && last5MinUsers < parseInt(maxUsers)) {
await redis.zAdd("submittersDeArrow", { score: Date.now(), value: userID });
return { canSubmit: true, newUser: true };
}
}
return { canSubmit: isOldSubmitter, newUser: false };
}
export async function canSubmit(userID: HashedUserID, category: Category): Promise<CanSubmitResult> {
switch (category) {
case "chapter":
@@ -36,4 +109,28 @@ export async function canSubmit(userID: HashedUserID, category: Category): Promi
reason: ""
};
}
}
}
export async function canSubmitGlobal(userID: HashedUserID): Promise<CanSubmitGlobalResult> {
const oldSubmitterOrAllowedPromise = oldSubmitterOrAllowed(userID);
return {
canSubmit: await oneOf([isUserVIP(userID),
(async () => (await oldSubmitterOrAllowedPromise).canSubmit)()
]),
newUser: (await oldSubmitterOrAllowedPromise).newUser,
reason: "We are currently experiencing a mass spam attack, we are restricting submissions for now"
};
}
export async function canSubmitDeArrow(userID: HashedUserID): Promise<CanSubmitGlobalResult> {
const oldSubmitterOrAllowedPromise = oldDeArrowSubmitterOrAllowed(userID);
return {
canSubmit: await oneOf([isUserVIP(userID),
(async () => (await oldSubmitterOrAllowedPromise).canSubmit)()
]),
newUser: (await oldSubmitterOrAllowedPromise).newUser,
reason: "We are currently experiencing a mass spam attack, we are restricting submissions for now"
};
}
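The redis calls above implement a sliding-window cap on brand-new submitters: prune sorted-set entries older than five minutes, then admit a new user only while the window holds fewer than the configured maximum. An in-memory sketch of the same logic (a `Map` standing in for the redis sorted set):

```typescript
// Sliding-window admission: score is the submission time in ms.
const windowMs = 5 * 60 * 1000;
const submitters = new Map<string, number>(); // userID -> last score

function admitNewUser(userID: string, maxUsers: number, now: number): boolean {
    // Equivalent of zRemRangeByScore("-inf", now - windowMs)
    for (const [id, score] of submitters) {
        if (score < now - windowMs) submitters.delete(id);
    }
    // Equivalent of zCard + zAdd
    if (submitters.size < maxUsers) {
        submitters.set(userID, now);
        return true;
    }
    return false;
}
```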


@@ -1,23 +1,28 @@
import redis from "../utils/redis";
import redis, { TooManyActiveConnectionsError } from "../utils/redis";
import { Logger } from "../utils/logger";
import { skipSegmentsHashKey, skipSegmentsKey, reputationKey, ratingHashKey, skipSegmentGroupsKey, userFeatureKey, videoLabelsKey, videoLabelsHashKey, brandingHashKey, brandingKey } from "./redisKeys";
import { skipSegmentsHashKey, skipSegmentsKey, reputationKey, ratingHashKey, skipSegmentGroupsKey, userFeatureKey, videoLabelsKey, videoLabelsHashKey, brandingHashKey, brandingKey, videoLabelsLargerHashKey, skipSegmentsLargerHashKey } from "./redisKeys";
import { Service, VideoID, VideoIDHash } from "../types/segments.model";
import { Feature, HashedUserID, UserID } from "../types/user.model";
import { config } from "../config";
async function get<T>(fetchFromDB: () => Promise<T>, key: string): Promise<T> {
try {
const reply = await redis.get(key);
const reply = await redis.getWithCache(key);
if (reply) {
Logger.debug(`Got data from redis: ${reply}`);
return JSON.parse(reply);
}
} catch (e) { } //eslint-disable-line no-empty
} catch (e) {
if (e instanceof TooManyActiveConnectionsError) {
throw e;
}
}
const data = await fetchFromDB();
redis.setEx(key, config.redis?.expiryTime, JSON.stringify(data)).catch((err) => Logger.error(err));
// Undefined can't be stringified, but null can
redis.setExWithCache(key, config.redis?.expiryTime, JSON.stringify(data ?? null)).catch((err) => Logger.error(err));
return data;
}
@@ -32,7 +37,7 @@ async function getTraced<T>(fetchFromDB: () => Promise<T>, key: string): Promise
const startTime = Date.now();
try {
const reply = await redis.get(key);
const reply = await redis.getWithCache(key);
if (reply) {
Logger.debug(`Got data from redis: ${reply}`);
@@ -42,12 +47,16 @@ async function getTraced<T>(fetchFromDB: () => Promise<T>, key: string): Promise
endTime: Date.now()
};
}
} catch (e) { } //eslint-disable-line no-empty
} catch (e) {
if (e instanceof TooManyActiveConnectionsError) {
throw e;
}
}
const dbStartTime = Date.now();
const data = await fetchFromDB();
redis.setEx(key, config.redis?.expiryTime, JSON.stringify(data)).catch((err) => Logger.error(err));
redis.setExWithCache(key, config.redis?.expiryTime, JSON.stringify(data)).catch((err) => Logger.error(err));
return {
data,
@@ -116,8 +125,10 @@ function clearSegmentCache(videoInfo: { videoID: VideoID; hashedVideoID: VideoID
redis.del(skipSegmentsKey(videoInfo.videoID, videoInfo.service)).catch((err) => Logger.error(err));
redis.del(skipSegmentGroupsKey(videoInfo.videoID, videoInfo.service)).catch((err) => Logger.error(err));
redis.del(skipSegmentsHashKey(videoInfo.hashedVideoID, videoInfo.service)).catch((err) => Logger.error(err));
redis.del(skipSegmentsLargerHashKey(videoInfo.hashedVideoID, videoInfo.service)).catch((err) => Logger.error(err));
redis.del(videoLabelsKey(videoInfo.hashedVideoID, videoInfo.service)).catch((err) => Logger.error(err));
redis.del(videoLabelsHashKey(videoInfo.hashedVideoID, videoInfo.service)).catch((err) => Logger.error(err));
redis.del(videoLabelsLargerHashKey(videoInfo.hashedVideoID, videoInfo.service)).catch((err) => Logger.error(err));
if (videoInfo.userID) redis.del(reputationKey(videoInfo.userID)).catch((err) => Logger.error(err));
clearBrandingCache(videoInfo);
@@ -135,6 +146,7 @@ async function getKeyLastModified(key: string): Promise<Date> {
if (!config.redis?.enabled) return Promise.reject("ETag - Redis not enabled");
return await redis.ttl(key)
.then(ttl => {
if (ttl <= 0) return new Date();
const sinceLive = config.redis?.expiryTime - ttl;
const now = Math.floor(Date.now() / 1000);
return new Date((now-sinceLive) * 1000);
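The ETag date above follows from: a key written `expiryTime` seconds ago has `ttl = expiryTime - age`, so `age = expiryTime - ttl` and last-modified is `now - age`. As a standalone function:

```typescript
// Derive a last-modified time from a redis TTL: the elapsed lifetime is
// expiryTime - ttl, subtracted from the current unix time (in seconds).
function lastModifiedFromTTL(nowSec: number, expiryTime: number, ttl: number): Date {
    if (ttl <= 0) return new Date(nowSec * 1000);
    const sinceLive = expiryTime - ttl;
    return new Date((nowSec - sinceLive) * 1000);
}
```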


@@ -1,39 +1,61 @@
import { config } from "../config";
import { Logger } from "./logger";
import { SetOptions, createClient } from "redis";
import { RedisClientType, SetOptions, createClient } from "redis";
import { RedisCommandArgument, RedisCommandArguments, RedisCommandRawReply } from "@redis/client/dist/lib/commands";
import { RedisClientOptions } from "@redis/client/dist/lib/client";
import { RedisReply } from "rate-limit-redis";
import { db } from "../databases/databases";
import { Postgres } from "../databases/Postgres";
import { compress, uncompress } from "lz4-napi";
import { LRUCache } from "lru-cache";
import { shouldClientCacheKey } from "./redisKeys";
import { ZMember } from "@redis/client/dist/lib/commands/generic-transformers";
export interface RedisStats {
activeRequests: number;
writeRequests: number;
avgReadTime: number;
avgWriteTime: number;
memoryCacheHits: number
memoryCacheTotalHits: number
memoryCacheLength: number;
memoryCacheSize: number;
lastInvalidation: number;
lastInvalidationMessage: number;
}
interface RedisSB {
get(key: RedisCommandArgument): Promise<string>;
get(key: RedisCommandArgument, useClientCache?: boolean): Promise<string>;
getWithCache(key: RedisCommandArgument): Promise<string>;
set(key: RedisCommandArgument, value: RedisCommandArgument, options?: SetOptions): Promise<string>;
setWithCache(key: RedisCommandArgument, value: RedisCommandArgument, options?: SetOptions): Promise<string>;
setEx(key: RedisCommandArgument, seconds: number, value: RedisCommandArgument): Promise<string>;
setExWithCache(key: RedisCommandArgument, seconds: number, value: RedisCommandArgument): Promise<string>;
del(...keys: [RedisCommandArgument]): Promise<number>;
increment?(key: RedisCommandArgument): Promise<RedisCommandRawReply[]>;
sendCommand(args: RedisCommandArguments, options?: RedisClientOptions): Promise<RedisReply>;
ttl(key: RedisCommandArgument): Promise<number>;
quit(): Promise<void>;
zRemRangeByScore(key: string, min: number | RedisCommandArgument, max: number | RedisCommandArgument): Promise<number>;
zAdd(key: string, members: ZMember | ZMember[]): Promise<number>;
zCard(key: string): Promise<number>;
}
let exportClient: RedisSB = {
get: () => new Promise((resolve) => resolve(null)),
set: () => new Promise((resolve) => resolve(null)),
setEx: () => new Promise((resolve) => resolve(null)),
del: () => new Promise((resolve) => resolve(null)),
increment: () => new Promise((resolve) => resolve(null)),
sendCommand: () => new Promise((resolve) => resolve(null)),
quit: () => new Promise((resolve) => resolve(null)),
ttl: () => new Promise((resolve) => resolve(null)),
get: () => Promise.resolve(null),
getWithCache: () => Promise.resolve(null),
set: () => Promise.resolve(null),
setWithCache: () => Promise.resolve(null),
setEx: () => Promise.resolve(null),
setExWithCache: () => Promise.resolve(null),
del: () => Promise.resolve(null),
increment: () => Promise.resolve(null),
sendCommand: () => Promise.resolve(null),
quit: () => Promise.resolve(null),
ttl: () => Promise.resolve(null),
zRemRangeByScore: () => Promise.resolve(null),
zAdd: () => Promise.resolve(null),
zCard: () => Promise.resolve(null)
};
let lastClientFail = 0;
@@ -41,12 +63,40 @@ let lastReadFail = 0;
let activeRequests = 0;
let writeRequests = 0;
let memoryCacheHits = 0;
let memoryCacheMisses = 0;
let memoryCacheUncachedMisses = 0;
let lastInvalidationMessage = 0;
let lastInvalidation = 0;
const readResponseTime: number[] = [];
const writeResponseTime: number[] = [];
let lastResponseTimeLimit = 0;
const maxStoredTimes = 200;
export let connectionPromise = Promise.resolve();
const activeRequestPromises: Record<string, Promise<string>> = {};
// Keys invalidated while a get is in flight; used to avoid caching stale results
const resetKeys: Set<RedisCommandArgument> = new Set();
const cache = config.redis.clientCacheSize ? new LRUCache<RedisCommandArgument, string>({
maxSize: config.redis.clientCacheSize,
sizeCalculation: (value) => value.length,
ttl: 1000 * 60 * 30,
ttlResolution: 1000 * 60 * 15
}) : null;
// Used to cache ttl data
const ttlCache = config.redis.clientCacheSize ? new LRUCache<RedisCommandArgument, number>({
max: config.redis.clientCacheSize / 10 / 4, // 4 byte integer per element
ttl: 1000 * 60 * 30,
ttlResolution: 1000 * 60 * 15
}) : null;
// Client id of the cache connection, used for Redis CLIENT TRACKING REDIRECT
let cacheConnectionClientId = "";
export class TooManyActiveConnectionsError extends Error {}
export let connectionPromise: Promise<unknown> = Promise.resolve();
if (config.redis?.enabled) {
Logger.info("Connected to redis");
@@ -54,21 +104,151 @@ if (config.redis?.enabled) {
const readClient = config.redisRead?.enabled ? createClient(config.redisRead) : null;
connectionPromise = client.connect();
void readClient?.connect(); // void as we don't care about the promise
exportClient = client as RedisSB;
exportClient = client as unknown as RedisSB;
let cacheClient = null as RedisClientType | null;
const createKeyName = (key: RedisCommandArgument) => (key + (config.redis.useCompression ? ".c" : "")) as RedisCommandArgument;
exportClient.getWithCache = (key) => {
const cachedItem = cache && cacheClient && cache.get(key);
if (cachedItem != null) {
memoryCacheHits++;
return Promise.resolve(cachedItem);
} else if (shouldClientCacheKey(key)) {
memoryCacheMisses++;
}
if (memoryCacheHits + memoryCacheMisses > 50000) {
memoryCacheHits = 0;
memoryCacheMisses = 0;
memoryCacheUncachedMisses = 0;
}
if (activeRequestPromises[key as string] !== undefined) {
return activeRequestPromises[key as string];
}
const request = exportClient.get(createKeyName(key)).then((reply) => {
if (reply === null) return null;
if (config.redis.useCompression) {
const decompressed = uncompress(Buffer.from(reply, "base64")).then((decompressed) => decompressed.toString("utf-8"));
if (cache && shouldClientCacheKey(key)) {
decompressed.then((d) => {
if (!resetKeys.has(key)) {
cache.set(key, d);
}
resetKeys.delete(key);
}).catch(Logger.error);
} else {
resetKeys.delete(key);
}
return decompressed;
} else {
if (cache && shouldClientCacheKey(key)) {
if (!resetKeys.has(key)) {
cache.set(key, reply);
}
}
resetKeys.delete(key);
return reply;
}
});
activeRequestPromises[key as string] = request;
void request.finally(() => {
delete activeRequestPromises[key as string];
resetKeys.delete(key);
});
return request;
};
exportClient.setWithCache = (key, value, options) => {
if (cache) {
cache.set(key, value as string);
}
if (config.redis.useCompression) {
return compress(Buffer.from(value as string, "utf-8")).then((compressed) =>
exportClient.set(createKeyName(key), compressed.toString("base64"), options)
);
} else {
return exportClient.set(createKeyName(key), value, options);
}
};
exportClient.setExWithCache = (key, seconds, value) => {
if (cache) {
cache.set(key, value as string);
}
if (config.redis.useCompression) {
return compress(Buffer.from(value as string, "utf-8")).then((compressed) =>
exportClient.setEx(createKeyName(key), seconds, compressed.toString("base64"))
);
} else {
return exportClient.setEx(createKeyName(key), seconds, value);
}
};
const del = client.del.bind(client);
exportClient.del = (...keys) => {
if (config.redis.dragonflyMode) {
for (const key of keys) {
void client.publish("__redis__:invalidate", key);
}
}
if (config.redis.useCompression) {
return del(keys.flatMap((key) => [key, createKeyName(key)]) as [RedisCommandArgument]);
} else {
return del(...keys);
}
};
const ttl = client.ttl.bind(client);
exportClient.ttl = async (key) => {
const ttlResult = cache && cacheClient && ttlCache.get(key);
if (ttlResult != null) {
// Touch the cache entry to refresh its recency
cache.get(key);
return ttlResult + config.redis?.expiryTime - Math.floor(Date.now() / 1000);
} else {
const result = await ttl(createKeyName(key));
if (ttlCache) ttlCache.set(key, Math.floor(Date.now() / 1000) - (config.redis?.expiryTime - result));
return result;
}
};
const get = client.get.bind(client);
const getRead = readClient?.get?.bind(readClient);
exportClient.get = (key) => new Promise((resolve, reject) => {
if (config.redis.maxConnections && activeRequests > config.redis.maxConnections) {
reject("Too many active requests in general");
reject(new TooManyActiveConnectionsError(`Too many active requests in general: ${activeRequests} over ${config.redis.maxConnections}`));
return;
}
if (config.redis.maxReadResponseTime && activeRequests > maxStoredTimes
&& readResponseTime[readResponseTime.length - 1] > config.redis.maxReadResponseTime) {
reject(new TooManyActiveConnectionsError(`Redis response time too high in general: ${readResponseTime[readResponseTime.length - 1]}ms with ${activeRequests} connections`));
return;
}
// For cache hit-rate tracking
if (!shouldClientCacheKey(key)) memoryCacheUncachedMisses++;
const start = Date.now();
activeRequests++;
const timeout = config.redis.getTimeout ? setTimeout(() => reject(), config.redis.getTimeout) : null;
const shouldUseTimeout = config.redis.getTimeout && db.shouldUseRedisTimeout();
const timeout = shouldUseTimeout ? setTimeout(() => reject(), config.redis.getTimeout) : null;
const chosenGet = pickChoice(get, getRead);
chosenGet(key).then((reply) => {
if (timeout !== null) clearTimeout(timeout);
@@ -85,7 +265,7 @@ if (config.redis?.enabled) {
lastResponseTimeLimit = Date.now();
}
}).catch((err) => {
if (chosenGet === get) {
if (chosenGet === get || chosenGet === cacheClient?.get) {
lastClientFail = Date.now();
} else {
lastReadFail = Date.now();
@@ -135,13 +315,16 @@ if (config.redis?.enabled) {
.then((reply) => resolve(reply))
.catch((err) => reject(err))
);
exportClient.zRemRangeByScore = client.zRemRangeByScore.bind(client);
exportClient.zAdd = client.zAdd.bind(client);
exportClient.zCard = client.zCard.bind(client);
/* istanbul ignore next */
client.on("error", function(error) {
lastClientFail = Date.now();
Logger.error(`Redis Error: ${error}`);
});
/* istanbul ignore next */
client.on("reconnect", () => {
client.on("reconnecting", () => {
Logger.info("Redis: trying to reconnect");
});
/* istanbul ignore next */
@@ -150,9 +333,57 @@ if (config.redis?.enabled) {
Logger.error(`Redis Read-Only Error: ${error}`);
});
/* istanbul ignore next */
readClient?.on("reconnect", () => {
readClient?.on("reconnecting", () => {
Logger.info("Redis Read-Only: trying to reconnect");
});
// The cache client needs to recreate itself when the connection fails, as the queue connection doesn't restart properly
const createCacheClient = () => {
cacheClient = createClient(config.redis) as RedisClientType;
/* istanbul ignore next */
cacheClient.on("error", function (error) {
lastClientFail = Date.now();
Logger.error(`Redis Cache Client Error: ${error}`);
});
/* istanbul ignore next */
cacheClient.on("reconnecting", () => {
Logger.info("Redis cache client: trying to reconnect");
cache?.clear();
void cacheClient.disconnect();
setTimeout(() => createCacheClient(), 1000);
});
// eslint-disable-next-line @typescript-eslint/no-misused-promises
cacheClient.on("ready", async () => {
cache?.clear();
await setupCacheClientListener(cacheClient as RedisClientType, cache);
void Promise.all([
setupCacheClientTracking(client as RedisClientType, cacheClient as RedisClientType),
setupCacheClientTracking(readClient as RedisClientType, cacheClient as RedisClientType)
]).then(() => cache?.clear());
});
void cacheClient.connect();
};
if (config.redis.clientCacheSize) {
createCacheClient();
client.on("ready", () => {
if (cacheClient.isReady) {
void setupCacheClientTracking(client as RedisClientType, cacheClient as RedisClientType);
}
});
readClient?.on("ready", () => {
if (cacheClient.isReady) {
void setupCacheClientTracking(readClient as RedisClientType, cacheClient as RedisClientType);
}
});
}
}
function pickChoice<T>(client: T, readClient: T): T {
@@ -173,7 +404,51 @@ export function getRedisStats(): RedisStats {
writeRequests,
avgReadTime: readResponseTime.length > 0 ? readResponseTime.reduce((a, b) => a + b, 0) / readResponseTime.length : 0,
avgWriteTime: writeResponseTime.length > 0 ? writeResponseTime.reduce((a, b) => a + b, 0) / writeResponseTime.length : 0,
memoryCacheHits: memoryCacheHits / (memoryCacheHits + memoryCacheMisses),
memoryCacheTotalHits: memoryCacheHits / (memoryCacheHits + memoryCacheMisses + memoryCacheUncachedMisses),
memoryCacheLength: cache?.size ?? 0,
memoryCacheSize: cache?.calculatedSize ?? 0,
lastInvalidation,
lastInvalidationMessage
};
}
async function setupCacheClientListener(cacheClient: RedisClientType,
cache: LRUCache<RedisCommandArgument, string>) {
if (!config.redis.dragonflyMode) {
cacheConnectionClientId = String(await cacheClient.clientId());
}
cacheClient.subscribe("__redis__:invalidate", (message) => {
if (message) {
lastInvalidationMessage = Date.now();
const keys = Buffer.isBuffer(message) ? [message.toString()] : message;
for (let key of keys) {
if (config.redis.useCompression) key = key.replace(/\.c$/, "");
if (cache.delete(key)) {
lastInvalidation = Date.now();
}
ttlCache.delete(key);
// Mark the key so the currently running request doesn't cache its now-stale result
if (key && activeRequestPromises[key] !== undefined) {
resetKeys.add(key);
}
}
}
}).catch(Logger.error);
}
async function setupCacheClientTracking(client: RedisClientType,
cacheClient: RedisClientType) {
if (!client || !cacheClient.isReady || config.redis.dragonflyMode) return;
await client.sendCommand(["CLIENT", "TRACKING", "ON", "REDIRECT", cacheConnectionClientId, "BCAST"]);
}
export default exportClient;


@@ -3,6 +3,7 @@ import { Feature, HashedUserID, UserID } from "../types/user.model";
import { HashedValue } from "../types/hash.model";
import { Logger } from "./logger";
import { BrandingUUID } from "../types/branding.model";
import { RedisCommandArgument } from "@redis/client/dist/lib/commands";
export const skipSegmentsKey = (videoID: VideoID, service: Service): string =>
`segments.v4.${service}.videoID.${videoID}`;
@@ -17,31 +18,38 @@ export function skipSegmentsHashKey(hashedVideoIDPrefix: VideoIDHash, service: S
return `segments.v4.${service}.${hashedVideoIDPrefix}`;
}
export function skipSegmentsLargerHashKey(hashedVideoIDPrefix: VideoIDHash, service: Service): string {
hashedVideoIDPrefix = hashedVideoIDPrefix.substring(0, 5) as VideoIDHash;
if (hashedVideoIDPrefix.length !== 5) Logger.warn(`Redis skip segment hash-prefix key is not length 5! ${hashedVideoIDPrefix}`);
return `segments.v4.${service}.${hashedVideoIDPrefix}`;
}
export const brandingKey = (videoID: VideoID, service: Service): string =>
`branding.v2.${service}.videoID.${videoID}`;
`branding.v3.${service}.videoID.${videoID}`;
export function brandingHashKey(hashedVideoIDPrefix: VideoIDHash, service: Service): string {
hashedVideoIDPrefix = hashedVideoIDPrefix.substring(0, 4) as VideoIDHash;
if (hashedVideoIDPrefix.length !== 4) Logger.warn(`Redis skip segment hash-prefix key is not length 4! ${hashedVideoIDPrefix}`);
return `branding.v2.${service}.${hashedVideoIDPrefix}`;
return `branding.v3.${service}.${hashedVideoIDPrefix}`;
}
export const brandingIPKey = (uuid: BrandingUUID): string =>
`branding.shadow.${uuid}`;
`branding.v1.shadow.${uuid}`;
export const shadowHiddenIPKey = (videoID: VideoID, timeSubmitted: number, service: Service): string =>
`segments.${service}.videoID.${videoID}.shadow.${timeSubmitted}`;
`segments.v1.${service}.videoID.${videoID}.shadow.${timeSubmitted}`;
export const reputationKey = (userID: UserID): string =>
`reputation.user.${userID}`;
`reputation.v1.user.${userID}`;
export function ratingHashKey(hashPrefix: VideoIDHash, service: Service): string {
hashPrefix = hashPrefix.substring(0, 4) as VideoIDHash;
if (hashPrefix.length !== 4) Logger.warn(`Redis rating hash-prefix key is not length 4! ${hashPrefix}`);
return `rating.${service}.${hashPrefix}`;
return `rating.v1.${service}.${hashPrefix}`;
}
export function shaHashKey(singleIter: HashedValue): string {
@@ -54,15 +62,26 @@ export const tempVIPKey = (userID: HashedUserID): string =>
`vip.temp.${userID}`;
export const videoLabelsKey = (videoID: VideoID, service: Service): string =>
`labels.v1.${service}.videoID.${videoID}`;
`labels.v2.${service}.videoID.${videoID}`;
export function videoLabelsHashKey(hashedVideoIDPrefix: VideoIDHash, service: Service): string {
hashedVideoIDPrefix = hashedVideoIDPrefix.substring(0, 3) as VideoIDHash;
if (hashedVideoIDPrefix.length !== 3) Logger.warn(`Redis video labels hash-prefix key is not length 3! ${hashedVideoIDPrefix}`);
return `labels.v1.${service}.${hashedVideoIDPrefix}`;
return `labels.v2.3.${service}.${hashedVideoIDPrefix}`;
}
export function videoLabelsLargerHashKey(hashedVideoIDPrefix: VideoIDHash, service: Service): string {
hashedVideoIDPrefix = hashedVideoIDPrefix.substring(0, 4) as VideoIDHash;
if (hashedVideoIDPrefix.length !== 4) Logger.warn(`Redis video labels hash-prefix key is not length 4! ${hashedVideoIDPrefix}`);
return `labels.v2.4.${service}.${hashedVideoIDPrefix}`;
}
export function userFeatureKey (userID: HashedUserID, feature: Feature): string {
return `user.${userID}.feature.${feature}`;
return `user.v1.${userID}.feature.${feature}`;
}
export function shouldClientCacheKey(key: RedisCommandArgument): boolean {
return (key as string).match(/^(?:segments\.|reputation\.|branding\.|labels\.)/) !== null;
}
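The prefix filter above can be exercised on its own; the regex client-caches only segment, reputation, branding, and label keys. A standalone copy, with the key type narrowed to `string`:

```typescript
// Same regex as shouldClientCacheKey above.
const shouldClientCacheKey = (key: string): boolean =>
    key.match(/^(?:segments\.|reputation\.|branding\.|labels\.)/) !== null;

console.log(shouldClientCacheKey("segments.v4.YouTube.videoID.abc")); // true
console.log(shouldClientCacheKey("vip.temp.someHashedUserID"));       // false
```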


@@ -37,6 +37,12 @@ export async function acquireLock(key: string, timeout = defaultTimeout): Promis
}
} catch (e) {
Logger.error(e as string);
// Fallback to allowing
return {
status: true,
unlock: () => void 0
};
}
return {


@@ -14,7 +14,14 @@ interface ReputationDBResult {
mostUpvotedInLockedVideoSum: number
}
const activeReputationRequests: Record<UserID, Promise<ReputationDBResult>> = {};
export async function getReputation(userID: UserID): Promise<number> {
// Hardcoded for NN-block because of too many submissions
if (userID === "d6e8b39e6a79917166486066667caab54a2dec5e8384e46f92a82ef56e775005") {
return Promise.resolve(27);
}
const weekAgo = Date.now() - 1000 * 60 * 60 * 24 * 7; // 1 week ago
const pastDate = Date.now() - 1000 * 60 * 60 * 24 * 45; // 45 days ago
// 1596240000000 is August 1st 2020, a little after auto upvote was disabled
@@ -42,9 +49,18 @@ export async function getReputation(userID: UserID): Promise<number> {
THEN 1 ELSE 0 END) AS "mostUpvotedInLockedVideoSum"
FROM "sponsorTimes" as "a" WHERE "userID" = ? AND "actionType" != 'full'`, [userID, weekAgo, pastDate, userID], { useReplica: true }) as Promise<ReputationDBResult>;
const result = await QueryCacher.get(fetchFromDB, reputationKey(userID));
const promise = activeReputationRequests[userID] ?? QueryCacher.get(fetchFromDB, reputationKey(userID));
activeReputationRequests[userID] = promise;
return calculateReputationFromMetrics(result);
try {
const result = await promise;
delete activeReputationRequests[userID];
return calculateReputationFromMetrics(result);
} catch (e) {
throw new Error(`${(e as Error)?.message}\n\n${userID}`);
}
}
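The `activeReputationRequests` map above implements in-flight request deduplication: concurrent callers for the same user share one promise instead of issuing the query twice. A minimal sketch of the pattern, with a fake asynchronous DB call standing in for the reputation query:

```typescript
const active: Record<string, Promise<number>> = {};
let dbCalls = 0;

// Stand-in for the real DB query; resolves asynchronously.
const queryDb = (): Promise<number> =>
    new Promise((resolve) => { dbCalls++; setTimeout(() => resolve(42), 10); });

function getValue(userID: string): Promise<number> {
    // Reuse an in-flight promise for this user if one exists.
    const promise = active[userID] ?? queryDb();
    active[userID] = promise;
    return promise.finally(() => { delete active[userID]; });
}

// Two concurrent callers share a single DB query.
void Promise.all([getValue("u1"), getValue("u1")])
    .then(([a, b]) => console.assert(a === 42 && b === 42 && dbCalls === 1));
```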
// convert a number from one range to another.


@@ -0,0 +1,267 @@
import { config } from "../config";
import {
CasualCategory,
ThumbnailSubmission,
TitleSubmission,
} from "../types/branding.model";
import { ValidatorPattern, RequestValidatorRule } from "../types/config.model";
import { IncomingSegment } from "../types/segments.model";
export interface RequestValidatorInput {
userAgent?: string;
userAgentHeader?: string;
videoDuration?: string | number;
videoID?: string;
userID?: string;
service?: string;
segments?: IncomingSegment[];
dearrow?: {
title?: TitleSubmission;
thumbnail?: ThumbnailSubmission;
downvote: boolean;
};
casualCategories?: CasualCategory[];
newUsername?: string;
endpoint?: string;
}
export type CompiledValidityCheck = (input: RequestValidatorInput) => string | null;
type CompiledPatternCheck = (input: RequestValidatorInput) => boolean;
type CompiledSegmentCheck = (input: IncomingSegment) => boolean;
type InputExtractor = (
input: RequestValidatorInput,
) => string | number | undefined | null;
type SegmentExtractor = (input: IncomingSegment) => string | undefined | null;
type BooleanRules = "titleOriginal" | "thumbnailOriginal" | "dearrowDownvote";
type RuleEntry =
| [Exclude<keyof RequestValidatorRule, BooleanRules>, ValidatorPattern]
| [BooleanRules, boolean];
let compiledRules: CompiledValidityCheck;
function patternToRegex(pattern: ValidatorPattern): RegExp {
return typeof pattern === "string"
? new RegExp(pattern, "i")
: new RegExp(...pattern);
}
function compilePattern(
pattern: ValidatorPattern,
extractor: InputExtractor,
): CompiledPatternCheck {
const regex = patternToRegex(pattern);
return (input: RequestValidatorInput) => {
const field = extractor(input);
if (field == undefined) return false;
return regex.test(String(field));
};
}
function compileSegmentPattern(
pattern: ValidatorPattern,
extractor: SegmentExtractor,
): CompiledSegmentCheck {
const regex = patternToRegex(pattern);
return (input: IncomingSegment) => {
const field = extractor(input);
if (field == undefined) return false;
return regex.test(field);
};
}
export function compileRules(
ruleDefinitions: RequestValidatorRule[],
): CompiledValidityCheck {
if (ruleDefinitions.length === 0) return () => null;
const rules: CompiledValidityCheck[] = [];
let untitledRuleCounter = 0;
for (const ruleDefinition of ruleDefinitions) {
const ruleComponents: CompiledPatternCheck[] = [];
const segmentRuleComponents: CompiledSegmentCheck[] = [];
for (const [ruleKey, rulePattern] of Object.entries(
ruleDefinition,
) as RuleEntry[]) {
switch (ruleKey) {
case "userAgent":
ruleComponents.push(
compilePattern(rulePattern, (input) => input.userAgent),
);
break;
case "userAgentHeader":
ruleComponents.push(
compilePattern(
rulePattern,
(input) => input.userAgentHeader,
),
);
break;
case "videoDuration":
ruleComponents.push(
compilePattern(
rulePattern,
(input) => input.videoDuration,
),
);
break;
case "videoID":
ruleComponents.push(
compilePattern(rulePattern, (input) => input.videoID),
);
break;
case "userID":
ruleComponents.push(
compilePattern(rulePattern, (input) => input.userID),
);
break;
case "service":
ruleComponents.push(
compilePattern(rulePattern, (input) => input.service),
);
break;
case "startTime":
segmentRuleComponents.push(
compileSegmentPattern(
rulePattern,
(input) => input.segment[0],
),
);
break;
case "endTime":
segmentRuleComponents.push(
compileSegmentPattern(
rulePattern,
(input) => input.segment[1],
),
);
break;
case "category":
segmentRuleComponents.push(
compileSegmentPattern(
rulePattern,
(input) => input.category,
),
);
break;
case "actionType":
segmentRuleComponents.push(
compileSegmentPattern(
rulePattern,
(input) => input.actionType,
),
);
break;
case "description":
segmentRuleComponents.push(
compileSegmentPattern(
rulePattern,
(input) => input.description,
),
);
break;
case "title":
ruleComponents.push(
compilePattern(
rulePattern,
(input) => input.dearrow?.title?.title,
),
);
break;
case "titleOriginal":
ruleComponents.push(
(input) =>
input.dearrow?.title?.original === rulePattern,
);
break;
case "thumbnailTimestamp":
ruleComponents.push(
compilePattern(
rulePattern,
(input) => input.dearrow?.thumbnail?.timestamp,
),
);
break;
case "thumbnailOriginal":
ruleComponents.push(
(input) =>
input.dearrow?.thumbnail?.original === rulePattern,
);
break;
case "dearrowDownvote":
ruleComponents.push(
(input) => input.dearrow?.downvote === rulePattern,
);
break;
case "newUsername":
ruleComponents.push(
compilePattern(
rulePattern,
(input) => input.newUsername,
),
);
break;
case "endpoint":
ruleComponents.push(
compilePattern(rulePattern, (input) => input.endpoint),
);
break;
case "casualCategory": {
const regex = patternToRegex(rulePattern);
ruleComponents.push((input) => {
if (input.casualCategories === undefined) {
return false;
}
for (const category of input.casualCategories) {
if (regex.test(category)) return true;
}
return false;
});
break;
}
case "ruleName":
// not a rule component
break;
default: {
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const _exhaustive: never = ruleKey;
}
}
}
if (segmentRuleComponents.length > 0) {
ruleComponents.push((input) => {
if (input.segments === undefined) return false;
for (const segment of input.segments) {
let result = true;
for (const rule of segmentRuleComponents) {
if (!rule(segment)) {
result = false;
break;
}
}
if (result) return true;
}
return false;
});
}
const ruleName = ruleDefinition.ruleName ?? `Untitled rule ${++untitledRuleCounter}`;
rules.push((input) => {
for (const rule of ruleComponents) {
if (!rule(input)) return null;
}
return ruleName;
});
}
return (input) => {
for (const rule of rules) {
const result = rule(input);
if (result !== null) return result;
}
return null;
};
}
export function isRequestInvalid(input: RequestValidatorInput): string | null {
compiledRules ??= compileRules(config.requestValidatorRules);
return compiledRules(input);
}
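Per the types used above, a `ValidatorPattern` is either a bare string (matched case-insensitively) or a `[source, flags]` tuple spread into the `RegExp` constructor. A self-contained sketch of that compilation step:

```typescript
// A string pattern gets the "i" flag by default; a tuple supplies
// explicit flags via the RegExp constructor.
type ValidatorPattern = string | [string, string];

const patternToRegex = (pattern: ValidatorPattern): RegExp =>
    typeof pattern === "string" ? new RegExp(pattern, "i") : new RegExp(...pattern);

console.log(patternToRegex("badbot").test("BadBot/1.0"));       // true (case-insensitive)
console.log(patternToRegex(["badbot", ""]).test("BadBot/1.0")); // false (explicit flags)
```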


@@ -0,0 +1,7 @@
import { db } from "../databases/databases";
export async function getServerConfig(key: string): Promise<string | null> {
const row = await db.prepare("get", `SELECT "value" as v FROM "config" WHERE "key" = ?`, [key]);
return row?.v ?? null;
}


@@ -1,11 +1,16 @@
export function parseUserAgent(userAgent: string): string {
const ua = userAgent.toLowerCase();
const ua = userAgent;
if (ua.match(/(com.google.android.youtube\/)|(com.vanced.android.youtube\/)|(^YouTube\/)|(^Dalvik\/)/)) {
return `Vanced/${ua.match(/.android.youtube\/([^\s]+)/)[1]}`;
const revanced = ua.match(/RVX\S+|ReVanced\S+/i);
if (revanced) {
return revanced[0];
}
if (ua.match(/(mpv_sponsorblock\/)|(^python-requests)|(^GuzzleHttp\/)|(^PostmanRuntime\/)/)) {
if (ua.match(/(com.google.android.youtube\/)|(com.vanced.android.youtube\/)|(^YouTube\/)|(^Dalvik\/)/i)) {
return `Vanced/${ua.match(/.android.youtube\/([^\s]+)/i)?.[1]}`;
}
if (ua.match(/(mpv_sponsorblock)|(^python-requests)|(^GuzzleHttp\/)|(^PostmanRuntime\/)/i)) {
return ua;
}
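The diff above makes the user-agent matching case-insensitive and adds ReVanced/RVX detection. A minimal sketch of that logic, with the patterns copied from the diff and a plain pass-through fallback assumed (the real function handles more cases):

```typescript
const parseUserAgentSketch = (ua: string): string => {
    // ReVanced/RVX builds are reported verbatim.
    const revanced = ua.match(/RVX\S+|ReVanced\S+/i);
    if (revanced) return revanced[0];

    // Official/Vanced Android clients are normalized to "Vanced/<version>".
    if (ua.match(/(com.google.android.youtube\/)|(com.vanced.android.youtube\/)|(^YouTube\/)|(^Dalvik\/)/i)) {
        return `Vanced/${ua.match(/.android.youtube\/([^\s]+)/i)?.[1]}`;
    }

    return ua; // assumed fallback
};

console.log(parseUserAgentSketch("ReVanced/19.16.39"));
// "ReVanced/19.16.39"
console.log(parseUserAgentSketch("com.google.android.youtube/17.31.35 (Linux; U)"));
// "Vanced/17.31.35"
```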


@@ -50,7 +50,6 @@
]
}
],
"hoursAfterWarningExpires": 24,
"rateLimit": {
"vote": {
"windowMs": 900000,


@@ -31,7 +31,7 @@ describe("304 etag validation", () => {
const endpoint = "/etag";
for (const hashType of ["skipSegments", "skipSegmentsHash", "videoLabel", "videoLabelHash"]) {
it(`${hashType} etag should return 304`, () => {
const etagKey = `${hashType};${genRandom};YouTube;${Date.now()}`;
const etagKey = `"${hashType};${genRandom};YouTube;${Date.now()}"`;
return redis.setEx(etagKey, 8400, "test").then(() =>
client.get(endpoint, { headers: { "If-None-Match": etagKey } }).then(res => {
assert.strictEqual(res.status, 304);
@@ -43,14 +43,14 @@ describe("304 etag validation", () => {
}
it(`other etag type should not return 304`, () => {
const etagKey = `invalidHashType;${genRandom};YouTube;${Date.now()}`;
const etagKey = `"invalidHashType;${genRandom};YouTube;${Date.now()}"`;
return client.get(endpoint, { headers: { "If-None-Match": etagKey } }).then(res => {
assert.strictEqual(res.status, 404);
});
});
it(`outdated etag type should not return 304`, () => {
const etagKey = `skipSegments;${genRandom};YouTube;5000`;
const etagKey = `"skipSegments;${genRandom};YouTube;5000"`;
return client.get(endpoint, { headers: { "If-None-Match": etagKey } }).then(res => {
assert.strictEqual(res.status, 404);
});


@@ -3,7 +3,7 @@ import assert from "assert";
import { getHash } from "../../src/utils/getHash";
import { db } from "../../src/databases/databases";
import { Service } from "../../src/types/segments.model";
import { BrandingUUID, ThumbnailResult, TitleResult } from "../../src/types/branding.model";
import { BrandingUUID, CasualVote, ThumbnailResult, TitleResult } from "../../src/types/branding.model";
import { partialDeepEquals } from "../utils/partialDeepEquals";
describe("getBranding", () => {
@@ -13,6 +13,10 @@ describe("getBranding", () => {
const videoIDEmpty = "videoID4";
const videoIDRandomTime = "videoID5";
const videoIDUnverified = "videoID6";
const videoIDvidDuration = "videoID7";
const videoIDCasual = "videoIDCasual";
const videoIDCasualDownvoted = "videoIDCasualDownvoted";
const videoIDCasualTitle = "videoIDCasualTitle";
const videoID1Hash = getHash(videoID1, 1).slice(0, 4);
const videoID2LockedHash = getHash(videoID2Locked, 1).slice(0, 4);
@@ -20,6 +24,10 @@ describe("getBranding", () => {
const videoIDEmptyHash = "aaaa";
const videoIDRandomTimeHash = getHash(videoIDRandomTime, 1).slice(0, 4);
const videoIDUnverifiedHash = getHash(videoIDUnverified, 1).slice(0, 4);
const videoIDvidDurationHash = getHash(videoIDvidDuration, 1).slice(0, 4);
const videoIDCasualHash = getHash(videoIDCasual, 1).slice(0, 4);
const videoIDCasualDownvotedHash = getHash(videoIDCasualDownvoted, 1).slice(0, 4);
const videoIDCasualTitleHash = getHash(videoIDCasualTitle, 1).slice(0, 4);
const endpoint = "/api/branding";
const getBranding = (params: Record<string, any>) => client({
@@ -36,29 +44,37 @@ describe("getBranding", () => {
before(async () => {
const titleQuery = `INSERT INTO "titles" ("videoID", "title", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID") VALUES (?, ?, ?, ?, ?, ?, ?, ?)`;
const titleVotesQuery = `INSERT INTO "titleVotes" ("UUID", "votes", "locked", "shadowHidden", "verification") VALUES (?, ?, ?, ?, ?)`;
const titleVotesQuery = `INSERT INTO "titleVotes" ("UUID", "votes", "locked", "shadowHidden", "verification", "downvotes", "removed") VALUES (?, ?, ?, ?, ?, ?, ?)`;
const thumbnailQuery = `INSERT INTO "thumbnails" ("videoID", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID") VALUES (?, ?, ?, ?, ?, ?, ?)`;
const thumbnailTimestampsQuery = `INSERT INTO "thumbnailTimestamps" ("UUID", "timestamp") VALUES (?, ?)`;
const thumbnailVotesQuery = `INSERT INTO "thumbnailVotes" ("UUID", "votes", "locked", "shadowHidden") VALUES (?, ?, ?, ?)`;
const thumbnailVotesQuery = `INSERT INTO "thumbnailVotes" ("UUID", "votes", "locked", "shadowHidden", "downvotes", "removed") VALUES (?, ?, ?, ?, ?, ?)`;
const segmentQuery = 'INSERT INTO "sponsorTimes" ("videoID", "startTime", "endTime", "votes", "locked", "UUID", "userID", "timeSubmitted", "views", "category", "actionType", "service", "videoDuration", "hidden", "shadowHidden", "description", "hashedVideoID") VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)';
const insertCasualVotesQuery = `INSERT INTO "casualVotes" ("UUID", "videoID", "service", "hashedVideoID", "category", "upvotes", "timeSubmitted") VALUES (?, ?, ?, ?, ?, ?, ?)`;
const insertCasualVotesTitleQuery = `INSERT INTO "casualVoteTitles" ("videoID", "service", "hashedVideoID", "id", "title") VALUES (?, ?, ?, ?, ?)`;
await Promise.all([
db.prepare("run", titleQuery, [videoID1, "title1", 0, "userID1", Service.YouTube, videoID1Hash, 1, "UUID1"]),
db.prepare("run", titleQuery, [videoID1, "title2", 0, "userID2", Service.YouTube, videoID1Hash, 1, "UUID2"]),
db.prepare("run", titleQuery, [videoID1, "title3", 1, "userID3", Service.YouTube, videoID1Hash, 1, "UUID3"]),
db.prepare("run", titleQuery, [videoID1, "title4removed", 0, "userID4", Service.YouTube, videoID1Hash, 1, "UUID4"]),
db.prepare("run", thumbnailQuery, [videoID1, 0, "userID1", Service.YouTube, videoID1Hash, 1, "UUID1T"]),
db.prepare("run", thumbnailQuery, [videoID1, 1, "userID2", Service.YouTube, videoID1Hash, 1, "UUID2T"]),
db.prepare("run", thumbnailQuery, [videoID1, 0, "userID3", Service.YouTube, videoID1Hash, 1, "UUID3T"]),
db.prepare("run", thumbnailQuery, [videoID1, 0, "userID4", Service.YouTube, videoID1Hash, 1, "UUID4T"]),
]);
await Promise.all([
db.prepare("run", titleVotesQuery, ["UUID1", 3, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID2", 2, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID3", 1, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID1", 3, 0, 0, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID2", 3, 0, 0, 0, 1, 0]),
db.prepare("run", titleVotesQuery, ["UUID3", 0, 0, 0, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID4", 5, 0, 0, 0, 0, 1]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID1T", 1]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID3T", 3]),
db.prepare("run", thumbnailVotesQuery, ["UUID1T", 3, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID2T", 2, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID3T", 1, 0, 0])
db.prepare("run", thumbnailTimestampsQuery, ["UUID4T", 18]),
db.prepare("run", thumbnailVotesQuery, ["UUID1T", 3, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID2T", 3, 0, 0, 1, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID3T", 1, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID4T", 5, 0, 0, 0, 1])
]);
await Promise.all([
@@ -71,15 +87,15 @@ describe("getBranding", () => {
]);
await Promise.all([
db.prepare("run", titleVotesQuery, ["UUID11", 3, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID21", 2, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID31", 1, 1, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID11", 3, 0, 0, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID21", 2, 0, 0, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID31", 1, 1, 0, 0, 0, 0]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID11T", 1]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID31T", 3]),
db.prepare("run", thumbnailVotesQuery, ["UUID11T", 3, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID21T", 2, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID31T", 1, 1, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID11T", 3, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID21T", 2, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID31T", 1, 1, 0, 0, 0]),
]);
await Promise.all([
@@ -92,19 +108,18 @@ describe("getBranding", () => {
]);
await Promise.all([
db.prepare("run", titleVotesQuery, ["UUID12", 3, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID22", 2, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID32", 1, 0, 1, 0]),
db.prepare("run", titleVotesQuery, ["UUID12", 3, 0, 0, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID22", 2, 0, 0, 0, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID32", 1, 0, 1, 0, 0, 0]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID12T", 1]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID32T", 3]),
db.prepare("run", thumbnailVotesQuery, ["UUID12T", 3, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID22T", 2, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID32T", 1, 0, 1])
db.prepare("run", thumbnailVotesQuery, ["UUID12T", 3, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID22T", 2, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID32T", 1, 0, 1, 0, 0])
]);
const query = 'INSERT INTO "sponsorTimes" ("videoID", "startTime", "endTime", "votes", "locked", "UUID", "userID", "timeSubmitted", "views", "category", "actionType", "service", "videoDuration", "hidden", "shadowHidden", "description", "hashedVideoID") VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)';
await db.prepare("run", query, [videoIDRandomTime, 1, 11, 1, 0, "uuidbranding1", "testman", 0, 50, "sponsor", "skip", "YouTube", 100, 0, 0, "", videoIDRandomTimeHash]);
await db.prepare("run", query, [videoIDRandomTime, 20, 33, 2, 0, "uuidbranding2", "testman", 0, 50, "intro", "skip", "YouTube", 100, 0, 0, "", videoIDRandomTimeHash]);
await db.prepare("run", segmentQuery, [videoIDRandomTime, 1, 11, 1, 0, "uuidbranding1", "testman", 0, 50, "sponsor", "skip", "YouTube", 100, 0, 0, "", videoIDRandomTimeHash]);
await db.prepare("run", segmentQuery, [videoIDRandomTime, 20, 33, 2, 0, "uuidbranding2", "testman", 0, 50, "intro", "skip", "YouTube", 100, 0, 0, "", videoIDRandomTimeHash]);
await Promise.all([
db.prepare("run", titleQuery, [videoIDUnverified, "title1", 0, "userID1", Service.YouTube, videoIDUnverifiedHash, 1, "UUID-uv-1"]),
@@ -116,19 +131,44 @@ describe("getBranding", () => {
]);
await Promise.all([
db.prepare("run", titleVotesQuery, ["UUID-uv-1", 3, 0, 0, -1]),
db.prepare("run", titleVotesQuery, ["UUID-uv-2", 2, 0, 0, -1]),
db.prepare("run", titleVotesQuery, ["UUID-uv-3", 0, 0, 0, -1]),
db.prepare("run", titleVotesQuery, ["UUID-uv-1", 3, 0, 0, -1, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID-uv-2", 2, 0, 0, -1, 0, 0]),
db.prepare("run", titleVotesQuery, ["UUID-uv-3", 0, 0, 0, -1, 0, 0]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID-uv-1T", 1]),
db.prepare("run", thumbnailTimestampsQuery, ["UUID-uv-3T", 3]),
db.prepare("run", thumbnailVotesQuery, ["UUID-uv-1T", 3, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID-uv-2T", 2, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID-uv-3T", 1, 0, 0])
db.prepare("run", thumbnailVotesQuery, ["UUID-uv-1T", 3, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID-uv-2T", 2, 0, 0, 0, 0]),
db.prepare("run", thumbnailVotesQuery, ["UUID-uv-3T", 1, 0, 0, 0, 0])
]);
// Video duration test segments
await Promise.all([
db.prepare("run", segmentQuery, [videoIDvidDuration, 0, 1, 0, 0, "uuidvd1", "testman", 10, 0, "sponsor", "skip", "YouTube", 0, 0, 0, "", videoIDvidDurationHash]), // visible, no vid duration
db.prepare("run", segmentQuery, [videoIDvidDuration, 0, 2, -2, 0, "uuidvd2", "testman", 11, 0, "sponsor", "skip", "YouTube", 10, 0, 0, "", videoIDvidDurationHash]), // downvoted
db.prepare("run", segmentQuery, [videoIDvidDuration, 0, 3, 0, 0, "uuidvd3", "testman", 12, 0, "sponsor", "skip", "YouTube", 10.1, 1, 0, "", videoIDvidDurationHash]), // hidden
db.prepare("run", segmentQuery, [videoIDvidDuration, 0, 4, 0, 0, "uuidvd4", "testman", 13, 0, "sponsor", "skip", "YouTube", 20.1, 0, 1, "", videoIDvidDurationHash]), // shadowhidden
db.prepare("run", segmentQuery, [videoIDvidDuration, 0, 5, 0, 0, "uuidvd5", "testman", 14, 0, "sponsor", "skip", "YouTube", 21.3, 0, 0, "", videoIDvidDurationHash]), // oldest visible w/ duration, should be picked
db.prepare("run", segmentQuery, [videoIDvidDuration, 0, 6, 0, 0, "uuidvd6", "testman", 15, 0, "sponsor", "skip", "YouTube", 21.37, 0, 0, "", videoIDvidDurationHash]), // not the oldest visible
db.prepare("run", segmentQuery, [videoIDvidDuration, 0, 7, -2, 0, "uuidvd7", "testman", 16, 0, "sponsor", "skip", "YouTube", 21.38, 0, 0, "", videoIDvidDurationHash]), // downvoted, not the oldest
]);
await Promise.all([
db.prepare("run", insertCasualVotesQuery, ["postBrandCasual1", videoIDCasual, Service.YouTube, videoIDCasualHash, "clever", 1, Date.now()]),
db.prepare("run", insertCasualVotesQuery, ["postBrandCasual2", videoIDCasualDownvoted, Service.YouTube, videoIDCasualDownvotedHash, "clever", 1, Date.now()]),
db.prepare("run", insertCasualVotesQuery, ["postBrandCasual2d", videoIDCasualDownvoted, Service.YouTube, videoIDCasualDownvotedHash, "downvote", 1, Date.now()]),
db.prepare("run", insertCasualVotesQuery, ["postBrandCasual3", videoIDCasualDownvoted, Service.YouTube, videoIDCasualDownvotedHash, "other", 4, Date.now()]),
db.prepare("run", insertCasualVotesQuery, ["postBrandCasual4", videoIDCasualTitle, Service.YouTube, videoIDCasualTitleHash, "clever", 8, Date.now()]),
db.prepare("run", insertCasualVotesQuery, ["postBrandCasual4d", videoIDCasualTitle, Service.YouTube, videoIDCasualTitleHash, "downvote", 4, Date.now()]),
db.prepare("run", insertCasualVotesQuery, ["postBrandCasual4o", videoIDCasualTitle, Service.YouTube, videoIDCasualTitleHash, "other", 3, Date.now()]),
]);
await Promise.all([
db.prepare("run", insertCasualVotesTitleQuery, [videoIDCasualTitle, Service.YouTube, videoIDCasualTitleHash, 0, "a cool title"]),
]);
});
it("should get top titles and thumbnails", async () => {
await checkVideo(videoID1, videoID1Hash, {
await checkVideo(videoID1, videoID1Hash, false, {
titles: [{
title: "title1",
original: false,
@@ -144,7 +184,7 @@ describe("getBranding", () => {
}, {
title: "title3",
original: true,
votes: 1,
votes: 0,
locked: false,
UUID: "UUID3" as BrandingUUID
}],
@@ -170,7 +210,7 @@ describe("getBranding", () => {
});
it("should get top titles and thumbnails prioritizing locks", async () => {
await checkVideo(videoID2Locked, videoID2LockedHash, {
await checkVideo(videoID2Locked, videoID2LockedHash, false, {
titles: [{
title: "title3",
original: true,
@@ -212,7 +252,7 @@ describe("getBranding", () => {
});
it("should get top titles and hide shadow hidden", async () => {
await checkVideo(videoID2ShadowHide, videoID2ShadowHideHash, {
await checkVideo(videoID2ShadowHide, videoID2ShadowHideHash, false, {
titles: [{
title: "title1",
original: false,
@@ -242,8 +282,8 @@ describe("getBranding", () => {
});
it("should get 404 when nothing", async () => {
const result1 = await getBranding({ videoID: videoIDEmpty });
const result2 = await getBrandingByHash(videoIDEmptyHash, {});
const result1 = await getBranding({ videoID: videoIDEmpty, fetchAll: true });
const result2 = await getBrandingByHash(videoIDEmptyHash, { fetchAll: true });
assert.strictEqual(result1.status, 404);
assert.strictEqual(result2.status, 404);
@@ -252,8 +292,8 @@ describe("getBranding", () => {
it("should get correct random time", async () => {
const videoDuration = 100;
const result1 = await getBranding({ videoID: videoIDRandomTime });
const result2 = await getBrandingByHash(videoIDRandomTimeHash, {});
const result1 = await getBranding({ videoID: videoIDRandomTime, fetchAll: true });
const result2 = await getBrandingByHash(videoIDRandomTimeHash, { fetchAll: true });
const randomTime = result1.data.randomTime;
assert.strictEqual(randomTime, result2.data[videoIDRandomTime].randomTime);
@@ -266,7 +306,7 @@ describe("getBranding", () => {
});
it("should get top titles and thumbnails that are unverified", async () => {
await checkVideo(videoIDUnverified, videoIDUnverifiedHash, {
await checkVideo(videoIDUnverified, videoIDUnverifiedHash, true, {
titles: [{
title: "title1",
original: false,
@@ -307,12 +347,53 @@ describe("getBranding", () => {
});
});
async function checkVideo(videoID: string, videoIDHash: string, expected: {
titles: TitleResult[],
thumbnails: ThumbnailResult[]
it("should get the correct video duration", async () => {
const correctDuration = 21.3;
const result1 = await getBranding({ videoID: videoIDvidDuration, fetchAll: true });
const result2 = await getBrandingByHash(videoIDvidDurationHash, { fetchAll: true });
assert.strictEqual(result1.data.videoDuration, correctDuration);
assert.strictEqual(result2.data[videoIDvidDuration].videoDuration, correctDuration);
});
it("should get casual votes", async () => {
await checkVideo(videoIDCasual, videoIDCasualHash, true, {
casualVotes: [{
id: "clever",
count: 1,
title: null
}]
});
});
it("should not get casual votes with downvotes", async () => {
await checkVideo(videoIDCasualDownvoted, videoIDCasualDownvotedHash, true, {
casualVotes: [{
id: "other",
count: 3,
title: null
}]
});
});
it("should get casual votes with title", async () => {
await checkVideo(videoIDCasualTitle, videoIDCasualTitleHash, true, {
casualVotes: [{
id: "clever",
count: 4,
title: "a cool title"
}]
});
});
async function checkVideo(videoID: string, videoIDHash: string, fetchAll: boolean, expected: {
titles?: TitleResult[],
thumbnails?: ThumbnailResult[],
casualVotes?: CasualVote[]
}) {
const result1 = await getBranding({ videoID });
const result2 = await getBrandingByHash(videoIDHash, {});
const result1 = await getBranding({ videoID, fetchAll });
const result2 = await getBrandingByHash(videoIDHash, { fetchAll });
assert.strictEqual(result1.status, 200);
assert.strictEqual(result2.status, 200);


@@ -16,17 +16,18 @@ describe("getChapterNames", function () {
"Weird name",
"A different one",
"Something else",
"Weirder name",
];
const nameSearch = (query: string, expected: string): Promise<void> => {
const nameSearch = (query: string, expected: string | null, expectedResults: number): Promise<void> => {
const expectedData = [{
description: expected
}];
return client.get(`${endpoint}?description=${query}&channelID=${chapterChannelID}`)
.then(res => {
assert.strictEqual(res.status, 200);
assert.strictEqual(res.data.length, chapterNames.length);
assert.ok(partialDeepEquals(res.data, expectedData));
assert.strictEqual(res.status, expectedResults == 0 ? 404 : 200);
assert.strictEqual(res.data.length, expectedResults);
if (expected != null) assert.ok(partialDeepEquals(res.data, expectedData));
});
};
@@ -35,11 +36,13 @@ describe("getChapterNames", function () {
await insertChapter(db, chapterNames[0], { videoID: chapterNamesVid1, startTime: 60, endTime: 80 });
await insertChapter(db, chapterNames[1], { videoID: chapterNamesVid1, startTime: 70, endTime: 75 });
await insertChapter(db, chapterNames[2], { videoID: chapterNamesVid1, startTime: 71, endTime: 76 });
await insertChapter(db, chapterNames[3], { videoID: chapterNamesVid1, startTime: 72, endTime: 77 });
await insertVideoInfo(db, chapterNamesVid1, chapterChannelID);
});
it("Search for 'weird'", () => nameSearch("weird", chapterNames[0]));
it("Search for 'different'", () => nameSearch("different", chapterNames[1]));
it("Search for 'something'", () => nameSearch("something", chapterNames[2]));
});
it("Search for 'weird' (2 results)", () => nameSearch("weird", chapterNames[0], 2));
it("Search for 'different' (1 result)", () => nameSearch("different", chapterNames[1], 1));
it("Search for 'something' (1 result)", () => nameSearch("something", chapterNames[2], 1));
it("Search for 'unrelated' (0 result)", () => nameSearch("unrelated", null, 0));
});


@@ -30,6 +30,7 @@ describe("getSkipSegments", () => {
await db.prepare("run", query, ["chapterVid", 71, 75, 2, 0, "chapterVid-3", "testman", 0, 50, "chapter", "chapter", "YouTube", 0, 0, 0, "Chapter 3"]);
await db.prepare("run", query, ["requiredSegmentHashVid", 10, 20, -2, 0, "1d04b98f48e8f8bcc15c6ae5ac050801cd6dcfd428fb5f9e65c4e16e7807340fa", "testman", 0, 50, "sponsor", "skip", "YouTube", 0, 0, 0, ""]);
await db.prepare("run", query, ["requiredSegmentHashVid", 20, 30, -2, 0, "1ebde8e8ae03096b6c866aa2c8cc7ee1d720ca1fca27bea3f39a6a1b876577e71", "testman", 0, 50, "sponsor", "skip", "YouTube", 0, 0, 0, ""]);
await db.prepare("run", query, ["getSkipSegmentID0", 1, 12, 1, 0, "uuid01-spot", "testman", 0, 50, "sponsor", "skip", "Spotify", 100, 0, 0, ""]);
return;
});
@@ -495,4 +496,22 @@ describe("getSkipSegments", () => {
})
.catch(err => done(err));
});
it("Should be able to get a time by category for spotify service", (done) => {
client.get(endpoint, { params: { videoID: "getSkipSegmentID0", category: "sponsor", service: "spotify" } })
.then(res => {
assert.strictEqual(res.status, 200);
const data = res.data;
assert.strictEqual(data.length, 1);
assert.strictEqual(data[0].segment[0], 1);
assert.strictEqual(data[0].segment[1], 12);
assert.strictEqual(data[0].category, "sponsor");
assert.strictEqual(data[0].UUID, "uuid01-spot");
assert.strictEqual(data[0].votes, 1);
assert.strictEqual(data[0].locked, 0);
assert.strictEqual(data[0].videoDuration, 100);
done();
})
.catch(err => done(err));
});
});


@@ -73,11 +73,11 @@ describe("getUserID", () => {
)
);
it("Should be able to get multiple fuzzy user info from middle", () => {
it("Should be able to get multiple fuzzy user info from middle", () =>
validateSearch("user",
[users["fuzzy_1"], users["fuzzy_2"], users["specific_1"]]
);
});
)
);
it("Should be able to get with fuzzy public ID", () => {
const userID = users["public_1"].pubID.substring(0,60);
@@ -117,4 +117,4 @@ describe("getUserID 400/ 404", () => {
it("Should not allow usernames less than 3 characters", () => validateStatus("aa", 400));
it("Should return 404 if escaped backslashes present", () => validateStatus("%redos\\\\_", 404));
it("Should return 404 if backslashes present", () => validateStatus(`\\%redos\\_`, 404));
});
});


@@ -81,19 +81,19 @@ describe("getUserInfo", () => {
// warnings & bans
// warn-0
insertWarning(db, users["warn-0"].pubID, { reason: "warning0-0", issueTime: 10 });
await insertWarning(db, users["warn-0"].pubID, { reason: "warning0-0", issueTime: 10 });
// warn-1
insertWarning(db, users["warn-1"].pubID, { reason: "warning1-0", issueTime: 20 });
insertWarning(db, users["warn-1"].pubID, { reason: "warning1-1", issueTime: 30 });
await insertWarning(db, users["warn-1"].pubID, { reason: "warning1-0", issueTime: 20 });
await insertWarning(db, users["warn-1"].pubID, { reason: "warning1-1", issueTime: 30 });
// warn -2
insertWarning(db, users["warn-2"].pubID, { reason: "warning2-0", issueTime: 40, enabled: false });
await insertWarning(db, users["warn-2"].pubID, { reason: "warning2-0", issueTime: 40, enabled: false });
// warn-3
insertWarning(db, users["warn-3"].pubID, { reason: "warning3-0", issueTime: 50 });
insertWarning(db, users["warn-3"].pubID, { reason: "warning3-1", issueTime: 60, enabled: false });
await insertWarning(db, users["warn-3"].pubID, { reason: "warning3-0", issueTime: 50 });
await insertWarning(db, users["warn-3"].pubID, { reason: "warning3-1", issueTime: 60, enabled: false });
// ban-
insertBan(db, users["ban-1"].pubID);
insertBan(db, users["ban-2"].pubID);
await insertBan(db, users["ban-1"].pubID);
await insertBan(db, users["ban-2"].pubID);
});
it("Should be able to get a 200", () => statusTest(200, { userID: users["n-1"].privID }));


@@ -1,6 +1,9 @@
import { getHash } from "../../src/utils/getHash";
import { client } from "../utils/httpClient";
import assert from "assert";
import { insertSegment } from "../utils/segmentQueryGen";
import { db } from "../../src/databases/databases";
import { HashedUserID } from "../../src/types/user.model";
// helpers
const getUsername = (userID: string) => client({
@@ -22,6 +25,10 @@ const userOnePublic = getHash(userOnePrivate);
const userOneUsername = "getUsername_username";
describe("getUsername test", function() {
before(async () => {
await insertSegment(db, { userID: userOnePublic as HashedUserID });
});
it("Should get back publicUserID if not set", (done) => {
getUsername(userOnePrivate)
.then(result => {
@@ -50,4 +57,4 @@ describe("getUsername test", function() {
})
.catch(err => done(err));
});
});
});


@@ -6,26 +6,27 @@ import { getHash } from "../../src/utils/getHash";
describe("getVideoLabelHash", () => {
const endpoint = "/api/videoLabels";
before(async () => {
const query = 'INSERT INTO "sponsorTimes" ("videoID", "hashedVideoID", "votes", "locked", "UUID", "userID", "timeSubmitted", "category", "actionType", "hidden", "shadowHidden", "startTime", "endTime", "views") VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, 0, 0, 0)';
await db.prepare("run", query, ["getLabelHashSponsor" , getHash("getLabelHashSponsor", 1) , 2, 0, "labelhash01", "labeluser", 0, "sponsor", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashEA" , getHash("getLabelHashEA", 1) , 2, 0, "labelhash02", "labeluser", 0, "exclusive_access", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashSelfpromo" , getHash("getLabelHashSelfpromo", 1) , 2, 0, "labelhash03", "labeluser", 0, "selfpromo", "full", 0, 0]);
const query = 'INSERT INTO "sponsorTimes" ("videoID", "hashedVideoID", "votes", "locked", "UUID", "userID", "timeSubmitted", "category", "actionType", "hidden", "shadowHidden", "startTime", "endTime", "views") VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, 0, 0)';
await db.prepare("run", query, ["getLabelHashSponsor" , getHash("getLabelHashSponsor", 1) , 2, 0, "labelhash01", "labeluser", 0, "sponsor", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashSponsor" , getHash("getLabelHashSponsor", 1) , 2, 0, "labelhash012", "labeluser", 0, "sponsor", "skip", 0, 0, 2]);
await db.prepare("run", query, ["getLabelHashEA" , getHash("getLabelHashEA", 1) , 2, 0, "labelhash02", "labeluser", 0, "exclusive_access", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashSelfpromo" , getHash("getLabelHashSelfpromo", 1) , 2, 0, "labelhash03", "labeluser", 0, "selfpromo", "full", 0, 0, 0]);
// priority override
await db.prepare("run", query, ["getLabelHashPriority" , getHash("getLabelHashPriority", 1) , 2, 0, "labelhash04", "labeluser", 0, "sponsor", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority" , getHash("getLabelHashPriority", 1) , 2, 0, "labelhash05", "labeluser", 0, "exclusive_access", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority" , getHash("getLabelHashPriority", 1) , 2, 0, "labelhash06", "labeluser", 0, "selfpromo", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority" , getHash("getLabelHashPriority", 1) , 2, 0, "labelhash04", "labeluser", 0, "sponsor", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority" , getHash("getLabelHashPriority", 1) , 2, 0, "labelhash05", "labeluser", 0, "exclusive_access", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority" , getHash("getLabelHashPriority", 1) , 2, 0, "labelhash06", "labeluser", 0, "selfpromo", "full", 0, 0, 0]);
// locked only
await db.prepare("run", query, ["getLabelHashLocked" , getHash("getLabelHashLocked", 1) , 2, 0, "labelhash07", "labeluser", 0, "sponsor", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashLocked" , getHash("getLabelHashLocked", 1) , 2, 0, "labelhash08", "labeluser", 0, "exclusive_access", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashLocked" , getHash("getLabelHashLocked", 1) , 2, 1, "labelhash09", "labeluser", 0, "selfpromo", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashLocked" , getHash("getLabelHashLocked", 1) , 2, 0, "labelhash07", "labeluser", 0, "sponsor", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashLocked" , getHash("getLabelHashLocked", 1) , 2, 0, "labelhash08", "labeluser", 0, "exclusive_access", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashLocked" , getHash("getLabelHashLocked", 1) , 2, 1, "labelhash09", "labeluser", 0, "selfpromo", "full", 0, 0, 0]);
// hidden segments
await db.prepare("run", query, ["getLabelHashDownvote" , getHash("getLabelHashDownvote", 1) , -2, 0, "labelhash10", "labeluser", 0, "selfpromo", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashHidden" , getHash("getLabelHashHidden", 1) , 2, 0, "labelhash11", "labeluser", 0, "selfpromo", "full", 1, 0]);
await db.prepare("run", query, ["getLabelHashShHidden" , getHash("getLabelHashShHidden", 1) , 2, 0, "labelhash12", "labeluser", 0, "selfpromo", "full", 0, 1]);
await db.prepare("run", query, ["getLabelHashDownvote" , getHash("getLabelHashDownvote", 1) , -2, 0, "labelhash10", "labeluser", 0, "selfpromo", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashHidden" , getHash("getLabelHashHidden", 1) , 2, 0, "labelhash11", "labeluser", 0, "selfpromo", "full", 1, 0, 0]);
await db.prepare("run", query, ["getLabelHashShHidden" , getHash("getLabelHashShHidden", 1) , 2, 0, "labelhash12", "labeluser", 0, "selfpromo", "full", 0, 1, 0]);
// priority override2
await db.prepare("run", query, ["getLabelHashPriority2" , getHash("getLabelHashPriority2", 1) , -2, 0, "labelhash13", "labeluser", 0, "sponsor", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority2" , getHash("getLabelHashPriority2", 1) , 2, 0, "labelhash14", "labeluser", 0, "exclusive_access", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority2" , getHash("getLabelHashPriority2", 1) , 2, 0, "labelhash15", "labeluser", 0, "selfpromo", "full", 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority2" , getHash("getLabelHashPriority2", 1) , -2, 0, "labelhash13", "labeluser", 0, "sponsor", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority2" , getHash("getLabelHashPriority2", 1) , 2, 0, "labelhash14", "labeluser", 0, "exclusive_access", "full", 0, 0, 0]);
await db.prepare("run", query, ["getLabelHashPriority2" , getHash("getLabelHashPriority2", 1) , 2, 0, "labelhash15", "labeluser", 0, "selfpromo", "full", 0, 0, 0]);
return;
});
@@ -33,25 +34,20 @@ describe("getVideoLabelHash", () => {
function validateLabel(data: any, videoID: string) {
assert.strictEqual(data[0].videoID, videoID);
assert.strictEqual(data[0].segments.length, 1);
assert.strictEqual(data[0].segments[0].segment[0], 0);
assert.strictEqual(data[0].segments[0].segment[1], 0);
assert.strictEqual(data[0].segments[0].actionType, "full");
assert.strictEqual(data[0].segments[0].userID, "labeluser");
}
const get = (videoID: string) => client.get(`${endpoint}/${getHash(videoID, 1).substring(0, 4)}`);
const get = (videoID: string, hasStartSegment = false) => client.get(`${endpoint}/${getHash(videoID, 1).substring(0, 4)}?hasStartSegment=${hasStartSegment}`);
it("Should be able to get sponsor only label", (done) => {
const videoID = "getLabelHashSponsor";
get(videoID)
get(videoID, true)
.then(res => {
assert.strictEqual(res.status, 200);
const data = res.data;
validateLabel(data, videoID);
const result = data[0].segments[0];
assert.strictEqual(result.category, "sponsor");
assert.strictEqual(result.UUID, "labelhash01");
assert.strictEqual(result.locked, 0);
assert.strictEqual(data[0].hasStartSegment, true);
done();
})
.catch(err => done(err));
@@ -59,15 +55,14 @@ describe("getVideoLabelHash", () => {
it("Should be able to get exclusive access only label", (done) => {
const videoID = "getLabelHashEA";
get(videoID)
get(videoID, true)
.then(res => {
assert.strictEqual(res.status, 200);
const data = res.data;
validateLabel(data, videoID);
const result = data[0].segments[0];
assert.strictEqual(result.category, "exclusive_access");
assert.strictEqual(result.UUID, "labelhash02");
assert.strictEqual(result.locked, 0);
assert.strictEqual(data[0].hasStartSegment, false);
done();
})
.catch(err => done(err));
@@ -82,8 +77,7 @@ describe("getVideoLabelHash", () => {
validateLabel(data, videoID);
const result = data[0].segments[0];
assert.strictEqual(result.category, "selfpromo");
assert.strictEqual(result.UUID, "labelhash03");
assert.strictEqual(result.locked, 0);
assert.strictEqual(data[0].hasStartSegment, undefined);
done();
})
.catch(err => done(err));
@@ -98,8 +92,6 @@ describe("getVideoLabelHash", () => {
validateLabel(data, videoID);
const result = data[0].segments[0];
assert.strictEqual(result.category, "sponsor");
assert.strictEqual(result.UUID, "labelhash04");
assert.strictEqual(result.locked, 0);
done();
})
.catch(err => done(err));
@@ -114,8 +106,6 @@ describe("getVideoLabelHash", () => {
validateLabel(data, videoID);
const result = data[0].segments[0];
assert.strictEqual(result.category, "selfpromo");
assert.strictEqual(result.UUID, "labelhash09");
assert.strictEqual(result.locked, 1);
done();
})
.catch(err => done(err));
@@ -130,8 +120,6 @@ describe("getVideoLabelHash", () => {
validateLabel(data, videoID);
const result = data[0].segments[0];
assert.strictEqual(result.category, "exclusive_access");
assert.strictEqual(result.UUID, "labelhash14");
assert.strictEqual(result.locked, 0);
done();
})
.catch(err => done(err));


@@ -31,10 +31,6 @@ describe("getVideoLabels", () => {
function validateLabel(result: any) {
assert.strictEqual(result.length, 1);
assert.strictEqual(result[0].segment[0], 0);
assert.strictEqual(result[0].segment[1], 0);
assert.strictEqual(result[0].actionType, "full");
assert.strictEqual(result[0].userID, "labeluser");
}
const get = (videoID: string) => client.get(endpoint, { params: { videoID } });
@@ -46,8 +42,6 @@ describe("getVideoLabels", () => {
const data = res.data;
validateLabel(data);
assert.strictEqual(data[0].category, "sponsor");
assert.strictEqual(data[0].UUID, "label01");
assert.strictEqual(data[0].locked, 0);
done();
})
.catch(err => done(err));
@@ -60,8 +54,6 @@ describe("getVideoLabels", () => {
const data = res.data;
validateLabel(data);
assert.strictEqual(data[0].category, "exclusive_access");
assert.strictEqual(data[0].UUID, "label02");
assert.strictEqual(data[0].locked, 0);
done();
})
.catch(err => done(err));
@@ -74,8 +66,6 @@ describe("getVideoLabels", () => {
const data = res.data;
validateLabel(data);
assert.strictEqual(data[0].category, "selfpromo");
assert.strictEqual(data[0].UUID, "label03");
assert.strictEqual(data[0].locked, 0);
done();
})
.catch(err => done(err));
@@ -88,8 +78,6 @@ describe("getVideoLabels", () => {
const data = res.data;
validateLabel(data);
assert.strictEqual(data[0].category, "sponsor");
assert.strictEqual(data[0].UUID, "label04");
assert.strictEqual(data[0].locked, 0);
done();
})
.catch(err => done(err));
@@ -102,8 +90,6 @@ describe("getVideoLabels", () => {
const data = res.data;
validateLabel(data);
assert.strictEqual(data[0].category, "selfpromo");
assert.strictEqual(data[0].UUID, "label09");
assert.strictEqual(data[0].locked, 1);
done();
})
.catch(err => done(err));
@@ -116,8 +102,6 @@ describe("getVideoLabels", () => {
const data = res.data;
validateLabel(data);
assert.strictEqual(data[0].category, "exclusive_access");
assert.strictEqual(data[0].UUID, "label14");
assert.strictEqual(data[0].locked, 0);
done();
})
.catch(err => done(err));


@@ -33,11 +33,11 @@ const checkUserViews = (user: User) =>
});
describe("getViewsForUser", function() {
before(() => {
before(async () => {
// add views for users
insertSegment(db, { userID: users["u-1"].pubID, views: users["u-1"].info.views1 });
insertSegment(db, { userID: users["u-1"].pubID, views: users["u-1"].info.views2 });
insertSegment(db, { userID: users["u-2"].pubID, views: users["u-2"].info.views });
await insertSegment(db, { userID: users["u-1"].pubID, views: users["u-1"].info.views1 });
await insertSegment(db, { userID: users["u-1"].pubID, views: users["u-1"].info.views2 });
await insertSegment(db, { userID: users["u-2"].pubID, views: users["u-2"].info.views });
});
it("Should get back views for user one", () =>
checkUserViews(users["u-1"])
@@ -53,4 +53,4 @@ describe("getViewsForUser", function() {
client({ url: endpoint })
.then(res => assert.strictEqual(res.status, 400))
);
});
});


@@ -21,7 +21,7 @@ const expectedInnerTube = { // partial type of innerTubeVideoDetails
};
const currentViews = 49816;
describe("innertube API test", function() {
xdescribe("innertube API test", function() {
it("should be able to get innerTube details", async () => {
const result = await innerTube.getPlayerData(videoID, true);
assert.ok(partialDeepEquals(result, expectedInnerTube));


@@ -15,7 +15,6 @@ describe("postBranding", () => {
const userID5 = `PostBrandingUser5${".".repeat(16)}`;
const userID6 = `PostBrandingUser6${".".repeat(16)}`;
const userID7 = `PostBrandingUser7${".".repeat(16)}`;
const userID8 = `PostBrandingUser8${".".repeat(16)}`;
const bannedUser = `BannedPostBrandingUser${".".repeat(16)}`;
@@ -50,22 +49,44 @@ describe("postBranding", () => {
const insertThumbnailQuery = 'INSERT INTO "thumbnails" ("videoID", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID") VALUES (?, ?, ?, ?, ?, ?, ?)';
await db.prepare("run", insertThumbnailQuery, ["postBrandLocked1", 0, getHash(userID3), Service.YouTube, getHash("postBrandLocked1"), Date.now(), "postBrandLocked1"]);
await db.prepare("run", insertThumbnailQuery, ["postBrandLocked2", 1, getHash(userID4), Service.YouTube, getHash("postBrandLocked2"), Date.now(), "postBrandLocked2"]);
const insertThumbnailVotesQuery = 'INSERT INTO "thumbnailVotes" ("UUID", "votes", "locked", "shadowHidden") VALUES (?, ?, ?, ?);';
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandLocked1", 0, 1, 0]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandLocked2", 0, 1, 0]);
// Approved original thumbnail submitter
await db.prepare("run", insertThumbnailQuery, ["postBrandOriginThumb", 0, getHash(userID4), Service.YouTube, getHash("postBrandOriginThumb"), Date.now(), "postBrandOriginThumb"]);
await db.prepare("run", insertThumbnailQuery, ["postBrandOriginThumb2", 0, getHash(userID4), Service.YouTube, getHash("postBrandOriginThumb2"), Date.now(), "postBrandOriginThumb2"]);
await db.prepare("run", insertThumbnailQuery, ["postBrandOriginThumb3", 0, getHash(userID4), Service.YouTube, getHash("postBrandOriginThumb3"), Date.now(), "postBrandOriginThumb3"]);
await db.prepare("run", insertThumbnailQuery, ["postBrandOriginThumb4", 0, getHash(userID4), Service.YouTube, getHash("postBrandOriginThumb4"), Date.now(), "postBrandOriginThumb4"]);
await db.prepare("run", insertThumbnailQuery, ["postBrandOriginThumb5", 0, getHash(userID4), Service.YouTube, getHash("postBrandOriginThumb5"), Date.now(), "postBrandOriginThumb5"]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandOriginThumb", 4, 0, 0]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandOriginThumb2", 1, 0, 0]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandOriginThumb3", 0, 0, 0]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandOriginThumb4", 0, 0, 0]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandOriginThumb5", 0, 0, 0]);
// Testing vip submission removal
await db.prepare("run", insertTitleQuery, ["postBrandRemoved1", "Some title", 0, getHash(userID1), Service.YouTube, getHash("postBrandRemoved1"), Date.now(), "postBrandRemoved1"]);
await db.prepare("run", insertTitleVotesQuery, ["postBrandRemoved1", 0, 1, 0, 0]);
await db.prepare("run", insertTitleQuery, ["postBrandRemoved1", "Some other title", 0, getHash(userID1), Service.YouTube, getHash("postBrandRemoved1"), Date.now(), "postBrandRemoved2"]);
await db.prepare("run", insertTitleVotesQuery, ["postBrandRemoved2", 0, 1, 0, 0]);
// Testing vip submission removal
const insertThumbnailTimestampQuery = 'INSERT INTO "thumbnailTimestamps" ("UUID", "timestamp") VALUES (?, ?)';
await db.prepare("run", insertThumbnailQuery, ["postBrandRemoved1", 0, getHash(userID3), Service.YouTube, getHash("postBrandRemoved1"), Date.now(), "postBrandRemoved1"]);
await db.prepare("run", insertThumbnailTimestampQuery, ["postBrandRemoved1", 12.34]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandRemoved1", 0, 1, 0]);
await db.prepare("run", insertThumbnailQuery, ["postBrandRemoved1", 0, getHash(userID3), Service.YouTube, getHash("postBrandRemoved1"), Date.now(), "postBrandRemoved2"]);
await db.prepare("run", insertThumbnailTimestampQuery, ["postBrandRemoved2", 13.34]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandRemoved2", 0, 1, 0]);
// Verified through title submissions
await db.prepare("run", insertTitleQuery, ["postBrandVerified1", "Some title", 0, getHash(userID7), Service.YouTube, getHash("postBrandVerified1"), Date.now(), "postBrandVerified1"]);
await db.prepare("run", insertTitleQuery, ["postBrandVerified2", "Some title", 1, getHash(userID7), Service.YouTube, getHash("postBrandVerified2"), Date.now(), "postBrandVerified2"]);
await db.prepare("run", insertTitleVotesQuery, ["postBrandVerified1", 5, 0, 0, -1]);
await db.prepare("run", insertTitleVotesQuery, ["postBrandVerified2", -1, 0, 0, -1]);
// Verified through SponsorBlock submissions
const insertSegment = 'INSERT INTO "sponsorTimes" ("videoID", "startTime", "endTime", "votes", "locked", "UUID", "userID", "timeSubmitted", "views", "category", "actionType", "service", "videoDuration", "hidden", "shadowHidden", "description") VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)';
await db.prepare("run", insertSegment, ["postBrandVerified3", 1, 11, 1, 0, "postBrandVerified3", getHash(userID8), 0, 50, "sponsor", "skip", "YouTube", 100, 0, 0, ""]);
await db.prepare("run", insertSegment, ["postBrandVerified3", 11, 21, 1, 0, "postBrandVerified32", getHash(userID8), 0, 50, "sponsor", "skip", "YouTube", 100, 0, 0, ""]);
await db.prepare("run", insertSegment, ["postBrandVerified3", 21, 31, 1, 0, "postBrandVerified33", getHash(userID8), 0, 50, "sponsor", "skip", "YouTube", 100, 0, 0, ""]);
// Testing details for banned user handling
await db.prepare("run", insertTitleQuery, ["postBrandBannedCustomVote", "Some title", 0, getHash(userID1), Service.YouTube, getHash("postBrandBannedCustomVote"), Date.now(), "postBrandBannedCustomVote"]);
await db.prepare("run", insertTitleQuery, ["postBrandBannedOriginalVote", "Some title", 1, getHash(userID1), Service.YouTube, getHash("postBrandBannedOriginalVote"), Date.now(), "postBrandBannedOriginalVote"]);
@@ -75,7 +96,6 @@ describe("postBranding", () => {
await db.prepare("run", insertThumbnailQuery, ["postBrandBannedOriginalVote", 1, getHash(userID1), Service.YouTube, getHash("postBrandBannedOriginalVote"), Date.now(), "postBrandBannedOriginalVote"]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandBannedCustomVote", 0, 0, 0]);
await db.prepare("run", insertThumbnailVotesQuery, ["postBrandBannedOriginalVote", 0, 0, 0]);
const insertThumbnailTimestampQuery = 'INSERT INTO "thumbnailTimestamps" ("UUID", "timestamp") VALUES (?, ?)';
await db.prepare("run", insertThumbnailTimestampQuery, ["postBrandBannedCustomVote", 12.34]);
});
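The seeding above relies on positional `?` placeholders bound in array order. A minimal sketch of that binding rule; `bindParams` is a hypothetical helper for illustration only, not part of the real db wrapper (real drivers bind parameters server-side):

```typescript
// Hypothetical helper: expands positional "?" placeholders in the order the
// seeding queries above bind them, purely to illustrate the pairing rule.
function bindParams(query: string, params: (string | number)[]): string {
    let i = 0;
    return query.replace(/\?/g, () => {
        const value = params[i++];
        return typeof value === "number" ? String(value) : `'${value}'`;
    });
}

const sql = bindParams(
    'INSERT INTO "thumbnailTimestamps" ("UUID", "timestamp") VALUES (?, ?)',
    ["postBrandBannedCustomVote", 12.34]
);
// -> INSERT INTO "thumbnailTimestamps" ("UUID", "timestamp") VALUES ('postBrandBannedCustomVote', 12.34)
```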
@@ -99,6 +119,7 @@ describe("postBranding", () => {
assert.strictEqual(dbTitle.title, title.title);
assert.strictEqual(dbTitle.original, title.original ? 1 : 0);
assert.strictEqual(dbTitle.casualMode, 0);
assert.strictEqual(dbVotes.votes, 0);
assert.strictEqual(dbVotes.locked, 0);
@@ -127,11 +148,12 @@ describe("postBranding", () => {
assert.strictEqual(dbTitle.original, title.original ? 1 : 0);
assert.strictEqual(dbVotes.votes, 0);
assert.strictEqual(dbVotes.downvotes, 0);
assert.strictEqual(dbVotes.locked, 0);
assert.strictEqual(dbVotes.shadowHidden, 0);
});
it("Submit only original thumbnail", async () => {
it("Submit only original thumbnail without permission", async () => {
const videoID = "postBrand3";
const thumbnail = {
original: true
@@ -144,6 +166,25 @@ describe("postBranding", () => {
videoID
});
assert.strictEqual(res.status, 200);
const dbThumbnail = await queryThumbnailByVideo(videoID);
assert.strictEqual(dbThumbnail, undefined);
});
it("Submit only original thumbnail with permission", async () => {
const videoID = "postBrand3";
const thumbnail = {
original: true
};
const res = await postBranding({
thumbnail,
userID: userID4,
service: Service.YouTube,
videoID
});
assert.strictEqual(res.status, 200);
const dbThumbnail = await queryThumbnailByVideo(videoID);
const dbVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
@@ -155,6 +196,30 @@ describe("postBranding", () => {
assert.strictEqual(dbVotes.shadowHidden, 0);
});
it("Submit only original thumbnail as VIP", async () => {
const videoID = "postBrandV3";
const thumbnail = {
original: true
};
const res = await postBranding({
thumbnail,
userID: vipUser,
service: Service.YouTube,
videoID
});
assert.strictEqual(res.status, 200);
const dbThumbnail = await queryThumbnailByVideo(videoID);
const dbVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbThumbnail.original, thumbnail.original ? 1 : 0);
assert.strictEqual(dbVotes.votes, 0);
assert.strictEqual(dbVotes.locked, 1);
assert.strictEqual(dbVotes.shadowHidden, 0);
});
it("Submit only custom thumbnail", async () => {
const videoID = "postBrand4";
const thumbnail = {
@@ -223,6 +288,316 @@ describe("postBranding", () => {
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
});
it("Submit another title and thumbnail", async () => {
const videoID = "postBrand5";
const title = {
title: "Some other title",
original: false
};
const thumbnail = {
timestamp: 13.42,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: userID4,
service: Service.YouTube,
videoID
});
assert.strictEqual(res.status, 200);
const dbTitle = await queryTitleByVideo(videoID);
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
const dbThumbnail = await queryThumbnailByVideo(videoID);
const dbThumbnailTimestamps = await queryThumbnailTimestampsByUUID(dbThumbnail.UUID);
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbTitle.title, title.title);
assert.strictEqual(dbTitle.original, title.original ? 1 : 0);
assert.strictEqual(dbTitleVotes.votes, 0);
assert.strictEqual(dbTitleVotes.locked, 0);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
assert.strictEqual(dbThumbnailTimestamps.timestamp, thumbnail.timestamp);
assert.strictEqual(dbThumbnail.original, thumbnail.original ? 1 : 0);
assert.strictEqual(dbThumbnailVotes.votes, 0);
assert.strictEqual(dbThumbnailVotes.locked, 0);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
});
it("Downvote title and thumbnail", async () => {
const videoID = "postBrand5";
const title = {
title: "Some other title",
original: false
};
const thumbnail = {
timestamp: 13.42,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: userID6,
service: Service.YouTube,
videoID,
downvote: true
});
assert.strictEqual(res.status, 200);
const dbTitles = await queryTitleByVideo(videoID, true);
for (const dbTitle of dbTitles) {
if (dbTitle.title === title.title) {
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
assert.strictEqual(dbTitleVotes.votes, 0);
assert.strictEqual(dbTitleVotes.downvotes, 1);
assert.strictEqual(dbTitleVotes.locked, 0);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
}
}
const dbThumbnails = await queryThumbnailByVideo(videoID, true);
for (const dbThumbnail of dbThumbnails) {
if (dbThumbnail.timestamp === thumbnail.timestamp) {
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbThumbnailVotes.votes, 0);
assert.strictEqual(dbThumbnailVotes.downvotes, 1);
assert.strictEqual(dbThumbnailVotes.locked, 0);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
}
}
});
it("Double downvote title and thumbnail should only count once", async () => {
const videoID = "postBrand5";
const title = {
title: "Some other title",
original: false
};
const thumbnail = {
timestamp: 13.42,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: userID6,
service: Service.YouTube,
videoID,
downvote: true
});
assert.strictEqual(res.status, 200);
const dbTitles = await queryTitleByVideo(videoID, true);
for (const dbTitle of dbTitles) {
if (dbTitle.title === title.title) {
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
assert.strictEqual(dbTitleVotes.votes, 0);
assert.strictEqual(dbTitleVotes.downvotes, 1);
assert.strictEqual(dbTitleVotes.locked, 0);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
}
}
const dbThumbnails = await queryThumbnailByVideo(videoID, true);
for (const dbThumbnail of dbThumbnails) {
if (dbThumbnail.timestamp === thumbnail.timestamp) {
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbThumbnailVotes.votes, 0);
assert.strictEqual(dbThumbnailVotes.downvotes, 1);
assert.strictEqual(dbThumbnailVotes.locked, 0);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
}
}
});
it("Downvote your own title and thumbnail", async () => {
const videoID = "postBrand5";
const title = {
title: "Some other title",
original: false
};
const thumbnail = {
timestamp: 13.42,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: userID4,
service: Service.YouTube,
videoID,
downvote: true
});
assert.strictEqual(res.status, 200);
const dbTitles = await queryTitleByVideo(videoID, true);
for (const dbTitle of dbTitles) {
if (dbTitle.title === title.title) {
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
assert.strictEqual(dbTitleVotes.votes, -1);
assert.strictEqual(dbTitleVotes.downvotes, 1);
assert.strictEqual(dbTitleVotes.locked, 0);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
}
}
const dbThumbnails = await queryThumbnailByVideo(videoID, true);
for (const dbThumbnail of dbThumbnails) {
if (dbThumbnail.timestamp === thumbnail.timestamp) {
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbThumbnailVotes.votes, -1);
assert.strictEqual(dbThumbnailVotes.downvotes, 1);
assert.strictEqual(dbThumbnailVotes.locked, 0);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
}
}
});
it("Downvote another title and thumbnail", async () => {
const videoID = "postBrand5";
const title = {
title: "Some title",
original: false
};
const thumbnail = {
timestamp: 12.42,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: userID6,
service: Service.YouTube,
videoID,
downvote: true
});
assert.strictEqual(res.status, 200);
const dbTitles = await queryTitleByVideo(videoID, true);
for (const dbTitle of dbTitles) {
if (dbTitle.title === title.title) {
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
assert.strictEqual(dbTitleVotes.votes, 0);
assert.strictEqual(dbTitleVotes.downvotes, 1);
assert.strictEqual(dbTitleVotes.locked, 0);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
}
}
const dbThumbnails = await queryThumbnailByVideo(videoID, true);
for (const dbThumbnail of dbThumbnails) {
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
if (dbThumbnail.timestamp === thumbnail.timestamp) {
assert.strictEqual(dbThumbnailVotes.votes, 0);
assert.strictEqual(dbThumbnailVotes.downvotes, 1);
assert.strictEqual(dbThumbnailVotes.locked, 0);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
}
}
});
it("Fail to downvote locked title and thumbnail", async () => {
const videoID = "postBrandRemoved1";
const title = {
title: "Some other title",
original: false
};
const thumbnail = {
timestamp: 12.34,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: userID6,
service: Service.YouTube,
videoID,
downvote: true
});
assert.strictEqual(res.status, 403);
const dbTitles = await queryTitleByVideo(videoID, true);
for (const dbTitle of dbTitles) {
if (dbTitle.title === title.title) {
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
assert.strictEqual(dbTitleVotes.votes, 0);
assert.strictEqual(dbTitleVotes.downvotes, 0);
assert.strictEqual(dbTitleVotes.locked, 1);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
}
}
const dbThumbnails = await queryThumbnailByVideo(videoID, true);
for (const dbThumbnail of dbThumbnails) {
if (dbThumbnail.timestamp === thumbnail.timestamp) {
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbThumbnailVotes.votes, 0);
assert.strictEqual(dbThumbnailVotes.downvotes, 0);
assert.strictEqual(dbThumbnailVotes.locked, 1);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
}
}
});
it("Upvote after downvoting title and thumbnail", async () => {
const videoID = "postBrand5";
const title = {
title: "Some other title",
original: false
};
const thumbnail = {
timestamp: 13.42,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: userID6,
service: Service.YouTube,
videoID
});
assert.strictEqual(res.status, 200);
const dbTitles = await queryTitleByVideo(videoID, true);
for (const dbTitle of dbTitles) {
if (dbTitle.title === title.title) {
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
assert.strictEqual(dbTitleVotes.votes, 0);
assert.strictEqual(dbTitleVotes.downvotes, 0);
assert.strictEqual(dbTitleVotes.locked, 0);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
}
}
const dbThumbnails = await queryThumbnailByVideo(videoID, true);
for (const dbThumbnail of dbThumbnails) {
if (dbThumbnail.timestamp === thumbnail.timestamp) {
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbThumbnailVotes.votes, 0);
assert.strictEqual(dbThumbnailVotes.downvotes, 0);
assert.strictEqual(dbThumbnailVotes.locked, 0);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
}
}
});
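The vote cases above all assume one vote per user per submission: repeating a downvote does not stack, and a later upvote withdraws it. A minimal in-memory sketch of that assumed rule; `VoteTally` is illustrative and not the server's implementation:

```typescript
type Vote = "up" | "down";

// Illustrative model only: each user holds at most one vote per submission,
// and casting again replaces the earlier vote instead of stacking it.
class VoteTally {
    private byUser = new Map<string, Vote>();

    cast(userID: string, vote: Vote): void {
        this.byUser.set(userID, vote);
    }

    count(vote: Vote): number {
        let n = 0;
        for (const v of this.byUser.values()) if (v === vote) n++;
        return n;
    }
}

const tally = new VoteTally();
tally.cast("userID6", "down");
tally.cast("userID6", "down"); // double downvote: still counts once
tally.cast("userID6", "up");   // upvote after downvote: downvote withdrawn
// tally.count("down") -> 0
```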
it("Submit title and thumbnail as VIP", async () => {
const videoID = "postBrand6";
const title = {
@@ -320,6 +695,192 @@ describe("postBranding", () => {
assert.strictEqual(otherSegmentThumbnailVotes2.locked, 1);
});
it("Submit title and thumbnail as VIP without locking", async () => {
const videoID = "postBrand6";
const title = {
title: "Some title",
original: false
};
const thumbnail = {
timestamp: 12.42,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: vipUser,
service: Service.YouTube,
videoID,
autoLock: false
});
assert.strictEqual(res.status, 200);
const dbTitles = await queryTitleByVideo(videoID, true);
for (const dbTitle of dbTitles) {
if (dbTitle.title === title.title) {
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
assert.strictEqual(dbTitleVotes.locked, 0);
}
}
const dbThumbnails = await queryThumbnailByVideo(videoID, true);
for (const dbThumbnail of dbThumbnails) {
if (dbThumbnail.timestamp === thumbnail.timestamp) {
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbThumbnailVotes.locked, 0);
}
}
});
it("Downvote title and thumbnail as VIP", async () => {
const videoID = "postBrandRemoved1";
const title = {
title: "Some title",
original: false
};
const thumbnail = {
timestamp: 12.34,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: vipUser,
service: Service.YouTube,
videoID,
downvote: true
});
assert.strictEqual(res.status, 200);
const otherSegmentTitleVotes1 = await queryTitleVotesByUUID("postBrandRemoved1");
const otherSegmentTitleVotes2 = await queryTitleVotesByUUID("postBrandRemoved2");
const otherSegmentThumbnailVotes1 = await queryThumbnailVotesByUUID("postBrandRemoved1");
const otherSegmentThumbnailVotes2 = await queryThumbnailVotesByUUID("postBrandRemoved2");
assert.strictEqual(otherSegmentTitleVotes1.removed, 1);
assert.strictEqual(otherSegmentTitleVotes1.downvotes, 1);
assert.strictEqual(otherSegmentTitleVotes2.removed, 0);
assert.strictEqual(otherSegmentTitleVotes2.downvotes, 0);
assert.strictEqual(otherSegmentThumbnailVotes1.removed, 1);
assert.strictEqual(otherSegmentThumbnailVotes1.downvotes, 1);
assert.strictEqual(otherSegmentThumbnailVotes2.removed, 0);
assert.strictEqual(otherSegmentThumbnailVotes2.downvotes, 0);
});
it("Downvote another title and thumbnail as VIP", async () => {
const videoID = "postBrandRemoved1";
const title = {
title: "Some other title",
original: false
};
const thumbnail = {
timestamp: 13.34,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: vipUser,
service: Service.YouTube,
videoID,
downvote: true
});
assert.strictEqual(res.status, 200);
const otherSegmentTitleVotes1 = await queryTitleVotesByUUID("postBrandRemoved1");
const otherSegmentTitleVotes2 = await queryTitleVotesByUUID("postBrandRemoved2");
const otherSegmentThumbnailVotes1 = await queryThumbnailVotesByUUID("postBrandRemoved1");
const otherSegmentThumbnailVotes2 = await queryThumbnailVotesByUUID("postBrandRemoved2");
assert.strictEqual(otherSegmentTitleVotes1.removed, 1);
assert.strictEqual(otherSegmentTitleVotes1.downvotes, 1);
assert.strictEqual(otherSegmentTitleVotes2.removed, 1);
assert.strictEqual(otherSegmentTitleVotes2.downvotes, 1);
assert.strictEqual(otherSegmentThumbnailVotes1.removed, 1);
assert.strictEqual(otherSegmentThumbnailVotes1.downvotes, 1);
assert.strictEqual(otherSegmentThumbnailVotes2.removed, 1);
assert.strictEqual(otherSegmentThumbnailVotes2.downvotes, 1);
});
it("Remove downvote on title and thumbnail as VIP", async () => {
const videoID = "postBrandRemoved1";
const title = {
title: "Some title",
original: false
};
const thumbnail = {
timestamp: 12.34,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: vipUser,
service: Service.YouTube,
videoID
});
assert.strictEqual(res.status, 200);
const otherSegmentTitleVotes1 = await queryTitleVotesByUUID("postBrandRemoved1");
const otherSegmentTitleVotes2 = await queryTitleVotesByUUID("postBrandRemoved2");
const otherSegmentThumbnailVotes1 = await queryThumbnailVotesByUUID("postBrandRemoved1");
const otherSegmentThumbnailVotes2 = await queryThumbnailVotesByUUID("postBrandRemoved2");
assert.strictEqual(otherSegmentTitleVotes1.removed, 0);
assert.strictEqual(otherSegmentTitleVotes1.downvotes, 0);
assert.strictEqual(otherSegmentTitleVotes2.removed, 1);
assert.strictEqual(otherSegmentTitleVotes2.downvotes, 1);
assert.strictEqual(otherSegmentThumbnailVotes1.removed, 0);
assert.strictEqual(otherSegmentThumbnailVotes1.downvotes, 0);
assert.strictEqual(otherSegmentThumbnailVotes2.removed, 1);
assert.strictEqual(otherSegmentThumbnailVotes2.downvotes, 1);
});
it("Downvote title and thumbnail as VIP without removing", async () => {
const videoID = "postBrandRemoved1";
const title = {
title: "Some title",
original: false
};
const thumbnail = {
timestamp: 12.34,
original: false
};
const res = await postBranding({
title,
thumbnail,
userID: vipUser,
service: Service.YouTube,
videoID,
downvote: true,
autoLock: false
});
assert.strictEqual(res.status, 200);
const otherSegmentTitleVotes1 = await queryTitleVotesByUUID("postBrandRemoved1");
const otherSegmentTitleVotes2 = await queryTitleVotesByUUID("postBrandRemoved2");
const otherSegmentThumbnailVotes1 = await queryThumbnailVotesByUUID("postBrandRemoved1");
const otherSegmentThumbnailVotes2 = await queryThumbnailVotesByUUID("postBrandRemoved2");
assert.strictEqual(otherSegmentTitleVotes1.removed, 0);
assert.strictEqual(otherSegmentTitleVotes1.downvotes, 1);
assert.strictEqual(otherSegmentTitleVotes2.removed, 1);
assert.strictEqual(otherSegmentTitleVotes2.downvotes, 1);
assert.strictEqual(otherSegmentThumbnailVotes1.removed, 0);
assert.strictEqual(otherSegmentThumbnailVotes1.downvotes, 1);
assert.strictEqual(otherSegmentThumbnailVotes2.removed, 1);
assert.strictEqual(otherSegmentThumbnailVotes2.downvotes, 1);
});
it("Vote the same title again", async () => {
const videoID = "postBrand1";
const title = {
@@ -414,7 +975,7 @@ describe("postBranding", () => {
const res = await postBranding({
thumbnail,
userID: userID3,
userID: userID4,
service: Service.YouTube,
videoID
});
@@ -486,6 +1047,7 @@ describe("postBranding", () => {
assert.strictEqual(dbTitle.original, title.original ? 1 : 0);
assert.strictEqual(dbVotes.votes, 1);
assert.strictEqual(dbVotes.downvotes, 0);
assert.strictEqual(dbVotes.locked, 0);
assert.strictEqual(dbVotes.shadowHidden, 0);
});
@@ -572,28 +1134,48 @@ describe("postBranding", () => {
assert.strictEqual(dbVotes3.verification, 0);
});
it("Submit from verified user from SponsorBlock submissions", async () => {
const videoID = "postBrandVerified2-2";
it("Submit title and thumbnail with casual mode", async () => {
const videoID = "postBrandCasual1";
const title = {
title: "Some title",
original: false
};
const thumbnail = {
timestamp: 12.42,
original: false
};
const res = await postBranding({
title,
userID: userID8,
thumbnail,
userID: userID5,
service: Service.YouTube,
videoID
videoID,
casualMode: true
});
assert.strictEqual(res.status, 200);
const dbTitle = await queryTitleByVideo(videoID);
const dbVotes = await queryTitleVotesByUUID(dbTitle.UUID);
const dbTitleVotes = await queryTitleVotesByUUID(dbTitle.UUID);
const dbThumbnail = await queryThumbnailByVideo(videoID);
const dbThumbnailTimestamps = await queryThumbnailTimestampsByUUID(dbThumbnail.UUID);
const dbThumbnailVotes = await queryThumbnailVotesByUUID(dbThumbnail.UUID);
assert.strictEqual(dbTitle.title, title.title);
assert.strictEqual(dbTitle.original, title.original ? 1 : 0);
assert.strictEqual(dbTitle.casualMode, 1);
assert.strictEqual(dbVotes.verification, 0);
assert.strictEqual(dbTitleVotes.votes, 0);
assert.strictEqual(dbTitleVotes.locked, 0);
assert.strictEqual(dbTitleVotes.shadowHidden, 0);
assert.strictEqual(dbThumbnailTimestamps.timestamp, thumbnail.timestamp);
assert.strictEqual(dbThumbnail.original, thumbnail.original ? 1 : 0);
assert.strictEqual(dbThumbnail.casualMode, 1);
assert.strictEqual(dbThumbnailVotes.votes, 0);
assert.strictEqual(dbThumbnailVotes.locked, 0);
assert.strictEqual(dbThumbnailVotes.shadowHidden, 0);
});
it("Banned users should not be able to vote (custom title)", async () => {

test/cases/postCasual.ts

@@ -0,0 +1,251 @@
import { db } from "../../src/databases/databases";
import { client } from "../utils/httpClient";
import assert from "assert";
import { Service } from "../../src/types/segments.model";
describe("postCasual", () => {
const userID1 = `PostCasualUser1${".".repeat(16)}`;
const userID2 = `PostCasualUser2${".".repeat(16)}`;
const userID3 = `PostCasualUser3${".".repeat(16)}`;
const endpoint = "/api/casual";
const postCasual = (data: Record<string, any>) => client({
method: "POST",
url: endpoint,
data
});
const queryCasualVotesByVideo = (videoID: string, all = false, titleID = 0) => db.prepare(all ? "all" : "get", `SELECT * FROM "casualVotes" WHERE "videoID" = ? AND "titleID" = ? ORDER BY "timeSubmitted" ASC`, [videoID, titleID]);
it("submit casual vote", async () => {
const videoID = "postCasual1";
const res = await postCasual({
categories: ["clever"],
userID: userID1,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID);
assert.strictEqual(dbVotes.category, "clever");
assert.strictEqual(dbVotes.upvotes, 1);
});
it("submit same casual vote again", async () => {
const videoID = "postCasual1";
const res = await postCasual({
categories: ["clever"],
userID: userID1,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID);
assert.strictEqual(dbVotes.category, "clever");
assert.strictEqual(dbVotes.upvotes, 1);
});
it("submit casual upvote", async () => {
const videoID = "postCasual1";
const res = await postCasual({
categories: ["clever"],
userID: userID2,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID);
assert.strictEqual(dbVotes.category, "clever");
assert.strictEqual(dbVotes.upvotes, 2);
});
it("submit casual downvote from same user", async () => {
const videoID = "postCasual1";
const res = await postCasual({
downvote: true,
userID: userID1,
service: Service.YouTube,
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID, true);
assert.strictEqual(dbVotes[0].category, "clever");
assert.strictEqual(dbVotes[0].upvotes, 1);
assert.strictEqual(dbVotes[1].category, "downvote");
assert.strictEqual(dbVotes[1].upvotes, 1);
});
it("submit casual downvote from different user", async () => {
const videoID = "postCasual1";
const res = await postCasual({
downvote: true,
userID: userID3,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID, true);
assert.strictEqual(dbVotes[0].category, "clever");
assert.strictEqual(dbVotes[0].upvotes, 1);
assert.strictEqual(dbVotes[1].category, "downvote");
assert.strictEqual(dbVotes[1].upvotes, 2);
});
it("submit casual upvote from same user", async () => {
const videoID = "postCasual1";
const res = await postCasual({
categories: ["clever"],
downvote: false,
userID: userID3,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID, true);
assert.strictEqual(dbVotes[0].category, "clever");
assert.strictEqual(dbVotes[0].upvotes, 2);
assert.strictEqual(dbVotes[1].category, "downvote");
assert.strictEqual(dbVotes[1].upvotes, 1);
});
it("submit multiple casual votes", async () => {
const videoID = "postCasual2";
const res = await postCasual({
categories: ["clever", "other"],
userID: userID1,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID, true);
assert.strictEqual(dbVotes[0].category, "clever");
assert.strictEqual(dbVotes[0].upvotes, 1);
assert.strictEqual(dbVotes[1].category, "other");
assert.strictEqual(dbVotes[1].upvotes, 1);
});
it("downvote on video with previous votes with multiple categories", async () => {
const videoID = "postCasual2";
const res = await postCasual({
downvote: true,
userID: userID1,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID, true);
assert.strictEqual(dbVotes[0].category, "clever");
assert.strictEqual(dbVotes[0].upvotes, 0);
assert.strictEqual(dbVotes[1].category, "other");
assert.strictEqual(dbVotes[1].upvotes, 0);
assert.strictEqual(dbVotes[2].category, "downvote");
assert.strictEqual(dbVotes[2].upvotes, 1);
});
it("upvote on video with previous downvotes with multiple categories", async () => {
const videoID = "postCasual2";
const res = await postCasual({
categories: ["clever", "other"],
userID: userID1,
service: Service.YouTube,
title: "title",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID, true);
assert.strictEqual(dbVotes[0].category, "clever");
assert.strictEqual(dbVotes[0].upvotes, 1);
assert.strictEqual(dbVotes[1].category, "other");
assert.strictEqual(dbVotes[1].upvotes, 1);
});
it("downvote on video with no existing votes", async () => {
const videoID = "postCasual3";
const res = await postCasual({
userID: userID1,
service: Service.YouTube,
title: "title",
videoID,
downvote: true
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID);
assert.strictEqual(dbVotes.category, "downvote");
assert.strictEqual(dbVotes.upvotes, 1);
});
it("submit multiple casual votes for different title", async () => {
const videoID = "postCasual2";
const res = await postCasual({
categories: ["clever", "funny"],
userID: userID2,
service: Service.YouTube,
title: "title 2",
videoID
});
assert.strictEqual(res.status, 200);
const dbVotes = await queryCasualVotesByVideo(videoID, true, 1);
assert.strictEqual(dbVotes[0].category, "clever");
assert.strictEqual(dbVotes[0].upvotes, 1);
assert.strictEqual(dbVotes[1].category, "funny");
assert.strictEqual(dbVotes[1].upvotes, 1);
const dbVotesOriginal = await queryCasualVotesByVideo(videoID, true, 0);
assert.strictEqual(dbVotesOriginal[0].category, "clever");
assert.strictEqual(dbVotesOriginal[0].upvotes, 1);
assert.strictEqual(dbVotesOriginal[1].category, "other");
assert.strictEqual(dbVotesOriginal[1].upvotes, 1);
});
});
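The sequence exercised above suggests the tallying rule for casual votes: a user's latest action wins, so a downvote withdraws that user's earlier category upvotes and a later upvote withdraws their downvote. A sketch of that inferred rule; `CasualVotes` is hypothetical and only models the counting behavior the assertions check:

```typescript
// Hypothetical model of the vote flow exercised above: the latest action per
// user wins, with "downvote" treated as its own category.
class CasualVotes {
    private byUser = new Map<string, string[]>();

    vote(userID: string, categories: string[]): void {
        this.byUser.set(userID, categories);
    }

    downvote(userID: string): void {
        this.byUser.set(userID, ["downvote"]);
    }

    upvotes(category: string): number {
        let n = 0;
        for (const cats of this.byUser.values()) if (cats.includes(category)) n++;
        return n;
    }
}

const votes = new CasualVotes();
votes.vote("userID1", ["clever"]);
votes.vote("userID1", ["clever"]); // resubmitting does not double count
votes.vote("userID2", ["clever"]);
votes.downvote("userID1");         // withdraws userID1's "clever" upvote
// upvotes("clever") -> 1, upvotes("downvote") -> 1
```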


@@ -46,16 +46,16 @@ describe("postSkipSegments", () => {
const submitVIPuser = `VIPPostSkipUser${".".repeat(16)}`;
const queryDatabase = (videoID: string) => db.prepare("get", `SELECT "startTime", "endTime", "votes", "userID", "locked", "category", "actionType" FROM "sponsorTimes" WHERE "videoID" = ?`, [videoID]);
const queryDatabaseActionType = (videoID: string) => db.prepare("get", `SELECT "startTime", "endTime", "locked", "category", "actionType" FROM "sponsorTimes" WHERE "videoID" = ?`, [videoID]);
const queryDatabase = (videoID: string, service = "YouTube") => db.prepare("get", `SELECT "startTime", "endTime", "votes", "userID", "locked", "category", "actionType" FROM "sponsorTimes" WHERE "videoID" = ? AND "service" = ?`, [videoID, service]);
const queryDatabaseActionType = (videoID: string, service = "YouTube") => db.prepare("get", `SELECT "startTime", "endTime", "locked", "category", "actionType" FROM "sponsorTimes" WHERE "videoID" = ? AND "service" = ?`, [videoID, service]);
const queryDatabaseVideoInfo = (videoID: string) => db.prepare("get", `SELECT * FROM "videoInfo" WHERE "videoID" = ?`, [videoID]);
before(() => {
before(async () => {
const insertSponsorTimeQuery = 'INSERT INTO "sponsorTimes" ("videoID", "startTime", "endTime", "votes", "UUID", "userID", "timeSubmitted", views, category, "actionType", "videoDuration", "shadowHidden", "hashedVideoID") VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)';
db.prepare("run", insertSponsorTimeQuery, ["full_video_segment", 0, 0, 0, "full-video-uuid-0", submitUserTwoHash, 0, 0, "sponsor", "full", 0, 0, "full_video_segment"]);
await db.prepare("run", insertSponsorTimeQuery, ["full_video_segment", 0, 0, 0, "full-video-uuid-0", submitUserTwoHash, 0, 0, "sponsor", "full", 0, 0, "full_video_segment"]);
const insertVipUserQuery = 'INSERT INTO "vipUsers" ("userID") VALUES (?)';
db.prepare("run", insertVipUserQuery, [getHash(submitVIPuser)]);
await db.prepare("run", insertVipUserQuery, [getHash(submitVIPuser)]);
});
it("Should be able to submit a single time (Params method)", (done) => {
@@ -142,33 +142,6 @@ describe("postSkipSegments", () => {
.catch(err => done(err));
});
it("Should be able to submit a single time under a different service (JSON method)", (done) => {
const videoID = "postSkip7";
postSkipSegmentJSON({
userID: submitUserOne,
videoID,
service: "PeerTube",
segments: [{
segment: [0, 10],
category: "sponsor",
}],
})
.then(async res => {
assert.strictEqual(res.status, 200);
const row = await db.prepare("get", `SELECT "startTime", "endTime", "locked", "category", "service" FROM "sponsorTimes" WHERE "videoID" = ?`, [videoID]);
const expected = {
startTime: 0,
endTime: 10,
locked: 0,
category: "sponsor",
service: "PeerTube",
};
assert.ok(partialDeepEquals(row, expected));
done();
})
.catch(err => done(err));
});
it("VIP submission should start locked", (done) => {
const videoID = "vipuserIDSubmission";
postSkipSegmentJSON({
@@ -374,4 +347,55 @@ describe("postSkipSegments", () => {
})
.catch(err => done(err));
});
it("Should be able to submit for spotify service", (done) => {
const videoID = "postSkipParamSingle";
postSkipSegmentParam({
videoID,
startTime: 23,
endTime: 105,
userID: submitUserOne,
category: "sponsor",
service: "Spotify"
})
.then(async res => {
assert.strictEqual(res.status, 200);
const row = await queryDatabase(videoID, "Spotify");
const expected = {
startTime: 23,
endTime: 105,
category: "sponsor",
};
assert.ok(partialDeepEquals(row, expected));
done();
})
.catch(err => done(err));
});
it("Should be able to submit a time for spotify service (JSON method)", (done) => {
const videoID = "postSkipJSONSingle";
postSkipSegmentJSON({
userID: submitUserOne,
videoID,
segments: [{
segment: [22, 103],
category: "sponsor",
}],
service: "Spotify"
})
.then(async res => {
assert.strictEqual(res.status, 200);
const row = await queryDatabase(videoID, "Spotify");
const expected = {
startTime: 22,
endTime: 103,
locked: 0,
category: "sponsor",
};
assert.ok(partialDeepEquals(row, expected));
done();
})
.catch(err => done(err));
});
});
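The new Spotify cases reuse videoIDs that can also exist under other services, which is why `queryDatabase` gained a `service` parameter defaulting to `"YouTube"`. A sketch of the lookup it performs, over plain rows; `SegmentRow` and `findSegment` are illustrative names, not part of the server:

```typescript
interface SegmentRow {
    videoID: string;
    service: string;
    startTime: number;
    endTime: number;
}

// Illustrative lookup: the same videoID can exist on more than one service,
// so a segment query must filter on both columns to find the right row.
function findSegment(rows: SegmentRow[], videoID: string, service = "YouTube"): SegmentRow | undefined {
    return rows.find(r => r.videoID === videoID && r.service === service);
}

const rows: SegmentRow[] = [
    { videoID: "postSkipParamSingle", service: "YouTube", startTime: 0, endTime: 10 },
    { videoID: "postSkipParamSingle", service: "Spotify", startTime: 23, endTime: 105 },
];
const spotifyRow = findSegment(rows, "postSkipParamSingle", "Spotify");
// spotifyRow -> { videoID: "postSkipParamSingle", service: "Spotify", startTime: 23, endTime: 105 }
```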


@@ -19,11 +19,11 @@ describe("postSkipSegments - Automod 80%", () => {
const queryDatabaseCategory = (videoID: string) => db.prepare("all", `SELECT "startTime", "endTime", "category" FROM "sponsorTimes" WHERE "videoID" = ? and "votes" > -1`, [videoID]);
before(() => {
before(async () => {
const insertSponsorTimeQuery = 'INSERT INTO "sponsorTimes" ("videoID", "startTime", "endTime", "votes", "UUID", "userID", "timeSubmitted", views, category, "actionType", "videoDuration", "shadowHidden", "hashedVideoID") VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)';
db.prepare("run", insertSponsorTimeQuery, [over80VideoID, 0, 1000, 0, "80percent-uuid-0", userIDHash, 0, 0, "interaction", "skip", 0, 0, over80VideoID]);
db.prepare("run", insertSponsorTimeQuery, [over80VideoID, 1001, 1005, 0, "80percent-uuid-1", userIDHash, 0, 0, "interaction", "skip", 0, 0, over80VideoID]);
db.prepare("run", insertSponsorTimeQuery, [over80VideoID, 0, 5000, -2, "80percent-uuid-2", userIDHash, 0, 0, "interaction", "skip", 0, 0, over80VideoID]);
await db.prepare("run", insertSponsorTimeQuery, [over80VideoID, 0, 1000, 0, "80percent-uuid-0", userIDHash, 0, 0, "interaction", "skip", 0, 0, over80VideoID]);
await db.prepare("run", insertSponsorTimeQuery, [over80VideoID, 1001, 1005, 0, "80percent-uuid-1", userIDHash, 0, 0, "interaction", "skip", 0, 0, over80VideoID]);
await db.prepare("run", insertSponsorTimeQuery, [over80VideoID, 0, 5000, -2, "80percent-uuid-2", userIDHash, 0, 0, "interaction", "skip", 0, 0, over80VideoID]);
});
it("Should allow multiple times if total is under 80% of video (JSON method)", (done) => {
