430 Commits

Author SHA1 Message Date
Ajay Ramachandran
fcd0fb7ac7 Merge pull request #616 from ajayyy/dependabot/npm_and_yarn/js-yaml-3.14.2
Bump js-yaml from 3.14.1 to 3.14.2
2025-11-17 16:22:24 -05:00
dependabot[bot]
b97b50a8f6 Bump js-yaml from 3.14.1 to 3.14.2
Bumps [js-yaml](https://github.com/nodeca/js-yaml) from 3.14.1 to 3.14.2.
- [Changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md)
- [Commits](https://github.com/nodeca/js-yaml/compare/3.14.1...3.14.2)

---
updated-dependencies:
- dependency-name: js-yaml
  dependency-version: 3.14.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-17 17:48:52 +00:00
Ajay
6d5b6dd3ae Add service test case for get skip segments 2025-10-14 03:46:27 -04:00
Ajay
0412386870 Add support for spotify service 2025-10-14 03:34:02 -04:00
Ajay
1eedc9fa09 Remove old test 2025-10-03 15:13:31 -04:00
Ajay
c1fc6519b4 Remove username restriction 2025-10-03 15:09:02 -04:00
Ajay
2d5d3637fd Fix errors 2025-09-30 23:02:16 -04:00
Ajay
99ed7698c4 Handle trimmed UUID duplicates 2025-09-30 22:52:32 -04:00
Ajay Ramachandran
c0ee5206a2 Merge pull request #613 from ajayyy/dependabot/npm_and_yarn/tar-fs-2.1.4
Bump tar-fs from 2.1.3 to 2.1.4
2025-09-26 23:58:45 -04:00
dependabot[bot]
9c65f3ca34 Bump tar-fs from 2.1.3 to 2.1.4
Bumps [tar-fs](https://github.com/mafintosh/tar-fs) from 2.1.3 to 2.1.4.
- [Commits](https://github.com/mafintosh/tar-fs/compare/v2.1.3...v2.1.4)

---
updated-dependencies:
- dependency-name: tar-fs
  dependency-version: 2.1.4
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-26 19:54:27 +00:00
Ajay Ramachandran
c2c92cd168 Merge pull request #610 from mini-bomba/warning-history
Keep a history of warnings in the public database
2025-09-19 02:15:59 -04:00
Ajay Ramachandran
b4ea2018d5 Merge pull request #596 from ajayyy/dependabot/npm_and_yarn/multi-456de2e4f1
Bump serialize-javascript and mocha
2025-09-18 16:08:32 -04:00
dependabot[bot]
da448af4cf Bump serialize-javascript and mocha
Bumps [serialize-javascript](https://github.com/yahoo/serialize-javascript) to 6.0.2 and updates ancestor dependency [mocha](https://github.com/mochajs/mocha). These dependencies need to be updated together.


Updates `serialize-javascript` from 6.0.0 to 6.0.2
- [Release notes](https://github.com/yahoo/serialize-javascript/releases)
- [Commits](https://github.com/yahoo/serialize-javascript/compare/v6.0.0...v6.0.2)

Updates `mocha` from 10.1.0 to 10.8.2
- [Release notes](https://github.com/mochajs/mocha/releases)
- [Changelog](https://github.com/mochajs/mocha/blob/main/CHANGELOG.md)
- [Commits](https://github.com/mochajs/mocha/compare/v10.1.0...v10.8.2)

---
updated-dependencies:
- dependency-name: serialize-javascript
  dependency-type: indirect
- dependency-name: mocha
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-18 19:34:12 +00:00
Ajay Ramachandran
33a7934f33 Merge pull request #607 from ajayyy/dependabot/npm_and_yarn/form-data-4.0.4
Bump form-data from 4.0.0 to 4.0.4
2025-09-18 15:32:47 -04:00
Ajay Ramachandran
a2cad19167 Merge pull request #612 from ajayyy/dependabot/npm_and_yarn/multi-e981fcb12d
Bump path-to-regexp and express
2025-09-18 15:32:39 -04:00
dependabot[bot]
721720a60d Bump path-to-regexp and express
Bumps [path-to-regexp](https://github.com/pillarjs/path-to-regexp) to 1.9.0 and updates ancestor dependency [express](https://github.com/expressjs/express). These dependencies need to be updated together.


Updates `path-to-regexp` from 1.8.0 to 1.9.0
- [Release notes](https://github.com/pillarjs/path-to-regexp/releases)
- [Changelog](https://github.com/pillarjs/path-to-regexp/blob/master/History.md)
- [Commits](https://github.com/pillarjs/path-to-regexp/compare/v1.8.0...v1.9.0)

Updates `express` from 4.21.1 to 4.21.2
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.2/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.1...4.21.2)

---
updated-dependencies:
- dependency-name: path-to-regexp
  dependency-version: 1.9.0
  dependency-type: indirect
- dependency-name: express
  dependency-version: 4.21.2
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-13 13:49:39 +00:00
Ajay Ramachandran
220fe52013 Merge pull request #611 from ajayyy/dependabot/npm_and_yarn/axios-1.12.1
Bump axios from 1.8.4 to 1.12.1
2025-09-13 09:48:30 -04:00
dependabot[bot]
07c0f5cfbd Bump axios from 1.8.4 to 1.12.1
Bumps [axios](https://github.com/axios/axios) from 1.8.4 to 1.12.1.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.8.4...v1.12.1)

---
updated-dependencies:
- dependency-name: axios
  dependency-version: 1.12.1
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-09-13 12:38:31 +00:00
mini-bomba
899000309f make eslint scream about promises, then fix all lints
also rewrite a bunch of test suites from using done callbacks to using
async functions - it's way too easy to forget about a .catch() clause
2025-09-11 01:14:40 +02:00
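The rewrite this commit describes can be sketched roughly as follows. The `it` here is a tiny stand-in for mocha's so the snippet runs on its own; the real suites use mocha, and all other names are illustrative, not the server's code.

```typescript
// A rough sketch of the two test styles the commit migrates between.
type Done = (err?: Error) => void;

// Stand-in test runner: accepts either a done-callback or an async function.
async function it(name: string, fn: ((done: Done) => void) | (() => Promise<void>)): Promise<string> {
    try {
        if (fn.length > 0) {
            // done-callback style: wait until the callback fires
            await new Promise<void>((resolve, reject) => {
                (fn as (done: Done) => void)((err) => (err ? reject(err) : resolve()));
            });
        } else {
            // async style: just await the returned promise
            await (fn as () => Promise<void>)();
        }
        return `${name}: passed`;
    } catch (e) {
        return `${name}: failed (${(e as Error).message})`;
    }
}

// A query that rejects, standing in for a failing database call.
const fetchValue = async (): Promise<number> => { throw new Error("db down"); };

// Done-callback style: forgetting the final .catch(done) silently
// swallows the rejection and the test hangs until it times out.
const callbackStyle = it("callback style", (done) => {
    fetchValue().then((v) => {
        if (v !== 42) return done(new Error("wrong value"));
        done();
    }).catch(done); // the easy-to-forget clause the commit mentions
});

// Async style: a rejected promise fails the test automatically.
const asyncStyle = it("async style", async () => {
    const v = await fetchValue(); // rejection propagates to the runner
    if (v !== 42) throw new Error("wrong value");
});

Promise.all([callbackStyle, asyncStyle]).then((results) => results.forEach((r) => console.log(r)));
```

Both variants report the underlying "db down" error here, but only because the callback version remembered its `.catch(done)`; delete that line and the first test never resolves.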
mini-bomba
5664ff4f58 fix missed awaits for db.prepare in test cases 2025-09-10 23:24:44 +02:00
mini-bomba
c942eea640 autogenerate userids for the postwarning test suite 2025-09-10 22:59:54 +02:00
mini-bomba
b09e552d1d add disableTime column to the warnings table 2025-09-10 22:50:43 +02:00
mini-bomba
3e74a0da58 Remove warning expiry, save warning history 2025-09-10 18:54:56 +02:00
mini-bomba
1b99a8534c type IDatabase::prepare with overloads 2025-09-10 17:08:39 +02:00
Ajay
3711286ef2 Fix old xss prevention only removing first less than symbol 2025-07-30 01:26:02 -04:00
Ajay
74b9b123a8 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2025-07-25 02:15:29 -04:00
Ajay
367cb24478 Add hook category 2025-07-25 02:15:27 -04:00
dependabot[bot]
41c91b8b03 Bump form-data from 4.0.0 to 4.0.4
Bumps [form-data](https://github.com/form-data/form-data) from 4.0.0 to 4.0.4.
- [Release notes](https://github.com/form-data/form-data/releases)
- [Changelog](https://github.com/form-data/form-data/blob/master/CHANGELOG.md)
- [Commits](https://github.com/form-data/form-data/compare/v4.0.0...v4.0.4)

---
updated-dependencies:
- dependency-name: form-data
  dependency-version: 4.0.4
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-21 23:11:19 +00:00
Ajay Ramachandran
40c4ec7437 Merge pull request #606 from ajayyy/dependabot/npm_and_yarn/tar-fs-2.1.3
Bump tar-fs from 2.1.2 to 2.1.3
2025-06-03 14:40:36 -04:00
dependabot[bot]
70ce320737 Bump tar-fs from 2.1.2 to 2.1.3
Bumps [tar-fs](https://github.com/mafintosh/tar-fs) from 2.1.2 to 2.1.3.
- [Commits](https://github.com/mafintosh/tar-fs/commits)

---
updated-dependencies:
- dependency-name: tar-fs
  dependency-version: 2.1.3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-03 09:19:22 +00:00
Ajay Ramachandran
0bfc9b30f5 Merge pull request #605 from mini-bomba/fix/banned-webhooks
don't send dearrow webhooks for banned users
2025-05-29 12:09:59 -04:00
mini-bomba
bce5385864 shortcircuit the new user check for banned users 2025-05-26 16:41:55 +02:00
mini-bomba
f71c4ceba9 don't send dearrow webhooks for banned users 2025-05-25 22:46:23 +02:00
Ajay
69ca711bb3 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2025-04-28 19:17:55 -04:00
Ajay
e519986027 Fix dearrow hiding 2025-04-28 19:17:54 -04:00
Ajay Ramachandran
c0e7401a73 Merge pull request #604 from mini-bomba/limit-usernames
Limit username creation
2025-04-28 19:04:03 -04:00
Ajay
314461c9f0 Fix old user check 2025-04-28 19:03:24 -04:00
mini-bomba
655789e62d Limit username creation 2025-04-29 00:10:20 +02:00
Ajay
339ba127eb Fix log 2025-04-28 02:59:15 -04:00
Ajay
da393da9e9 Fix log 2025-04-28 02:56:36 -04:00
Ajay
aa2c7bf6ea Fix log 2025-04-28 02:41:34 -04:00
Ajay
c82708aae8 Fix log 2025-04-28 02:40:57 -04:00
Ajay
26c575d37a Add log 2025-04-28 02:32:17 -04:00
Ajay Ramachandran
1b3b1b1cb3 Merge pull request #603 from mini-bomba/request-validator-rule-names
Add request validator rule names
2025-04-26 01:46:42 -04:00
mini-bomba
9bc4bf8c7b Add request validator rule names 2025-04-26 02:10:33 +02:00
Ajay
cbc38c5ac8 Add another logging webhook 2025-04-25 17:43:01 -04:00
Ajay Ramachandran
e7f3753077 Merge pull request #602 from mini-bomba/request-validator
Create an engine for rule-based request validation
2025-04-25 17:37:25 -04:00
mini-bomba
f44d3cd92c rephrase old rejection logs 2025-04-25 21:52:39 +02:00
mini-bomba
4db4e9458e hook up extra functions to the request validator 2025-04-25 21:52:39 +02:00
mini-bomba
b2cd048909 load request validator rules from env as json 2025-04-25 21:11:30 +02:00
mini-bomba
5c249fb02b test cases for the request validator engine 2025-04-25 21:11:30 +02:00
mini-bomba
f7e5394a18 create a request validator engine 2025-04-25 21:11:30 +02:00
Ajay
161db6df0c Don't error if failing to parse vanced ua 2025-04-25 13:51:46 -04:00
Ajay
920d288f0b Add title to webhook 2025-04-25 13:00:34 -04:00
Ajay
0d005c23bf Add another validity filter 2025-04-25 12:55:44 -04:00
Ajay
9f745d3a8b Move permission check 2025-04-21 23:50:46 -04:00
Ajay
39f8dc6c22 Fix revanced ua 2025-04-21 20:14:43 -04:00
Ajay
08ba5c21b1 Fix validity check 2025-04-21 19:39:46 -04:00
Ajay
cfd61dc8dd Validity check 2025-04-21 19:26:42 -04:00
Ajay
039fb3ac7a More logs 2025-04-21 12:39:14 -04:00
Ajay
fccebfa487 Fixed webhook again 2025-04-21 11:27:16 -04:00
Ajay
6130ac8150 Change color for dearrow webhook 2025-04-21 11:20:08 -04:00
Ajay
7e681d2cd5 Fix webhook newlines 2025-04-21 11:15:30 -04:00
Ajay
707b36d161 Fix user agent parser lower casing 2025-04-21 11:10:47 -04:00
Ajay
b849328fae More logging 2025-04-21 10:53:49 -04:00
Ajay
3d596f4528 Save user agent for dearrow 2025-04-17 01:05:34 -04:00
Ajay
ed5a397a30 Improve permission check 2025-04-15 02:01:41 -04:00
Ajay
300642fd4f Fix innertube failure handling 2025-04-12 00:44:10 -04:00
Ajay
46580322fc Fix dearrow old submitter check 2025-04-11 02:44:19 -04:00
Ajay
318152dac6 ua 2025-04-11 02:41:17 -04:00
Ajay
8111d34b30 Fix dearrow threshold not configurable 2025-04-10 16:51:46 -04:00
Ajay
ac78dee210 Fix undefined error 2025-04-10 14:38:08 -04:00
Ajay
d18a4a13f2 Check dearrow vote history for new submitters 2025-04-10 12:47:32 -04:00
Ajay
8d40d61efc Allow max users without a submitter threshold 2025-04-10 02:35:22 -04:00
Ajay
74f6224091 Add new user limit per 5 mins 2025-04-10 02:26:09 -04:00
Ajay
9b55dc5d4d Add new config option 2025-04-08 16:52:16 -04:00
Ajay
8cd2138989 Use config for old submitter check 2025-04-08 16:50:04 -04:00
Ajay
e40af45c73 Fix query 2025-04-08 16:43:01 -04:00
Ajay
5de1fe4388 Fix post config url 2025-04-08 16:26:51 -04:00
Ajay
2ef3d68af0 Await promise not being awaited 2025-04-08 16:02:29 -04:00
Ajay
f67244663e Use alias for getting server config 2025-04-08 15:56:23 -04:00
Ajay
00064d5a7c Fix config fetching 2025-04-08 15:46:14 -04:00
Ajay
ac26aed21c Add endpoints for config setting 2025-04-08 15:18:32 -04:00
Ajay
2aa3589312 Add ability to set config 2025-04-08 15:15:39 -04:00
Ajay
82af8f200b Add better ua parsing 2025-04-08 14:21:24 -04:00
Ajay
69cb33aad0 Add logs 2025-04-08 13:23:24 -04:00
Ajay
3817d7fdba Better submission error message 2025-04-08 13:21:01 -04:00
Ajay
34a6a83e44 Change dearrow permission requirements 2025-04-07 19:28:52 -04:00
Ajay
0967373cb2 Rename func 2025-04-07 00:57:48 -04:00
Ajay
b7794b57d0 Fix can vote checks 2025-04-07 00:57:08 -04:00
Ajay
550339db41 Add permission check in more places 2025-04-07 00:36:01 -04:00
Ajay
b69f050b44 Old submitter only 2025-04-07 00:29:53 -04:00
Ajay Ramachandran
59a986f32f Merge pull request #600 from ajayyy/dependabot/npm_and_yarn/multi-b9f445934c
Bump semver and nodemon
2025-03-29 15:07:03 -04:00
dependabot[bot]
7088a1688d Bump semver and nodemon
Bumps [semver](https://github.com/npm/node-semver) to 7.7.1 and updates ancestor dependencies [semver](https://github.com/npm/node-semver) and [nodemon](https://github.com/remy/nodemon). These dependencies need to be updated together.


Updates `semver` from 7.3.7 to 7.7.1
- [Release notes](https://github.com/npm/node-semver/releases)
- [Changelog](https://github.com/npm/node-semver/blob/main/CHANGELOG.md)
- [Commits](https://github.com/npm/node-semver/compare/v7.3.7...v7.7.1)

Updates `semver` from 6.3.0 to 7.7.1
- [Release notes](https://github.com/npm/node-semver/releases)
- [Changelog](https://github.com/npm/node-semver/blob/main/CHANGELOG.md)
- [Commits](https://github.com/npm/node-semver/compare/v7.3.7...v7.7.1)

Updates `nodemon` from 2.0.20 to 3.1.9
- [Release notes](https://github.com/remy/nodemon/releases)
- [Commits](https://github.com/remy/nodemon/compare/v2.0.20...v3.1.9)

---
updated-dependencies:
- dependency-name: semver
  dependency-type: indirect
- dependency-name: semver
  dependency-type: indirect
- dependency-name: nodemon
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-29 04:55:45 +00:00
Ajay Ramachandran
f07f94fb09 Merge pull request #598 from ajayyy/dependabot/npm_and_yarn/tar-fs-2.1.2
Bump tar-fs from 2.1.1 to 2.1.2
2025-03-29 00:54:48 -04:00
Ajay Ramachandran
a7758a2608 Merge pull request #599 from ajayyy/dependabot/npm_and_yarn/axios-1.8.4
Bump axios from 1.7.7 to 1.8.4
2025-03-29 00:54:40 -04:00
dependabot[bot]
fd5bc43281 Bump axios from 1.7.7 to 1.8.4
Bumps [axios](https://github.com/axios/axios) from 1.7.7 to 1.8.4.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.7.7...v1.8.4)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-29 04:40:49 +00:00
dependabot[bot]
3633d0fbb4 Bump tar-fs from 2.1.1 to 2.1.2
Bumps [tar-fs](https://github.com/mafintosh/tar-fs) from 2.1.1 to 2.1.2.
- [Commits](https://github.com/mafintosh/tar-fs/compare/v2.1.1...v2.1.2)

---
updated-dependencies:
- dependency-name: tar-fs
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-28 22:28:04 +00:00
Ajay
aae56887da Update gitignore 2025-03-12 02:46:05 -04:00
Ajay
4fe7cebcb3 Add caching for 5 length skip segment query 2025-03-12 02:45:59 -04:00
Ajay
31e678fdc2 Store titles for casual vote submissions
When an uploader changes the title, it will reset the casual votes
2025-02-17 03:16:57 -05:00
Ajay
d44ce3c2dc Add casual votes table export 2025-02-16 14:43:55 -05:00
Ajay
5f9b4c8acc Make casual downvotes apply to all categories 2025-02-13 04:03:38 -05:00
Ajay
d608125b41 Add endpoint for casual submission count 2025-02-12 03:52:03 -05:00
Ajay
fb3abb3216 Fix index for casual votes 2025-02-06 03:01:22 -05:00
Ajay
ccde64e90f Change casual submission to allow submitting multiple categories 2025-02-06 02:57:09 -05:00
Ajay
4abf57b0ce Save casual mode status in db 2025-02-06 02:51:13 -05:00
Ajay
07435b9af1 Add casual mode endpoint 2025-02-05 03:38:55 -05:00
Ajay Ramachandran
ab9cab8ff5 Merge pull request #582 from hanydd/dev_join
Change reduce to join function for simplicity
2025-02-03 20:55:55 -05:00
Ajay Ramachandran
311c653ea2 Merge pull request #592 from Choromanski/feature/node-deprecation
Upgraded github actions dependencies
2025-02-03 20:54:48 -05:00
Ajay Ramachandran
e92d47e1a4 Merge pull request #593 from ajayyy/dependabot/npm_and_yarn/multi-9f37c16f8f
Bump cookie and express
2025-02-03 20:54:22 -05:00
Ajay Ramachandran
3734b88cb5 Merge pull request #595 from mchangrh/patch-1
bump & lock rsync dockerfile
2025-01-18 12:54:45 -05:00
Michael M. Chang
00086d9001 bump & lock rsync dockerfile 2025-01-18 06:45:09 -08:00
Ajay
a37a552b17 Fix video labels keys not clearing properly 2025-01-18 03:32:55 -05:00
Ajay
fa29cfd3c6 Add endpoint to get segment ID 2025-01-18 02:56:57 -05:00
Ajay
be9d97ae2b Add option to trim UUIDs in skip segments endpoint 2025-01-18 02:09:46 -05:00
Ajay
06f83cd8d4 Allow voting and viewing with partial UUID 2025-01-18 02:04:27 -05:00
Ajay
80b1019783 Allow video labels caching with prefix of 4 2025-01-18 00:22:17 -05:00
Ajay
2455d2cd7e Make hasStartSegment result optional 2025-01-17 23:59:17 -05:00
Ajay
e2a9976cd0 Add hasStartSegment to video label 2025-01-17 23:30:32 -05:00
Ajay
bba06511ce Remove unnecessary parts of video labels request 2025-01-17 04:38:08 -05:00
Ajay
043268dc10 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-12-04 14:49:36 -05:00
Ajay
003fe77e72 Change backup retention 2024-12-04 14:49:34 -05:00
Ajay Ramachandran
efe59c5098 Merge pull request #594 from mini-bomba/dearrow_locked_vip_downvotes
Send a different message for VIP downvotes on locked titles
2024-11-14 19:56:02 -05:00
Ajay Ramachandran
7ef6452eb5 Double quote 2024-11-14 19:49:25 -05:00
mini-bomba
9c01b711a5 Send a different message for VIP downvotes on locked titles 2024-11-14 22:30:14 +01:00
Ajay
b2981fe782 Don't allow multiple downvotes on one submission 2024-11-10 15:21:40 -05:00
Ajay
405805ff89 Add check against missing api video detail failing to fetch 2024-11-07 02:24:39 -05:00
Ajay
01c306287a Fix axios error handling 2024-11-06 21:23:15 -05:00
Ajay
826d49ba1f Add support for floatie proxy 2024-11-04 15:04:43 -05:00
Ajay
b03057c5bf Fix redis cache metrics generation 2024-10-30 02:35:59 -04:00
Ajay
54e03a389b Remove string from metrics 2024-10-30 02:32:33 -04:00
Ajay
93f7161724 Only upload warned info for upvotes 2024-10-27 02:17:34 -04:00
dependabot[bot]
efa6c10d56 Bump cookie and express
Bumps [cookie](https://github.com/jshttp/cookie) to 0.7.1 and updates ancestor dependency [express](https://github.com/expressjs/express). These dependencies need to be updated together.


Updates `cookie` from 0.6.0 to 0.7.1
- [Release notes](https://github.com/jshttp/cookie/releases)
- [Commits](https://github.com/jshttp/cookie/compare/v0.6.0...v0.7.1)

Updates `express` from 4.21.0 to 4.21.1
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.1/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.0...4.21.1)

---
updated-dependencies:
- dependency-name: cookie
  dependency-type: indirect
- dependency-name: express
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-19 12:03:41 +00:00
Ajay
e9c0c44528 Add new webhook for was warned 2024-10-18 03:08:24 -04:00
Brian Choromanski
4dfbb9039d Upgraded more actions dependencies 2024-10-08 21:17:01 -04:00
Brian Choromanski
05c5cf57e4 Upgraded actions dependencies 2024-10-08 21:10:10 -04:00
Ajay
566eabdc31 Add metrics endpoint 2024-10-02 20:06:57 -04:00
Ajay Ramachandran
f26db7238a Merge pull request #591 from ajayyy/dependabot/npm_and_yarn/multi-d66d039ac5
Bump serve-static and express
2024-09-17 02:30:27 -04:00
dependabot[bot]
fb05ec51d3 Bump serve-static and express
Bumps [serve-static](https://github.com/expressjs/serve-static) to 1.16.2 and updates ancestor dependency [express](https://github.com/expressjs/express). These dependencies need to be updated together.


Updates `serve-static` from 1.15.0 to 1.16.2
- [Release notes](https://github.com/expressjs/serve-static/releases)
- [Changelog](https://github.com/expressjs/serve-static/blob/v1.16.2/HISTORY.md)
- [Commits](https://github.com/expressjs/serve-static/compare/v1.15.0...v1.16.2)

Updates `express` from 4.19.2 to 4.21.0
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.21.0/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.19.2...4.21.0)

---
updated-dependencies:
- dependency-name: serve-static
  dependency-type: indirect
- dependency-name: express
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-17 06:18:07 +00:00
Ajay
eeb9f1b02f Log more when redis increment fails 2024-09-15 04:30:44 -04:00
Ajay
8ba68e1b4c One less call when dealing with lru cache for ttl result and ensure reset keys cleared 2024-09-14 18:02:22 -04:00
Ajay
17059fdbe6 One less call when dealing with lru cache 2024-09-14 17:52:50 -04:00
Ajay
6e5f4f7610 Fix active requests list not getting deleted 2024-09-14 17:33:17 -04:00
Ajay
c313590d36 persona's revenge 2024-09-13 21:51:06 -04:00
Ajay
4508ad11f2 Fix error when submitter ip not found 2024-09-13 14:37:37 -04:00
Ajay
dc5158257e Fix errors when postgres returns undefined and trying to save to redis 2024-09-13 14:36:52 -04:00
Ajay
6edd71194b Log redis stats on high db load 2024-09-13 14:29:32 -04:00
Ajay
7678be1e24 Add max redis response time for reads 2024-09-13 04:06:50 -04:00
Ajay
d28ac39d4f Allow newly used header 2024-09-07 23:31:16 -04:00
Ajay
5fd6b5eb8b Fix canSubmitOriginal query on postgres 2024-09-01 19:29:08 -04:00
Ajay
0e1a38c4d4 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-09-01 19:01:00 -04:00
Ajay
c496be5651 Disable innertube tests while they are broken 2024-09-01 19:00:59 -04:00
Ajay Ramachandran
15a9c3a4eb Merge pull request #583 from ajayyy/dependabot/npm_and_yarn/braces-3.0.3
Bump braces from 3.0.2 to 3.0.3
2024-09-01 18:57:32 -04:00
Ajay Ramachandran
f1ebd56526 Merge pull request #590 from ajayyy/dependabot/npm_and_yarn/axios-1.7.7
Bump axios from 1.6.0 to 1.7.7
2024-09-01 18:57:18 -04:00
Ajay
258749ac31 Add more strict requirements for voting for original thumbnails 2024-09-01 18:56:29 -04:00
dependabot[bot]
ccccb1af3c Bump axios from 1.6.0 to 1.7.7
Bumps [axios](https://github.com/axios/axios) from 1.6.0 to 1.7.7.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.6.0...v1.7.7)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-01 22:43:59 +00:00
Ajay
13b8a988db update dep 2024-09-01 18:43:28 -04:00
Ajay
803fc18554 Verify old submissions right after someone votes on it 2024-08-16 00:36:42 -04:00
Ajay
59373cf346 Fix rejected server-side rendered ads issue not rejecting 2024-08-14 23:42:47 -04:00
Ajay
05fd6abe91 Use env vars in workflow 2024-08-12 01:04:12 -04:00
Ajay
090e185765 Add support for poToken and visitor data
Fixes api requests

https://github.com/iv-org/invidious/pull/4789
2024-08-12 00:33:11 -04:00
Ajay
d2df5cef98 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-08-11 13:49:48 -04:00
Ajay
214946897d Hardcode nn-block reputation 2024-08-11 13:49:46 -04:00
Ajay Ramachandran
8da5de4d7b Merge pull request #588 from mini-bomba/dearrow-locked-titles-downvotes
postBranding.ts webhook changes
2024-08-04 09:38:24 +02:00
mini-bomba
380ec8d0ca Reformat SQL code in postBranding.ts webhook code 2024-08-03 22:01:46 +02:00
mini-bomba
72086b0195 Send webhook messages when a locked title is downvoted
also take downvotes & verification into consideration when comparing
titles in webhook code
2024-08-03 21:56:31 +02:00
mini-bomba
61dcfeb69f Don't send to #dearrow-locked-titles when downvoting unlocked title
voteType passed to sendWebhooks() function to avoid confusion in the
future should someone forget about the if statement
2024-08-03 21:39:21 +02:00
Ajay Ramachandran
19d6d85aa6 Merge pull request #589 from mini-bomba/tests-fix
fix postgres+redis tests
2024-08-03 21:32:18 +02:00
mini-bomba
814ceb56f1 fix postgres+redis tests
made on request
https://discord.com/channels/603643120093233162/607338052221665320/1269373542550470730
2024-08-03 21:23:44 +02:00
Ajay Ramachandran
195cc14d25 Merge pull request #585 from mini-bomba/unrelated_chapter_suggestions
Don't show completely unrelated chapter suggestions
2024-08-03 21:19:59 +02:00
Ajay Ramachandran
9427bf4f3d Merge pull request #586 from TristanWasTaken/db-schema
docs: fix typos in DatabaseSchema.md
2024-08-03 08:00:47 +02:00
mini-bomba
3f026409cd Don't show completely unrelated chapter suggestions
Chapter suggestions should be at least slightly related to what the user
has already typed.
This change stops the server from sending suggestions that postgresql
deems to be "less than 10% similar"

Also modified tests to reflect this change.
2024-07-29 02:26:53 +02:00
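The "less than 10% similar" cutoff this commit quotes is computed by PostgreSQL itself (pg_trgm-style trigram similarity); the sketch below is only a simplified in-process model of that idea to show what a 0.1 threshold filters out, not the server's actual query.

```typescript
// Simplified model of trigram similarity: pad the string, collect
// 3-character windows, then compare sets. pg_trgm's exact tokenization
// differs (it works per word), but the threshold behaves the same way.
function trigrams(s: string): Set<string> {
    const padded = `  ${s.toLowerCase().trim()} `; // two leading, one trailing space
    const grams = new Set<string>();
    for (let i = 0; i + 3 <= padded.length; i++) {
        grams.add(padded.slice(i, i + 3));
    }
    return grams;
}

// Jaccard similarity of the two trigram sets: shared / union.
function similarity(a: string, b: string): number {
    const ta = trigrams(a);
    const tb = trigrams(b);
    let shared = 0;
    for (const g of ta) if (tb.has(g)) shared++;
    const union = ta.size + tb.size - shared;
    return union === 0 ? 0 : shared / union;
}

// Keep only suggestions at least 10% similar to what the user typed,
// mirroring the cutoff described in the commit.
function filterSuggestions(typed: string, suggestions: string[]): string[] {
    return suggestions.filter((s) => similarity(typed, s) >= 0.1);
}

console.log(filterSuggestions("spons", ["Sponsored Segment", "Totally Unrelated Chapter"]));
// → [ 'Sponsored Segment' ]
```

A completely unrelated chapter shares no trigrams with the typed prefix, so its similarity is 0 and it drops out, while a related title easily clears 0.1.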
Ajay
d75b9ddcaa Show failure reason in webhook 2024-07-24 13:42:40 -04:00
Ajay
2fb3d05055 private video? 2024-07-24 13:06:19 -04:00
Ajay
165ed8a6e0 Fix original thumbnail votes being shown because of fetch all 2024-07-09 19:49:37 -04:00
Ajay
495b8031e3 Add better logging for failed reputation call 2024-06-30 09:40:25 -04:00
HanYaodong
374ddc74bd Use join function for simplicity 2024-06-25 21:33:47 +08:00
Ajay
738f863581 Don't send server-side render error for title submissions 2024-06-25 14:36:05 +05:30
Tristan
8b5e69f36f docs: fix typos in DatabaseSchema.md 2024-06-24 03:14:05 +02:00
Ajay
10e37824d8 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-06-21 15:43:30 +05:30
Ajay
428343e7d8 Require a vote for original to show 2024-06-21 15:43:26 +05:30
Ajay Ramachandran
4e69ac60bc Merge pull request #584 from TristanWasTaken/db
docs: update DatabaseSchema.md
2024-06-21 09:00:52 +05:30
Tristan
3b03792903 docs: fix userFeatures md list 2024-06-21 03:17:31 +02:00
Tristan
1a0b6ab097 Update DatabaseSchema.md 2024-06-21 03:15:07 +02:00
Tristan
8e5084cd72 docs: update private schemas 2024-06-21 03:11:28 +02:00
Tristan
96feaf3cbe docs: update public schemas 2024-06-21 03:08:38 +02:00
Tristan
d08cfee5b4 docs: update private indexes 2024-06-21 01:35:38 +02:00
Tristan
96dd9eceb3 docs: update public indexes 2024-06-21 01:34:25 +02:00
Tristan
4422104294 docs: format lists 2024-06-21 01:34:15 +02:00
Tristan
4ad553478b chore: fix misleading/unclear migration comments 2024-06-21 00:50:10 +02:00
dependabot[bot]
47323156c1 Bump braces from 3.0.2 to 3.0.3
Bumps [braces](https://github.com/micromatch/braces) from 3.0.2 to 3.0.3.
- [Changelog](https://github.com/micromatch/braces/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/braces/compare/3.0.2...3.0.3)

---
updated-dependencies:
- dependency-name: braces
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-06-16 11:24:49 +00:00
Ajay
a181d52fb2 Fix types 2024-06-12 12:01:40 +05:30
Ajay
ee9ed6af1f Add server-side ads check for dearrow submissions 2024-06-12 11:57:59 +05:30
Ajay
ec1e6d63a4 Add protection against server-side ad injection (SSAP) 2024-06-12 09:55:41 +05:30
Ajay
5c10e071dc Change how video duration check works for submissions 2024-05-27 13:54:02 -04:00
Ajay
8eb6f5b2ea Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-05-07 15:37:59 -04:00
Ajay
bdfe4938d2 Fix vote webhook not working 2024-05-07 15:37:57 -04:00
Ajay Ramachandran
bcf29e4047 Merge pull request #579 from ajayyy/dependabot/npm_and_yarn/express-4.19.2
Bump express from 4.18.2 to 4.19.2
2024-05-05 01:28:41 -04:00
Ajay Ramachandran
622c3f27d6 Merge pull request #581 from mini-bomba/videoduration-inconsistency
Make returned video duration in getBranding.ts consistent
2024-05-05 01:28:22 -04:00
mini-bomba
7c1abd9747 Make returned video duration in getBranding.ts consistent
Instead of picking the first segment returned by the db (i.e. possibly
random), sort segments by submission time and use the oldest visible
segment with a non-zero video duration.
2024-05-04 21:56:03 +02:00
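The selection rule this commit describes can be sketched as below. Field names are illustrative, not the server's actual schema: the point is only that sorting by submission time before picking makes the result deterministic.

```typescript
// Hypothetical shape of a branding segment row; real columns differ.
interface BrandingSegment {
    timeSubmitted: number;  // submission timestamp (unix ms)
    hidden: boolean;        // hidden segments are never used
    videoDuration: number;  // 0 when no duration was recorded
}

// Instead of trusting whichever row the db happens to return first,
// sort by submission time and take the oldest visible segment that
// actually recorded a non-zero video duration.
function pickVideoDuration(segments: BrandingSegment[]): number | null {
    const candidate = [...segments]
        .sort((a, b) => a.timeSubmitted - b.timeSubmitted)
        .find((s) => !s.hidden && s.videoDuration > 0);
    return candidate ? candidate.videoDuration : null;
}

console.log(pickVideoDuration([
    { timeSubmitted: 300, hidden: false, videoDuration: 631.2 },
    { timeSubmitted: 100, hidden: true,  videoDuration: 630.9 }, // hidden: skipped
    { timeSubmitted: 200, hidden: false, videoDuration: 631.0 }, // oldest visible: wins
]));
```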
Ajay
709485e0e9 Increase frequency of docker forgets 2024-04-27 00:42:55 -04:00
Ajay
f841d8173b Fix ttl cache key not properly cleared 2024-04-22 00:53:09 -04:00
Ajay
b2f7e1b39b Fix locked check for thumbnail downvotes 2024-04-21 23:13:10 -04:00
Ajay
47ea6ae8d3 Only check request time for readiness if cache has filled up 2024-04-21 13:38:32 -04:00
Ajay
063607fe30 Add etags for branding as well 2024-04-20 13:16:34 -04:00
Ajay
4b795da5a0 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-04-20 00:02:16 -04:00
Ajay
8043bd9006 Make max response time configurable 2024-04-20 00:02:15 -04:00
Ajay Ramachandran
bd8f4b7539 Merge pull request #577 from SuperStormer/master
cosmetic fix for lock reason
2024-04-19 21:22:57 -04:00
Ajay
0f97ce4a49 Make redis readiness check recoverable 2024-04-19 21:20:40 -04:00
Ajay
cfd7c3d8c4 Add more to ready check 2024-04-19 20:24:42 -04:00
Ajay
af7d8428ab Improve ready check 2024-04-19 20:05:52 -04:00
Ajay
7c51586664 Add error server 2024-04-16 03:01:44 -04:00
Ajay
2251ddc251 Add ready endpoint 2024-04-16 01:13:56 -04:00
Ajay
07d4dde4f6 Add connections to status 2024-04-16 00:13:51 -04:00
Ajay
b934b7a937 Use innertube when possible 2024-04-14 01:26:03 -04:00
Ajay
f2cf2e2aac Add db stats to logs 2024-04-13 03:00:26 -04:00
Ajay
2887a8505c Improve logging and fix ip fetch error breaking skip segments 2024-04-13 01:54:59 -04:00
Ajay
e289fe9075 Add ttl cache 2024-04-12 01:29:23 -04:00
Ajay
2cd9401a51 Fix etag tests 2024-04-11 18:12:02 -04:00
Ajay
47bea9ee6e Trigger usage of cache key when checking ttl 2024-04-11 17:57:53 -04:00
Ajay
0602fdd651 Use cache for ttl if possible
Also fixes etag when compression enabled
2024-04-11 17:54:32 -04:00
Ajay
7c77bf566e Remove quotes when processing etag 2024-04-11 17:07:13 -04:00
Ajay
1009fff9e9 Fix caching issues with one specific key form
.c regex was matching any character plus a c instead of the intended literal dot
2024-04-11 17:04:17 -04:00
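The regex bug this commit describes is the classic unescaped-dot mistake: in a regular expression, a bare `.` matches any character, so a pattern meant to match a literal `.c` suffix also matches keys that merely have some character before a `c`. The key names below are made up for illustration.

```typescript
// Buggy pattern: "." is a wildcard, so this matches more keys than intended.
const buggy = /\w+.c/;
// Fixed pattern: "\." matches only a literal dot.
const fixed = /\w+\.c/;

console.log(buggy.test("segments.c")); // true
console.log(buggy.test("segmentsxc")); // true — unintended match on "xc"
console.log(fixed.test("segments.c")); // true
console.log(fixed.test("segmentsxc")); // false
```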
Ajay
f43e59250f Add quotes to etag 2024-04-11 14:11:04 -04:00
Ajay
dc2115ef20 Change status timeout 2024-04-09 13:29:18 -04:00
dependabot[bot]
55c3e4f01f Bump express from 4.18.2 to 4.19.2
Bumps [express](https://github.com/expressjs/express) from 4.18.2 to 4.19.2.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/master/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.18.2...4.19.2)

---
updated-dependencies:
- dependency-name: express
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-28 17:17:43 +00:00
Ajay
af31f511a5 Add tests for fetch all 2024-03-24 13:52:33 -04:00
Ajay
0d9cce0512 Fix wrong comparison with votes filtering 2024-03-24 13:42:39 -04:00
Ajay
c19d6fe97a Only send low voted segments when asked for 2024-03-22 18:37:39 -04:00
Ajay
47c109f012 Fix act as vip unlocking segments 2024-03-21 19:35:13 -04:00
Ajay
a921085da6 Fix vip downvotes unlocking 2024-03-21 19:28:05 -04:00
Ajay
d5ebd8ec1a Improve self downvoting for dearrow 2024-03-20 13:47:23 -04:00
Ajay
a7f10f7727 Attempt to fix docker build error 2024-03-17 13:40:53 -04:00
Ajay
1c234846db Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-03-17 13:20:25 -04:00
Ajay
bc1ca098e7 Fix bug causing people to accidentally super downvote 2024-03-17 13:20:23 -04:00
Ajay Ramachandran
cf21ebc2de Merge pull request #578 from ajayyy/dependabot/npm_and_yarn/follow-redirects-1.15.6
Bump follow-redirects from 1.15.4 to 1.15.6
2024-03-16 21:07:40 -04:00
dependabot[bot]
2426a6ee03 Bump follow-redirects from 1.15.4 to 1.15.6
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.15.4 to 1.15.6.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.15.4...v1.15.6)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-16 23:28:16 +00:00
SuperStormer
ba65c28459 Update postSkipSegments.ts 2024-03-15 02:20:24 -04:00
Ajay
591b342855 Add default user count, update url 2024-03-06 00:47:51 -05:00
Ajay Ramachandran
8d8388386e Merge pull request #571 from ajayyy/dependabot/npm_and_yarn/follow-redirects-1.15.4
Bump follow-redirects from 1.15.1 to 1.15.4
2024-02-27 03:49:44 -05:00
Ajay
a54bf556ed Revert "Fix usercounter behind cloudflare"
This reverts commit 9bcceb7e5b.
2024-02-27 03:49:03 -05:00
Ajay
f1c5b8a359 Merge branches 'master' and 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-02-27 03:33:44 -05:00
Ajay
9bcceb7e5b Fix usercounter behind cloudflare 2024-02-27 03:33:38 -05:00
Ajay Ramachandran
da0cf0dedc Merge pull request #575 from ajayyy/dependabot/npm_and_yarn/axios-1.6.0
Bump axios from 1.1.3 to 1.6.0
2024-02-20 17:16:53 -05:00
dependabot[bot]
1cefdf4dac Bump axios from 1.1.3 to 1.6.0
Bumps [axios](https://github.com/axios/axios) from 1.1.3 to 1.6.0.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.1.3...v1.6.0)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-20 21:00:34 +00:00
Ajay
aec2aa4457 Fix keys not properly clearing 2024-02-16 22:14:09 -05:00
Ajay
3f29e11449 Fix submission and vote locks 2024-02-16 14:24:28 -05:00
Ajay
6d11e1c601 Support dragonfly with in memory cache 2024-02-09 18:16:28 -05:00
Ajay
9fa248037a Add to cache when calling set 2024-02-09 17:28:59 -05:00
Ajay
02a640d857 Use broadcast mode for redis 2024-02-09 15:34:36 -05:00
Ajay
17b002649e Add logging when too many active connections 2024-02-09 14:54:14 -05:00
Ajay
a74189b287 Fix cache invalidation with compression enabled 2024-02-09 14:19:56 -05:00
Ajay
09997d82ed Fix chrome extension user fetcher 2024-02-09 13:47:59 -05:00
Ajay
bf644d6899 Don't use broadcast mode for redis 2024-02-09 12:09:03 -05:00
Ajay
5929460239 Remove weighted randomness and change weight calculation 2024-02-09 12:08:52 -05:00
Ajay
09dd10ad6f Fix memory cache invalidation not invalidating every item 2024-02-09 00:34:12 -05:00
Ajay
af5e8cd68d Fix uncached misses tracking 2024-02-08 22:15:28 -05:00
Ajay
bd766ab430 Remove unused import 2024-02-08 22:12:53 -05:00
Ajay
bf1fe1ff61 Allow toggling redis compression and disable by default 2024-02-08 21:58:10 -05:00
Ajay
db225f8a84 Reuse running redis connections and handle redis race condition 2024-02-08 21:30:27 -05:00
Ajay
9364a7e654 Show general last invalidation message 2024-02-08 21:15:28 -05:00
Ajay
f3fffa56c9 Don't allow downvoting locked segments 2024-02-08 15:47:25 -05:00
Ajay
c478546128 Count invalidation only on successful delete 2024-02-08 15:12:48 -05:00
Ajay
e61f964d17 Add ttl to in memory cache cache 2024-02-08 14:37:01 -05:00
Ajay
5f8ef25d88 Use broadcast mode for client tracking and add new memory cache stat 2024-02-08 14:30:32 -05:00
Ajay
b76cfdf798 Allow more things to be cached 2024-02-08 03:40:41 -05:00
Ajay
3c6000f2da Rename config for clientCacheSize 2024-02-08 03:26:06 -05:00
Ajay
9944d70f6b Use size for lru limit instead of length 2024-02-08 03:23:55 -05:00
Ajay
27069cb5c2 Change what gets saved in memory cache 2024-02-08 03:08:02 -05:00
Ajay
8aa03c81a7 Improve cache miss calculation 2024-02-08 03:06:30 -05:00
Ajay
e8879f66b1 Add redis in memory cache stats 2024-02-08 02:58:51 -05:00
Ajay
acdbd3787b More specific on what should be client cached 2024-02-08 01:04:48 -05:00
Ajay
1f7156eb29 Don't crash if redis message invalid 2024-02-08 00:34:37 -05:00
Ajay
7405053b44 Reuse running reputation requests 2024-02-07 23:40:59 -05:00
Ajay
a929f69452 Fix same ip being fetched multiple times from postgres 2024-02-07 23:36:45 -05:00
Ajay
8574ec3a0c Fix is number check 2024-02-07 22:28:28 -05:00
Ajay
1475c91327 Clear cache again after setting up client tracking 2024-02-06 15:32:40 -05:00
Ajay
5b1b362bf0 Handle reconnects with client-side caching
Also upgrades redis to fix a library bug
2024-02-06 00:52:42 -05:00
Ajay
14da10bd8a Add client-side caching 2024-02-05 13:11:44 -05:00
Ajay
547632341a Add back redis compression optionally 2024-02-04 23:17:28 -05:00
Ajay
c54c25c73b Disable query cache for segment groups 2024-02-04 22:53:12 -05:00
Ajay
121cc7f481 Fix duplicate behavior with submitting full video labels 2024-01-31 13:05:47 -05:00
Ajay
e041b9c930 Don't throw 409 if only one segment was successfully submitted 2024-01-31 12:59:01 -05:00
Ajay
59d9ed390f Fix titles and thumbnails being unlocked 2024-01-28 22:05:04 -05:00
Ajay
4477ab7ca6 Remove bad test 2024-01-21 19:55:16 -05:00
Ajay
25ec9b0291 Revert adding redis compression
This reverts commits fce311377f and 2ad51842cc
2024-01-21 19:49:36 -05:00
Ajay
c3e00ac8b1 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2024-01-21 19:46:18 -05:00
Ajay
2c9079f565 No more verification through sb submissions 2024-01-21 19:46:16 -05:00
Ajay Ramachandran
aee84a4b6e Merge pull request #572 from SashaXser/master
Promise.resolve and Using "forEach" instead of "map"
2024-01-20 00:08:55 -05:00
SashaXser
a8010b553d Merge branch 'master' into master 2024-01-20 07:07:03 +04:00
SashaXser
5b95aa8aba Resolve conflicts 2024-01-20 06:59:12 +04:00
Ajay
fce311377f Switch to lz4 compression 2024-01-19 15:16:50 -05:00
Ajay
dcb479f3d2 Fallback to allowing taking a lock if redis fails 2024-01-19 14:35:32 -05:00
Ajay
2ad51842cc Compress redis values 2024-01-19 14:34:18 -05:00
SashaXser
ea60947092 format fix 2024-01-19 14:31:03 +04:00
SashaXser
14b6f84f94 2 things
Consider using "forEach" instead of "map" as its return value is not being used here.
Replace this trivial promise with "Promise.resolve".
2024-01-19 08:50:45 +04:00
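The two cleanups this commit names can be sketched in TypeScript (illustrative values only, not the server's actual code):

```typescript
const ids = ["a", "b", "c"];

// Before: `map` was used but its returned array was discarded.
// `forEach` is the right tool when only the side effect matters.
const seen: string[] = [];
ids.forEach((id) => seen.push(id));

// Before: a trivial `new Promise((resolve) => resolve(1))`.
// `Promise.resolve` wraps an already-known value directly.
const ready: Promise<number> = Promise.resolve(1);
```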
Ajay
8e13ec60d6 Fix other get missing throw 2024-01-18 11:57:50 -05:00
Ajay
c9f7275942 Only use redis timeout when db not under load 2024-01-18 09:22:00 -05:00
Ajay
d607d8b179 Don't fallback to db when too many redis connections 2024-01-15 14:07:34 -05:00
dependabot[bot]
5974b51391 Bump follow-redirects from 1.15.1 to 1.15.4
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.15.1 to 1.15.4.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.15.1...v1.15.4)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-10 02:08:26 +00:00
Ajay
7aaf000d99 Fix index for hashed video id dearrow fetching 2024-01-09 15:31:56 -05:00
Ajay
0edf0b9e1c Don't handle shadowhide on high load 2024-01-03 11:37:58 -05:00
Ajay
84fd7c170f Add test for VIP downvote without removing 2024-01-03 01:18:57 -05:00
Ajay
b04e0dcd97 DeArrow downvotes 2024-01-03 01:13:35 -05:00
Ajay
33dad0a5e4 Add option to submit without locking
Also fixes voting for an existing thumbnail not unlocking other thumbnails
2024-01-02 19:12:55 -05:00
Ajay
ad439fd368 Make sure latest dump is not deleted 2023-12-28 19:10:12 -05:00
Ajay
21bb893a47 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2023-12-28 18:23:45 -05:00
Ajay
211ecf700b Reject on dump failure to trigger a retry 2023-12-28 18:23:41 -05:00
Ajay Ramachandran
951d678640 Merge pull request #569 from mchangrh/fix-shadowban
clean up shadowban code, exclude long running categories query
2023-12-21 20:36:28 -05:00
Michael C
15f19df8a4 clean up shadowban code, exclude long running categories query when possible 2023-12-21 18:37:24 -05:00
Ajay Ramachandran
4a4d5776a1 Merge pull request #568 from ajayyy/revert-566-dependabot/npm_and_yarn/axios-1.6.0
Revert "Bump axios from 1.1.3 to 1.6.0"
2023-12-06 00:17:05 -05:00
Ajay Ramachandran
b3a28f7df3 Revert "Bump axios from 1.1.3 to 1.6.0" 2023-12-06 00:16:55 -05:00
Ajay Ramachandran
f763139664 Merge pull request #566 from ajayyy/dependabot/npm_and_yarn/axios-1.6.0
Bump axios from 1.1.3 to 1.6.0
2023-11-11 10:50:35 -05:00
dependabot[bot]
da482054a4 Bump axios from 1.1.3 to 1.6.0
Bumps [axios](https://github.com/axios/axios) from 1.1.3 to 1.6.0.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.1.3...v1.6.0)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-11-11 08:17:36 +00:00
Ajay
134e89af00 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2023-11-06 16:07:46 -05:00
Ajay
5cc80f9066 Use dearrow thumbnails in webhooks 2023-11-06 16:07:44 -05:00
Ajay Ramachandran
e1043aba05 Merge pull request #565 from mchangrh/ajay-has-good-tests
non-blocking coverage tests
2023-11-04 18:18:35 -04:00
Michael C
c0abedf67f non-blocking coverage tests 2023-11-04 18:03:05 -04:00
Ajay Ramachandran
d99b7dc2c6 Merge pull request #563 from ajayyy/dependabot/npm_and_yarn/babel/traverse-7.23.2
Bump @babel/traverse from 7.18.6 to 7.23.2
2023-11-02 15:31:11 -04:00
Ajay
579e2b90a3 Make chapters easier to submit 2023-10-29 10:51:37 -04:00
Ajay
3708d293dc Add warning when locked title probably outdated 2023-10-27 00:39:36 -04:00
dependabot[bot]
077a9ecc50 Bump @babel/traverse from 7.18.6 to 7.23.2
Bumps [@babel/traverse](https://github.com/babel/babel/tree/HEAD/packages/babel-traverse) from 7.18.6 to 7.23.2.
- [Release notes](https://github.com/babel/babel/releases)
- [Changelog](https://github.com/babel/babel/blob/main/CHANGELOG.md)
- [Commits](https://github.com/babel/babel/commits/v7.23.2/packages/babel-traverse)

---
updated-dependencies:
- dependency-name: "@babel/traverse"
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-10-18 05:31:40 +00:00
Ajay Ramachandran
5714f51ac0 Merge pull request #561 from mchangrh/test-helpers
long overdue test helpers (partial)
2023-10-15 02:18:31 -04:00
Michael C
68bb39c409 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer into test-helpers 2023-10-15 00:31:09 -04:00
Ajay
9dd8b28812 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2023-10-02 19:56:12 -04:00
Ajay
a659048afe Hide downvoted titles 2023-10-02 19:56:08 -04:00
Michael C
3c6803fb62 getViewsForUser 2023-09-30 21:20:42 -04:00
Michael C
467443a03f fix typings for getSubmissionUUID, update getStatus 2023-09-30 19:08:23 -04:00
Michael C
d8b93dec00 getLockCategories sort categories 2023-09-29 00:45:43 -04:00
Michael C
26b3ea6a50 use partialDeepEquals for getLockCategories 2023-09-29 00:40:20 -04:00
Michael C
f72b1abf41 getLockCategories
- add insertLock
2023-09-28 23:44:14 -04:00
Michael C
53e5dcb2f0 addUserAsVIP
- add genAnonUser
2023-09-28 20:45:02 -04:00
Michael C
73e5ade529 generate random title 2023-09-28 03:57:06 -04:00
Michael C
31e1f5bc3c original as bool not string 2023-09-28 03:51:15 -04:00
Michael C
df40047a4b getUserInfo
- add info property to User
- add insertWarning, Ban
- add insertTitle, TitleVote, Thumbnail, ThumbnailVote
- simplified insertSegments with destructuring
2023-09-28 03:45:28 -04:00
Michael C
ad9344c92f getChapterNames fix length 2023-09-27 23:57:53 -04:00
Michael C
726983bb9b getChapterNames
- remove identifier from segmentGen
- add multiGenRandomValue
- add videoInfo query
2023-09-27 23:53:18 -04:00
Michael C
7364499f11 lockCategoriesHTTP
- highLoad
- compact getUserID
- add genRandomValue method
2023-09-27 23:19:25 -04:00
Michael C
5e3ec895d8 add videoID for segment inserter 2023-09-27 22:25:18 -04:00
Michael C
a9ef3815e2 add segment generator
- getIsUserVIP
- postClearCache
- update boilerplate
2023-09-27 22:21:42 -04:00
Michael C
964634dc51 update addFeatures
- add case_boilerplate
- add grantFeature query
2023-09-27 21:03:53 -04:00
Ajay Ramachandran
1e8970859f Merge pull request #559 from mini-bomba/✝️ℹ️🅿️
Replace "warning" with "tip" in responses from postWarning.ts
2023-09-27 20:40:34 -04:00
Michael C
4438ce7db6 add genUser frameworks, start fixing tests
transformative:
- getUserID
- redisTest
- reputation

mocha test reconfig:
- etag
- getIP
- userCounter
- validateVideoIDs
2023-09-27 20:18:35 -04:00
Ajay
86ea0f582b Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2023-09-27 20:09:16 -04:00
Ajay
e329bccca5 Remove shuffling for thumbnails 2023-09-27 20:09:15 -04:00
Ajay Ramachandran
1275afa25b Merge pull request #560 from mchangrh/no-empty-warnings
disallow empty new warnings
2023-09-27 15:17:25 -04:00
Michael C
1b5a079bbd disallow empty new warnings 2023-09-27 15:09:35 -04:00
Ajay
ad666ff487 Don't allow random time after 90% of video if no endcard submitted 2023-09-24 16:53:55 -04:00
mini-bomba
7196155d3a Replace "Warning reason" with "Tip message" 2023-09-16 10:00:33 +02:00
mini-bomba
934ce79728 Replace "warning" with "tip" in responses from postWarning.ts 2023-09-12 23:40:39 +02:00
Ajay
65e7d24b7d Fix get branding by hash wrong query 2023-09-09 19:07:50 -04:00
Ajay
d08c423c6a Verify old submissions when adding dearrow feature 2023-09-06 10:53:14 -04:00
Ajay
8361f602c7 Less arm 2023-09-04 02:52:04 -04:00
Ajay Ramachandran
1e3a50b884 Merge pull request #558 from mini-bomba/dearrow-bans
Fix Dearrow bans + some bug fixes
2023-09-04 02:43:39 -04:00
mini-bomba
55150cb301 do user & IP queries asynchronously in checkbanStatus()
Co-authored-by: Kendell R <KTibow@users.noreply.github.com>
2023-08-31 14:59:22 +02:00
mini-bomba
2015cf1488 DB migration: Hide any visible dearrow submissions from banned users 2023-08-29 16:38:41 +02:00
mini-bomba
141f105b79 fix dearrow bans 2023-08-29 16:38:41 +02:00
mini-bomba
c2a3630d49 create an isUserBanned utility function 2023-08-29 16:38:41 +02:00
mini-bomba
c77e71e66a it's called a ✝️ℹ️🅿️, not warning 2023-08-29 13:48:50 +02:00
Ajay
345c740fdc Fix local key regex 2023-08-22 12:00:59 -04:00
Ajay
d84276a86a Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2023-08-21 17:48:17 -04:00
Ajay
16c7ad5531 Return 404 for non existent feature flag 2023-08-21 17:48:13 -04:00
Ajay Ramachandran
7cb0a0705c Merge pull request #555 from mchangrh/shadowban-stats
show stats to shadowhidden users
2023-08-15 20:00:09 -04:00
Michael C
4600b8a599 show stats to shadowhidden users 2023-08-15 19:45:17 -04:00
Ajay
e9e1fd5228 Require time when generating key 2023-08-06 13:42:33 -04:00
Ajay
48fa55cc7a Add feature flag endpoint 2023-08-05 23:01:24 -04:00
Ajay
ecfc2c14c8 Remove minimum submission duration for mutes 2023-08-05 13:59:02 -04:00
Ajay
f58da275eb Fix group by not working on postgres 2023-08-04 14:36:08 -04:00
Ajay
0723503a98 Add DeArrow submitter feature 2023-08-04 14:17:41 -04:00
Ajay
9d1af3bdff Verify old submissions when you become verified 2023-08-04 14:15:46 -04:00
Ajay
b3cec20215 Better handling of verification for self downvotes 2023-08-04 13:53:23 -04:00
Ajay
b02134c016 Don't send angle brackets 2023-08-04 13:15:43 -04:00
Ajay
c3c8f38423 Rename var to be more clear 2023-08-03 01:50:50 -04:00
Ajay
1dbb393e4d Fix type error in tests 2023-08-03 01:16:57 -04:00
Ajay
dfa4578d28 Better token generation 2023-08-03 00:58:01 -04:00
Ajay
99cb22a5e6 Also clear branding cache when clearing segment cache 2023-07-29 18:44:52 -04:00
Ajay
665b91eb65 Revert distinct selection change as it seems to be misbehaving 2023-07-27 03:04:46 -04:00
Ajay Ramachandran
e942ac5e22 Merge pull request #549 from mini-bomba/voting-requirements
Make voting requirements more strict
2023-07-26 16:06:18 -04:00
Ajay Ramachandran
83b561d943 Merge pull request #548 from mchangrh/restic-update
change container to do chmod in builder
2023-07-26 16:03:49 -04:00
Ajay
f0b0217c78 Fix distinct query on postgres 2023-07-26 16:02:21 -04:00
Ajay
d23e9b9940 Only show one title/thumbnail per userID 2023-07-26 15:19:22 -04:00
Ajay
4b214767a0 Add buildx 2023-07-25 11:49:33 -04:00
Ajay
8c687934c2 build arm images 2023-07-25 11:36:59 -04:00
Ajay
f63fa09605 Handle exceptions, and prevent crashing from unhandled exceptions 2023-07-24 21:25:18 -04:00
Ajay
4e93a007c2 Remove unnecessary call in userInfo 2023-07-24 14:55:31 -04:00
Ajay
2fc31655ff Add different max for private db connection 2023-07-24 13:56:09 -04:00
Ajay
79515ccc8b Add unlocking to long title error 2023-07-23 23:45:02 -04:00
Ajay
b6f29b8b6d Fix shadow ban test 2023-07-23 23:35:50 -04:00
Ajay
a52ecf2d37 Add more unlock calls 2023-07-23 23:32:38 -04:00
Ajay
8d518b184b Change default lock timeout 2023-07-23 23:30:17 -04:00
Ajay
3924a65e02 Don't use locks when redis disabled 2023-07-23 23:28:41 -04:00
Ajay
a4de94bede Remove leftover timeout 2023-07-23 23:28:26 -04:00
Ajay
8bcc781da7 Add locks to different write operations 2023-07-23 23:21:50 -04:00
Ajay
b2081fe155 Add unique constraint for titles 2023-07-23 22:53:14 -04:00
Ajay
ea80a413ba Add postgres private db stats 2023-07-23 14:26:25 -04:00
Ajay
528f24a431 Merge branch 'master' of https://github.com/ajayyy/SponsorBlockServer 2023-07-23 14:01:50 -04:00
Ajay
0463165f1a Add timing trace to set username 2023-07-23 14:01:49 -04:00
Ajay Ramachandran
38b7ddfd07 Merge pull request #553 from ajayyy/dependabot/npm_and_yarn/word-wrap-1.2.4
Bump word-wrap from 1.2.3 to 1.2.4
2023-07-19 01:08:16 -04:00
dependabot[bot]
79bac69c41 Bump word-wrap from 1.2.3 to 1.2.4
Bumps [word-wrap](https://github.com/jonschlinkert/word-wrap) from 1.2.3 to 1.2.4.
- [Release notes](https://github.com/jonschlinkert/word-wrap/releases)
- [Commits](https://github.com/jonschlinkert/word-wrap/compare/1.2.3...1.2.4)

---
updated-dependencies:
- dependency-name: word-wrap
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-07-19 03:10:24 +00:00
Ajay
85fc0477ad Fix type check for warning duplicate 2023-07-17 23:38:52 -04:00
Ajay
a5501b9655 Fix user info not checking warning type 2023-07-17 22:53:12 -04:00
Ajay
e9fcf6b445 Add separate type for dearrow warning
Also add dearrow warning reason as option for user info
2023-07-17 22:42:29 -04:00
Ajay
808066a5ed Look at -1 votes for branding sb segments 2023-07-17 19:36:58 -04:00
Ajay
93f4cfd82d Add option to return userID in branding call 2023-07-16 15:35:22 -04:00
Ajay
d030de83bd Add leaderboard for dearrow 2023-07-07 14:32:45 -04:00
Ajay
d1d2b011f8 Add max title length 2023-07-06 16:36:37 -04:00
Ajay
d2f8e3aee4 Fix shadow hide video branding logic 2023-07-06 14:53:48 -04:00
Ajay
2e29666781 Remove mysql code 2023-07-05 01:25:38 -04:00
Ajay
8b418c8851 Add hiding dearrow submissions in ban code 2023-07-05 01:23:48 -04:00
Ajay
5f80562772 Make dearrow verification easier to get 2023-06-28 22:12:32 -04:00
Ajay
69db87f5e1 Fix 0 second submissions not allowed 2023-06-23 23:05:52 -04:00
Ajay
fa6919a1d0 Add branding stats 2023-06-14 19:50:26 -04:00
Ajay
633f128e90 Fix voting on an existing submission not working. 2023-06-13 00:12:16 -04:00
Ajay
9f7fa53b14 Fix sort order 2023-06-12 11:51:17 -04:00
Ajay
bbb7102e37 Derank original submissions 2023-06-12 11:41:57 -04:00
Ajay
3bb8d5b58b Add verification where new users start with lower votes 2023-06-10 12:35:43 -04:00
Ajay
1cacb2dd69 Fix random time calculator for starting empty segment 2023-06-08 18:22:26 -04:00
Ajay
fe185234cf Add fallback video duration when finding random timestamp 2023-06-08 18:05:40 -04:00
Ajay
ef3e48ec24 Send video duration if known 2023-06-08 15:28:37 -04:00
mini-bomba
777944665d Make voting requirements more strict
This aims to reduce the amount of false votes by users with no valid segments of the category they're voting for.
New tests included, one modified to work under new requirements.
Also merged userAbleToVote and ableToVote in voteOnSponsorTime.ts to skip unnecessary queries for VIPs.
2023-06-08 16:50:31 +02:00
Ajay
0932f63398 Await in post branding test case 2023-06-08 04:01:23 -04:00
Ajay
5834643ba0 Add random timestamp generation to get branding 2023-06-08 03:39:44 -04:00
Ajay
8e5be402e1 Fix VIP title and thumbnail unlock unlocking everything 2023-05-31 14:28:01 -04:00
Michael C
e253c7bb47 change container to do chmod in builder 2023-05-23 19:39:16 -04:00
Ajay
9129cee9f0 Fix tests 2023-05-21 20:12:17 -04:00
Ajay
39fcdb1d95 Allow more chapter names to appear in suggestions 2023-05-21 20:03:53 -04:00
Ajay
8d1025e17d Add title and thumbnails to user stats 2023-05-09 23:53:18 -04:00
Ajay
6f0abddd3e Reenable locks 2023-04-28 14:15:18 -04:00
Ajay
a1b5c38e5a Disable lock tests for now 2023-04-17 19:38:31 -04:00
168 changed files with 10048 additions and 4942 deletions


@@ -31,15 +31,15 @@ module.exports = {
},
overrides: [
{
files: ["src/**/*.ts"],
files: ["**/*.ts"],
parserOptions: {
project: ["./tsconfig.json"],
project: ["./tsconfig.eslint.json"],
},
rules: {
"@typescript-eslint/no-misused-promises": "warn",
"@typescript-eslint/no-floating-promises" : "warn"
"@typescript-eslint/no-misused-promises": "error",
"@typescript-eslint/no-floating-promises" : "error"
}
},
],
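The override above widens linting to all `.ts` files and promotes `@typescript-eslint/no-misused-promises` and `no-floating-promises` from warnings to errors. A minimal sketch (hypothetical function names) of the pattern these rules catch and the compliant alternatives:

```typescript
async function saveSegment(id: string): Promise<string> {
    return `saved:${id}`;
}

// Flagged by no-floating-promises: the returned promise is dropped,
// so any rejection would vanish silently.
function handlerBad(id: string): void {
    saveSegment(id);
}

// Compliant: await the promise so errors propagate.
async function handlerGood(id: string): Promise<string> {
    return await saveSegment(id);
}

// Compliant: `void` marks the promise as intentionally fire-and-forget.
function handlerFireAndForget(id: string): void {
    void saveSegment(id);
}
```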


@@ -22,10 +22,10 @@ jobs:
permissions:
packages: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Docker meta
id: meta
uses: docker/metadata-action@v4
uses: docker/metadata-action@v5
with:
images: |
ghcr.io/${{ inputs.username }}/${{ inputs.name }}
@@ -34,14 +34,21 @@ jobs:
flavor: |
latest=true
- name: Login to GHCR
uses: docker/login-action@v2
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GH_TOKEN }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
with:
platforms: arm,arm64
- name: Set up buildx
uses: docker/setup-buildx-action@v3
- name: push
uses: docker/build-push-action@v3
uses: docker/build-push-action@v6
with:
context: ${{ inputs.folder }}
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta.outputs.tags }}

.github/workflows/error-server.yml vendored Normal file (16 lines changed)

@@ -0,0 +1,16 @@
name: Docker image builds
on:
push:
branches:
- master
workflow_dispatch:
jobs:
error-server:
uses: ./.github/workflows/docker-build.yml
with:
name: "error-server"
username: "ajayyy"
folder: "./containers/error-server"
secrets:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -14,8 +14,8 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
@@ -26,7 +26,7 @@ jobs:
- name: Run Server
timeout-minutes: 10
run: npm start
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: SponsorTimesDB.db
path: databases/sponsorTimes.db


@@ -2,15 +2,24 @@ name: Docker image builds
on:
push:
branches:
- debug
- master
workflow_dispatch:
jobs:
sb-server:
uses: ./.github/workflows/docker-build.yml
with:
name: "sb-server-debug"
name: "sb-server"
username: "ajayyy"
folder: "."
secrets:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
rsync-host:
needs: sb-server
uses: ./.github/workflows/docker-build.yml
with:
name: "rsync-host"
username: "ajayyy"
folder: "./containers/rsync"
secrets:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -12,8 +12,8 @@ jobs:
name: Lint with ESLint and build
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
@@ -21,7 +21,7 @@ jobs:
- run: npm run lint
- run: npm run tsc
- name: cache dist build
uses: actions/cache/save@v3
uses: actions/cache/save@v4
with:
key: dist-${{ github.sha }}
path: |
@@ -32,13 +32,13 @@ jobs:
runs-on: ubuntu-latest
needs: lint-build
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
- id: cache
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: dist-${{ github.sha }}
path: |
@@ -46,11 +46,14 @@ jobs:
${{ github.workspace }}/node_modules
- if: steps.cache.outputs.cache-hit != 'true'
run: npm ci
env:
youTubeKeys_visitorData: ${{ secrets.YOUTUBEKEYS_VISITORDATA }}
youTubeKeys_poToken: ${{ secrets.YOUTUBEKEYS_POTOKEN }}
- name: Run SQLite Tests
timeout-minutes: 5
run: npx nyc --silent npm test
- name: cache nyc output
uses: actions/cache/save@v3
uses: actions/cache/save@v4
with:
key: nyc-sqlite-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
@@ -59,20 +62,20 @@ jobs:
runs-on: ubuntu-latest
needs: lint-build
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Build the docker-compose stack
env:
PG_USER: ci_db_user
PG_PASS: ci_db_pass
run: docker-compose -f docker/docker-compose-ci.yml up -d
run: docker compose -f docker/docker-compose-ci.yml up -d
- name: Check running containers
run: docker ps
- uses: actions/setup-node@v3
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
- id: cache
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: dist-${{ github.sha }}
path: |
@@ -83,10 +86,12 @@ jobs:
- name: Run Postgres Tests
env:
TEST_POSTGRES: true
youTubeKeys_visitorData: ${{ secrets.YOUTUBEKEYS_VISITORDATA }}
youTubeKeys_poToken: ${{ secrets.YOUTUBEKEYS_POTOKEN }}
timeout-minutes: 5
run: npx nyc --silent npm test
- name: cache nyc output
uses: actions/cache/save@v3
uses: actions/cache/save@v4
with:
key: nyc-postgres-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
@@ -95,22 +100,22 @@ jobs:
name: Run Codecov
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 18
cache: npm
- run: npm ci
- name: restore postgres nyc output
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: nyc-postgres-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
- name: restore sqlite nyc output
uses: actions/cache/restore@v3
uses: actions/cache/restore@v4
with:
key: nyc-sqlite-${{ github.sha }}
path: ${{ github.workspace }}/.nyc_output
- run: npx nyc report --reporter=lcov
- name: Upload coverage reports to Codecov
uses: codecov/codecov-action@v3
uses: codecov/codecov-action@v4

.gitignore vendored (4 lines changed)

@@ -48,4 +48,6 @@ working
# nyc coverage output
.nyc_output/
coverage/
coverage/
.vscode


@@ -1,20 +1,29 @@
# SponsorTimesDB
[vipUsers](#vipUsers)
[sponsorTimes](#sponsorTimes)
[userNames](#userNames)
[categoryVotes](#categoryVotes)
[lockCategories](#lockCategories)
[warnings](#warnings)
[shadowBannedUsers](#shadowBannedUsers)
[unlistedVideos](#unlistedVideos)
[config](#config)
[archivedSponsorTimes](#archivedSponsorTimes)
- [vipUsers](#vipusers)
- [sponsorTimes](#sponsortimes)
- [userNames](#usernames)
- [categoryVotes](#categoryvotes)
- [lockCategories](#lockcategories)
- [warnings](#warnings)
- [shadowBannedUsers](#shadowbannedusers)
- [videoInfo](#videoinfo)
- [unlistedVideos](#unlistedvideos)
- [config](#config)
- [archivedSponsorTimes](#archivedsponsortimes)
- [ratings](#ratings)
- [userFeatures](#userFeatures)
- [shadowBannedIPs](#shadowBannedIPs)
- [titles](#titles)
- [titleVotes](#titleVotes)
- [thumbnails](#thumbnails)
- [thumbnailTimestamps](#thumbnailTimestamps)
- [thumbnailVotes](#thumbnailVotes)
### vipUsers
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| userID | TEXT | not null, primary key |
| index | field |
| -- | :--: |
@@ -30,7 +39,7 @@
| votes | INTEGER | not null |
| locked | INTEGER | not null, default '0' |
| incorrectVotes | INTEGER | not null, default 1 |
| UUID | TEXT | not null, unique |
| UUID | TEXT | not null, unique, primary key |
| userID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| views | INTEGER | not null |
@@ -50,14 +59,16 @@
| sponsorTime_timeSubmitted | timeSubmitted |
| sponsorTime_userID | userID |
| sponsorTimes_UUID | UUID |
| sponsorTimes_hashedVideoID | hashedVideoID, category |
| sponsorTimes_videoID | videoID, service, category, timeSubmitted |
| sponsorTimes_hashedVideoID | service, hashedVideoID, startTime |
| sponsorTimes_videoID | service, videoID, startTime |
| sponsorTimes_videoID_category | videoID, category |
| sponsorTimes_description_gin | description, category |
### userNames
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| userID | TEXT | not null, primary key |
| userName | TEXT | not null |
| locked | INTEGER | not null, default '0' |
@@ -72,6 +83,7 @@
| UUID | TEXT | not null |
| category | TEXT | not null |
| votes | INTEGER | not null, default 0 |
| id | SERIAL | primary key
| index | field |
| -- | :--: |
@@ -88,6 +100,7 @@
| hashedVideoID | TEXT | not null, default '' |
| reason | TEXT | not null, default '' |
| service | TEXT | not null, default 'YouTube' |
| id | SERIAL | primary key
| index | field |
| -- | :--: |
@@ -102,17 +115,22 @@
| issuerUserID | TEXT | not null |
| enabled | INTEGER | not null |
| reason | TEXT | not null, default '' |
| type | INTEGER | default 0 |
| constraint | field |
| -- | :--: |
| PRIMARY KEY | userID, issueTime |
| index | field |
| -- | :--: |
| warnings_index | userID |
| warnings_index | userID, issueTime, enabled |
| warnings_issueTime | issueTime |
### shadowBannedUsers
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| userID | TEXT | not null, primary key |
| index | field |
| -- | :--: |
@@ -129,8 +147,8 @@
| index | field |
| -- | :--: |
| videoInfo_videoID | timeSubmitted |
| videoInfo_channelID | userID |
| videoInfo_videoID | videoID |
| videoInfo_channelID | channelID |
### unlistedVideos
@@ -142,12 +160,13 @@
| channelID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| service | TEXT | not null, default 'YouTube' |
| id | SERIAL | primary key
### config
| Name | Type | |
| -- | :--: | -- |
| key | TEXT | not null, unique |
| key | TEXT | not null, unique, primary key |
| value | TEXT | not null |
### archivedSponsorTimes
@@ -160,7 +179,7 @@
| votes | INTEGER | not null |
| locked | INTEGER | not null, default '0' |
| incorrectVotes | INTEGER | not null, default 1 |
| UUID | TEXT | not null, unique |
| UUID | TEXT | not null, unique, primary key |
| userID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| views | INTEGER | not null |
@@ -173,6 +192,7 @@
| shadowHidden | INTEGER | not null |
| hashedVideoID | TEXT | not null, default '', sha256 |
| userAgent | TEXT | not null, default '' |
| description | TEXT | not null, default '' |
### ratings
@@ -183,6 +203,7 @@
| type | INTEGER | not null |
| count | INTEGER | not null |
| hashedVideoID | TEXT | not null |
| id | SERIAL | primary key
| index | field |
| -- | :--: |
@@ -190,15 +211,125 @@
| ratings_hashedVideoID | hashedVideoID, service |
| ratings_videoID | videoID, service |
### userFeatures
| Name | Type | |
| -- | :--: | -- |
| userID | TEXT | not null |
| feature | INTEGER | not null |
| issuerUserID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| constraint | field |
| -- | :--: |
| primary key | userID, feature |
| index | field |
| -- | :--: |
| userFeatures_userID | userID, feature |
### shadowBannedIPs
| Name | Type | |
| -- | :--: | -- |
| hashedIP | TEXT | not null, primary key |
### titles
| Name | Type | |
| -- | :--: | -- |
| videoID | TEXT | not null |
| title | TEXT | not null |
| original | INTEGER | default 0 |
| userID | TEXT | not null
| service | TEXT | not null |
| hashedVideoID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| UUID | TEXT | not null, primary key
| index | field |
| -- | :--: |
| titles_timeSubmitted | timeSubmitted |
| titles_userID_timeSubmitted | videoID, service, userID, timeSubmitted |
| titles_videoID | videoID, service |
| titles_hashedVideoID_2 | service, hashedVideoID, timeSubmitted |
### titleVotes
| Name | Type | |
| -- | :--: | -- |
| UUID | TEXT | not null, primary key |
| votes | INTEGER | not null, default 0 |
| locked | INTEGER | not null, default 0 |
| shadowHidden | INTEGER | not null, default 0 |
| verification | INTEGER | default 0 |
| downvotes | INTEGER | default 0 |
| removed | INTEGER | default 0 |
| constraint | field |
| -- | :--: |
| foreign key | UUID references "titles"("UUID")
| index | field |
| -- | :--: |
| titleVotes_votes | UUID, votes
### thumbnails
| Name | Type | |
| -- | :--: | -- |
| original | INTEGER | default 0 |
| userID | TEXT | not null |
| service | TEXT | not null |
| hashedVideoID | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| UUID | TEXT | not null, primary key |
| index | field |
| -- | :--: |
| thumbnails_timeSubmitted | timeSubmitted |
| thumbnails_votes_timeSubmitted | videoID, service, userID, timeSubmitted |
| thumbnails_videoID | videoID, service |
| thumbnails_hashedVideoID_2 | service, hashedVideoID, timeSubmitted |
### thumbnailTimestamps
| Name | Type | |
| -- | :--: | -- |
| UUID | TEXT | not null, primary key |
| timestamp | INTEGER | not null, default 0 |
| constraint | field |
| -- | :--: |
| foreign key | UUID references "thumbnails"("UUID") |
### thumbnailVotes
| Name | Type | |
| -- | :--: | -- |
| UUID | TEXT | not null, primary key |
| votes | INTEGER | not null, default 0 |
| locked | INTEGER | not null, default 0 |
| shadowHidden | INTEGER | not null, default 0 |
| downvotes | INTEGER | default 0 |
| removed | INTEGER | default 0 |
| constraint | field |
| -- | :--: |
| foreign key | UUID references "thumbnails"("UUID") |
| index | field |
| -- | :--: |
| thumbnailVotes_votes | UUID, votes |
# Private
[votes](#votes)
[categoryVotes](#categoryVotes)
[sponsorTimes](#sponsorTimes)
[config](#config)
[ratings](#ratings)
[tempVipLog](#tempVipLog)
[userNameLogs](#userNameLogs)
- [votes](#votes)
- [categoryVotes](#categoryVotes)
- [sponsorTimes](#sponsorTimes)
- [config](#config)
- [ratings](#ratings)
- [tempVipLog](#tempVipLog)
- [userNameLogs](#userNameLogs)
### votes
@@ -209,6 +340,7 @@
| hashedIP | TEXT | not null |
| type | INTEGER | not null |
| originalVoteType | INTEGER | not null | # Since "type" was reused to also encode the number of votes removed when it is less than 0, this column stores the actual vote type
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
@@ -223,10 +355,11 @@
| hashedIP | TEXT | not null |
| category | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
| categoryVotes_UUID | UUID, userID, hasedIP, category |
| categoryVotes_UUID | UUID, userID, hashedIP, category |
### sponsorTimes
@@ -236,17 +369,17 @@
| hashedIP | TEXT | not null |
| timeSubmitted | INTEGER | not null |
| service | TEXT | not null, default 'YouTube' |
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
| sponsorTimes_hashedIP | hashedIP |
| privateDB_sponsorTimes_videoID_v2 | videoID, service |
| privateDB_sponsorTimes_v4 | videoID, service, timeSubmitted |
### config
| Name | Type | |
| -- | :--: | -- |
| key | TEXT | not null |
| key | TEXT | not null, primary key |
| value | TEXT | not null |
### ratings
@@ -259,6 +392,7 @@
| type | INTEGER | not null |
| timeSubmitted | INTEGER | not null |
| hashedIP | TEXT | not null |
| id | SERIAL | primary key |
| index | field |
| -- | :--: |
@@ -271,6 +405,7 @@
| targetUserID | TEXT | not null |
| enabled | BOOLEAN | not null |
| updatedAt | INTEGER | not null |
| id | SERIAL | primary key |
### userNameLogs
@@ -281,3 +416,4 @@
| oldUserName | TEXT | not null |
| updatedByAdmin | BOOLEAN | not null |
| updatedAt | INTEGER | not null |
| id | SERIAL | primary key |

View File

@@ -9,7 +9,7 @@ WORKDIR /usr/src/app
RUN apk add --no-cache git postgresql-client
COPY --from=builder ./node_modules ./node_modules
COPY --from=builder ./dist ./dist
COPY ./.git ./.git
COPY ./.git/ ./.git
COPY entrypoint.sh .
COPY databases/*.sql databases/
EXPOSE 8080

View File

@@ -56,7 +56,6 @@
]
}
],
"hoursAfterWarningExpires": 24,
"rateLimit": {
"vote": {
"windowMs": 900000,

View File

@@ -1 +1,9 @@
comment: false
coverage:
status:
project:
default:
informational: true
patch:
default:
informational: true

View File

@@ -25,8 +25,6 @@
"webhooks": [],
"categoryList": ["sponsor", "intro", "outro", "interaction", "selfpromo", "preview", "music_offtopic", "poi_highlight"], // List of supported categories any other category will be rejected
"getTopUsersCacheTimeMinutes": 5, // cacheTime for getTopUsers result in minutes
"maxNumberOfActiveWarnings": 3, // Users with this number of warnings will be blocked until warnings expire
"hoursAfterWarningExpire": 24,
"rateLimit": {
"vote": {
"windowMs": 900000, // 15 minutes

View File

@@ -1,13 +1,13 @@
FROM alpine
RUN apk add postgresql-client
RUN apk add restic --repository http://dl-cdn.alpinelinux.org/alpine/latest-stable/community/
FROM alpine as builder
WORKDIR /scripts
COPY ./backup.sh ./backup.sh
COPY ./forget.sh ./forget.sh
COPY ./backup.sh /usr/src/app/backup.sh
RUN chmod +x /usr/src/app/backup.sh
COPY ./forget.sh /usr/src/app/forget.sh
RUN chmod +x /usr/src/app/forget.sh
FROM alpine
RUN apk add --no-cache postgresql-client restic
COPY --from=builder --chmod=755 /scripts /usr/src/app/
RUN echo '30 * * * * /usr/src/app/backup.sh' >> /etc/crontabs/root
RUN echo '10 0 * * 1 /usr/src/app/forget.sh' >> /etc/crontabs/root
RUN echo '10 0 * * */2 /usr/src/app/forget.sh' >> /etc/crontabs/root
CMD crond -l 2 -f

View File

@@ -1 +1 @@
restic forget --prune --keep-last 48 --keep-daily 7 --keep-weekly 8
restic forget --prune --keep-hourly 24 --keep-daily 7 --keep-weekly 8

View File

@@ -0,0 +1,4 @@
FROM nginx as app
EXPOSE 80
COPY nginx.conf /etc/nginx/nginx.conf
COPY default.conf /etc/nginx/conf.d/default.conf

View File

@@ -0,0 +1,9 @@
server {
listen 80;
listen [::]:80;
server_name localhost;
location / {
return 503;
}
}

View File

@@ -0,0 +1,19 @@
user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log notice;
pid /var/run/nginx.pid;
events {
worker_connections 4096;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
access_log off;
error_log /dev/null crit;
include /etc/nginx/conf.d/*.conf;
}

View File

@@ -1,6 +1,6 @@
FROM ghcr.io/ajayyy/sb-server:latest
EXPOSE 873/tcp
RUN apk add rsync>3.2.4-r0
RUN apk add rsync>3.4.1-r0
RUN mkdir /usr/src/app/database-export
CMD rsync --no-detach --daemon & ./entrypoint.sh

View File

@@ -44,4 +44,15 @@ CREATE TABLE IF NOT EXISTS "thumbnailVotes" (
"type" INTEGER NOT NULL
);
CREATE TABLE IF NOT EXISTS "casualVotes" (
"UUID" SERIAL PRIMARY KEY,
"videoID" TEXT NOT NULL,
"service" TEXT NOT NULL,
"userID" TEXT NOT NULL,
"hashedIP" TEXT NOT NULL,
"category" TEXT NOT NULL,
"type" INTEGER NOT NULL,
"timeSubmitted" INTEGER NOT NULL
);
COMMIT;

View File

@@ -23,4 +23,16 @@ CREATE INDEX IF NOT EXISTS "categoryVotes_UUID"
CREATE INDEX IF NOT EXISTS "ratings_videoID"
ON public."ratings" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, service COLLATE pg_catalog."default" ASC NULLS LAST, "userID" COLLATE pg_catalog."default" ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
-- casualVotes
CREATE INDEX IF NOT EXISTS "casualVotes_videoID"
ON public."casualVotes" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST, "userID" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_userID"
ON public."casualVotes" USING btree
("userID" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;

View File

@@ -84,6 +84,26 @@ CREATE TABLE IF NOT EXISTS "thumbnailVotes" (
FOREIGN KEY("UUID") REFERENCES "thumbnails"("UUID")
);
CREATE TABLE IF NOT EXISTS "casualVotes" (
"UUID" TEXT PRIMARY KEY,
"videoID" TEXT NOT NULL,
"service" TEXT NOT NULL,
"hashedVideoID" TEXT NOT NULL,
"category" TEXT NOT NULL,
"upvotes" INTEGER NOT NULL default 0,
"downvotes" INTEGER NOT NULL default 0,
"timeSubmitted" INTEGER NOT NULL
);
CREATE TABLE IF NOT EXISTS "casualVoteTitles" (
"videoID" TEXT NOT NULL,
"service" TEXT NOT NULL,
"id" INTEGER NOT NULL,
"hashedVideoID" TEXT NOT NULL,
"title" TEXT NOT NULL,
PRIMARY KEY("videoID", "service", "id")
);
CREATE EXTENSION IF NOT EXISTS pgcrypto; --!sqlite-ignore
CREATE EXTENSION IF NOT EXISTS pg_trgm; --!sqlite-ignore
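The `casualVotes` table above is keyed by video, service, and category; a minimal illustrative read against that schema (the video ID is a placeholder, column names are taken from the CREATE TABLE above):

```sql
-- Per-category casual vote totals for one video (illustrative only).
SELECT "category", "upvotes"
FROM "casualVotes"
WHERE "videoID" = 'dQw4w9WgXcQ'
  AND "service" = 'YouTube'
ORDER BY "upvotes" DESC;
```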

View File

@@ -124,14 +124,26 @@ CREATE INDEX IF NOT EXISTS "titles_timeSubmitted"
("timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "titles_userID_timeSubmitted"
ON public."titles" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST, "userID" COLLATE pg_catalog."default" DESC NULLS LAST, "timeSubmitted" DESC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "titles_videoID"
ON public."titles" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "titles_hashedVideoID"
CREATE INDEX IF NOT EXISTS "titles_hashedVideoID_2"
ON public."titles" USING btree
("hashedVideoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
(service COLLATE pg_catalog."default" ASC NULLS LAST, "hashedVideoID" text_pattern_ops ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
-- titleVotes
CREATE INDEX IF NOT EXISTS "titleVotes_votes"
ON public."titleVotes" USING btree
("UUID" COLLATE pg_catalog."default" ASC NULLS LAST, "votes" DESC NULLS LAST)
TABLESPACE pg_default;
-- thumbnails
@@ -141,12 +153,46 @@ CREATE INDEX IF NOT EXISTS "thumbnails_timeSubmitted"
("timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "thumbnails_votes_timeSubmitted"
ON public."thumbnails" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST, "userID" COLLATE pg_catalog."default" DESC NULLS LAST, "timeSubmitted" DESC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "thumbnails_videoID"
ON public."thumbnails" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "thumbnails_hashedVideoID"
CREATE INDEX IF NOT EXISTS "thumbnails_hashedVideoID_2"
ON public."thumbnails" USING btree
("hashedVideoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
(service COLLATE pg_catalog."default" ASC NULLS LAST, "hashedVideoID" text_pattern_ops ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
-- thumbnailVotes
CREATE INDEX IF NOT EXISTS "thumbnailVotes_votes"
ON public."thumbnailVotes" USING btree
("UUID" COLLATE pg_catalog."default" ASC NULLS LAST, "votes" DESC NULLS LAST)
TABLESPACE pg_default;
-- casualVotes
CREATE INDEX IF NOT EXISTS "casualVotes_timeSubmitted"
ON public."casualVotes" USING btree
("timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_userID_timeSubmitted"
ON public."casualVotes" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST, "timeSubmitted" DESC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_videoID"
ON public."casualVotes" USING btree
("videoID" COLLATE pg_catalog."default" ASC NULLS LAST, "service" COLLATE pg_catalog."default" ASC NULLS LAST)
TABLESPACE pg_default;
CREATE INDEX IF NOT EXISTS "casualVotes_hashedVideoID_2"
ON public."casualVotes" USING btree
(service COLLATE pg_catalog."default" ASC NULLS LAST, "hashedVideoID" text_pattern_ops ASC NULLS LAST, "timeSubmitted" ASC NULLS LAST)
TABLESPACE pg_default;
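The rebuilt `_2` indexes lead with `service` and declare `text_pattern_ops` on `hashedVideoID`, the operator class that lets a left-anchored `LIKE` use a btree regardless of collation. A sketch of the hash-prefix lookup this shape serves (the 4-character prefix is illustrative):

```sql
-- Prefix match on the hashed video ID; text_pattern_ops makes the
-- left-anchored LIKE indexable.
SELECT "videoID", "timeSubmitted"
FROM "titles"
WHERE "service" = 'YouTube'
  AND "hashedVideoID" LIKE '1a2b%'
ORDER BY "timeSubmitted";
```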

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" DROP COLUMN "type";
UPDATE "config" SET value = 12 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" ADD "titleID" INTEGER default 0;
UPDATE "config" SET value = 13 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "titleVotes" ADD "verification" INTEGER default 0;
UPDATE "config" SET value = 35 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "warnings" ADD "type" INTEGER default 0;
UPDATE "config" SET value = 36 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "titles" ADD UNIQUE ("videoID", "title"); --!sqlite-ignore
UPDATE "config" SET value = 37 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,11 @@
BEGIN TRANSACTION;
UPDATE "titleVotes" SET "shadowHidden" = 1
WHERE "UUID" IN (SELECT "UUID" FROM "titles" INNER JOIN "shadowBannedUsers" "bans" ON "titles"."userID" = "bans"."userID");
UPDATE "thumbnailVotes" SET "shadowHidden" = 1
WHERE "UUID" IN (SELECT "UUID" FROM "thumbnails" INNER JOIN "shadowBannedUsers" "bans" ON "thumbnails"."userID" = "bans"."userID");
UPDATE "config" SET value = 38 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,11 @@
BEGIN TRANSACTION;
ALTER TABLE "titleVotes" ADD "downvotes" INTEGER default 0;
ALTER TABLE "titleVotes" ADD "removed" INTEGER default 0;
ALTER TABLE "thumbnailVotes" ADD "downvotes" INTEGER default 0;
ALTER TABLE "thumbnailVotes" ADD "removed" INTEGER default 0;
UPDATE "config" SET value = 39 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,8 @@
BEGIN TRANSACTION;
DROP INDEX IF EXISTS "titles_hashedVideoID";
DROP INDEX IF EXISTS "thumbnails_hashedVideoID";
UPDATE "config" SET value = 40 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,8 @@
BEGIN TRANSACTION;
ALTER TABLE "titles" ADD "casualMode" INTEGER default 0;
ALTER TABLE "thumbnails" ADD "casualMode" INTEGER default 0;
UPDATE "config" SET value = 41 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" DROP COLUMN "downvotes";
UPDATE "config" SET value = 42 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "casualVotes" ADD "titleID" INTEGER default 0;
UPDATE "config" SET value = 43 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,8 @@
BEGIN TRANSACTION;
ALTER TABLE "titles" ADD "userAgent" TEXT NOT NULL default '';
ALTER TABLE "thumbnails" ADD "userAgent" TEXT NOT NULL default '';
UPDATE "config" SET value = 44 WHERE key = 'version';
COMMIT;

View File

@@ -0,0 +1,7 @@
BEGIN TRANSACTION;
ALTER TABLE "warnings" ADD "disableTime" INTEGER NULL;
UPDATE "config" SET value = 45 WHERE key = 'version';
COMMIT;
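All of the migration files above share one shape: a transaction that applies the schema change and bumps the stored schema version so the migration runner knows the file has been applied. As a template (table, column, and version number are placeholders):

```sql
BEGIN TRANSACTION;
-- one schema change per migration file
ALTER TABLE "someTable" ADD "someColumn" INTEGER default 0;
-- bump the version; lines that must not run on SQLite carry --!sqlite-ignore
UPDATE "config" SET value = 46 WHERE key = 'version';
COMMIT;
```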

View File

@@ -1,6 +1,6 @@
BEGIN TRANSACTION;
/* Add new voting field */
/* Add 'locked' field */
CREATE TABLE "sqlb_temp_table_6" (
"videoID" TEXT NOT NULL,
"startTime" REAL NOT NULL,

View File

@@ -1,6 +1,6 @@
BEGIN TRANSACTION;
/* Add Service field */
/* Add 'videoDuration' field */
CREATE TABLE "sqlb_temp_table_8" (
"videoID" TEXT NOT NULL,
"startTime" REAL NOT NULL,

View File

@@ -1,6 +1,6 @@
BEGIN TRANSACTION;
/* Add Service field */
/* Change 'videoDuration' field from INTEGER to REAL */
CREATE TABLE "sqlb_temp_table_9" (
"videoID" TEXT NOT NULL,
"startTime" REAL NOT NULL,

View File

@@ -9,4 +9,4 @@ test -e config.json || cat <<EOF > config.json
}
EOF
node --inspect dist/src/index.js
node dist/src/index.js

package-lock.json generated

File diff suppressed because it is too large

View File

@@ -19,18 +19,20 @@
"author": "Ajay Ramachandran",
"license": "AGPL-3.0-only",
"dependencies": {
"axios": "^1.1.3",
"better-sqlite3": "^8.0.1",
"axios": "^1.12.1",
"better-sqlite3": "^11.2.1",
"cron": "^2.1.0",
"express": "^4.18.2",
"express": "^4.21.2",
"express-promise-router": "^4.1.1",
"express-rate-limit": "^6.7.0",
"form-data": "^4.0.0",
"form-data": "^4.0.4",
"lodash": "^4.17.21",
"lru-cache": "^10.2.0",
"lz4-napi": "^2.2.0",
"pg": "^8.8.0",
"rate-limit-redis": "^3.0.1",
"redis": "^4.5.0",
"sync-mysql": "^3.0.1"
"redis": "^4.6.13",
"seedrandom": "^3.0.5"
},
"devDependencies": {
"@istanbuljs/nyc-config-typescript": "^1.0.2",
@@ -41,13 +43,14 @@
"@types/mocha": "^10.0.0",
"@types/node": "^18.11.9",
"@types/pg": "^8.6.5",
"@types/seedrandom": "^3.0.5",
"@types/sinon": "^10.0.13",
"@typescript-eslint/eslint-plugin": "^5.44.0",
"@typescript-eslint/parser": "^5.44.0",
"axios-mock-adapter": "^1.21.2",
"eslint": "^8.28.0",
"mocha": "^10.1.0",
"nodemon": "^2.0.20",
"mocha": "^10.8.2",
"nodemon": "^3.1.9",
"nyc": "^15.1.0",
"sinon": "^14.0.2",
"ts-mock-imports": "^1.3.8",

View File

@@ -53,6 +53,15 @@ import { getBranding, getBrandingByHashEndpoint } from "./routes/getBranding";
import { postBranding } from "./routes/postBranding";
import { cacheMiddlware } from "./middleware/etag";
import { hostHeader } from "./middleware/hostHeader";
import { getBrandingStats } from "./routes/getBrandingStats";
import { getTopBrandingUsers } from "./routes/getTopBrandingUsers";
import { getFeatureFlag } from "./routes/getFeatureFlag";
import { getReady } from "./routes/getReady";
import { getMetrics } from "./routes/getMetrics";
import { getSegmentID } from "./routes/getSegmentID";
import { postCasual } from "./routes/postCasual";
import { getConfigEndpoint } from "./routes/getConfig";
import { setConfig } from "./routes/setConfig";
export function createServer(callback: () => void): Server {
// Create a service (the app object is just a callback).
@@ -78,13 +87,15 @@ export function createServer(callback: () => void): Server {
// Set production mode
app.set("env", config.mode || "production");
setupRoutes(router);
const server = app.listen(config.port, callback);
return app.listen(config.port, callback);
setupRoutes(router, server);
return server;
}
/* eslint-disable @typescript-eslint/no-misused-promises */
function setupRoutes(router: Router) {
function setupRoutes(router: Router, server: Server) {
// Rate limit endpoint lists
const voteEndpoints: RequestHandler[] = [voteOnSponsorTime];
const viewEndpoints: RequestHandler[] = [viewedVideoSponsorTime];
@@ -115,6 +126,8 @@ function setupRoutes(router: Router) {
router.get("/api/viewedVideoSponsorTime", ...viewEndpoints);
router.post("/api/viewedVideoSponsorTime", ...viewEndpoints);
router.get("/api/segmentID", getSegmentID);
//To set your username for the stats view
router.post("/api/setUsername", setUsername);
@@ -140,11 +153,14 @@ function setupRoutes(router: Router) {
router.get("/api/getTopUsers", getTopUsers);
router.get("/api/getTopCategoryUsers", getTopCategoryUsers);
router.get("/api/getTopBrandingUsers", getTopBrandingUsers);
//send out totals
//send the total submissions, total views and total minutes saved
router.get("/api/getTotalStats", getTotalStats);
router.get("/api/brandingStats", getBrandingStats);
router.get("/api/getUserInfo", getUserInfo);
router.get("/api/userInfo", getUserInfo);
@@ -194,8 +210,11 @@ function setupRoutes(router: Router) {
router.get("/api/chapterNames", getChapterNames);
// get status
router.get("/api/status/:value", getStatus);
router.get("/api/status", getStatus);
router.get("/api/status/:value", (req, res) => getStatus(req, res, server));
router.get("/api/status", (req, res) => getStatus(req, res, server));
router.get("/metrics", (req, res) => getMetrics(req, res, server));
router.get("/api/ready", (req, res) => getReady(req, res, server));
router.get("/api/youtubeApiProxy", youtubeApiProxy);
// get user category stats
@@ -205,6 +224,8 @@ function setupRoutes(router: Router) {
router.post("/api/feature", addFeature);
router.get("/api/featureFlag/:name", getFeatureFlag);
router.get("/api/generateToken/:type", generateTokenRequest);
router.get("/api/verifyToken", verifyTokenRequest);
@@ -216,6 +237,11 @@ function setupRoutes(router: Router) {
router.get("/api/branding/:prefix", getBrandingByHashEndpoint);
router.post("/api/branding", postBranding);
router.get("/api/config", getConfigEndpoint);
router.post("/api/config", setConfig);
router.post("/api/casual", postCasual);
/* istanbul ignore next */
if (config.postgres?.enabled) {
router.get("/database", (req, res) => dumpDatabase(req, res, true));

View File

@@ -1,7 +1,6 @@
import fs from "fs";
import { SBSConfig } from "./types/config.model";
import packageJson from "../package.json";
import { isNumber } from "lodash";
const isTestMode = process.env.npm_lifecycle_script === packageJson.scripts.test;
const configFile = process.env.TEST_POSTGRES ? "ci.json"
@@ -20,7 +19,8 @@ addDefaults(config, {
privateDBSchema: "./databases/_private.db.sql",
readOnly: false,
webhooks: [],
categoryList: ["sponsor", "selfpromo", "exclusive_access", "interaction", "intro", "outro", "preview", "music_offtopic", "filler", "poi_highlight", "chapter"],
categoryList: ["sponsor", "selfpromo", "exclusive_access", "interaction", "intro", "outro", "preview", "hook", "music_offtopic", "filler", "poi_highlight", "chapter"],
casualCategoryList: ["funny", "creative", "clever", "descriptive", "other"],
categorySupport: {
sponsor: ["skip", "mute", "full"],
selfpromo: ["skip", "mute", "full"],
@@ -29,13 +29,14 @@ addDefaults(config, {
intro: ["skip", "mute"],
outro: ["skip", "mute"],
preview: ["skip", "mute"],
hook: ["skip", "mute"],
filler: ["skip", "mute"],
music_offtopic: ["skip"],
poi_highlight: ["poi"],
chapter: ["chapter"]
},
maxNumberOfActiveWarnings: 1,
hoursAfterWarningExpires: 16300000,
deArrowTypes: ["title", "thumbnail"],
maxTitleLength: 110,
adminUserID: "",
discordCompletelyIncorrectReportWebhookURL: null,
discordFirstTimeSubmissionsWebhookURL: null,
@@ -43,6 +44,10 @@ addDefaults(config, {
discordFailedReportChannelWebhookURL: null,
discordReportChannelWebhookURL: null,
discordMaliciousReportWebhookURL: null,
discordDeArrowLockedWebhookURL: null,
discordDeArrowWarnedWebhookURL: null,
discordNewUserWebhookURL: null,
discordRejectedNewUserWebhookURL: null,
minReputationToSubmitChapter: 0,
minReputationToSubmitFiller: 0,
getTopUsersCacheTimeMinutes: 240,
@@ -64,6 +69,7 @@ addDefaults(config, {
message: "OK",
}
},
requestValidatorRules: [],
userCounterURL: null,
userCounterRatio: 10,
newLeafURLs: null,
@@ -80,7 +86,8 @@ addDefaults(config, {
maxTries: 3,
maxActiveRequests: 0,
timeout: 60000,
highLoadThreshold: 10
highLoadThreshold: 10,
redisTimeoutThreshold: 1000
},
postgresReadOnly: {
enabled: false,
@@ -96,6 +103,7 @@ addDefaults(config, {
fallbackOnFail: true,
stopRetryThreshold: 800
},
postgresPrivateMax: 10,
dumpDatabase: {
enabled: false,
minTimeBetweenMs: 180000,
@@ -143,6 +151,13 @@ addDefaults(config, {
},
{
name: "thumbnailVotes"
},
{
name: "casualVotes",
order: "timeSubmitted"
},
{
name: "casualVoteTitles"
}]
},
diskCacheURL: null,
@@ -161,7 +176,11 @@ addDefaults(config, {
commandsQueueMaxLength: 3000,
stopWritingAfterResponseTime: 50,
responseTimePause: 1000,
disableHashCache: false
maxReadResponseTime: 500,
disableHashCache: false,
clientCacheSize: 2000,
useCompression: false,
dragonflyMode: false
},
redisRead: {
enabled: false,
@@ -182,7 +201,20 @@ addDefaults(config, {
gumroad: {
productPermalinks: ["sponsorblock"]
},
minUserIDLength: 30
tokenSeed: "",
minUserIDLength: 30,
deArrowPaywall: false,
useCacheForSegmentGroups: false,
maxConnections: 100,
maxResponseTime: 1000,
maxResponseTimeWhileLoadingCache: 2000,
etagExpiry: 5000,
youTubeKeys: {
visitorData: null,
poToken: null,
floatieUrl: null,
floatieAuth: null
}
});
loadFromEnv(config);
migrate(config);
@@ -230,15 +262,17 @@ function loadFromEnv(config: SBSConfig, prefix = "") {
loadFromEnv(data, fullKey);
} else if (process.env[fullKey]) {
const value = process.env[fullKey];
if (isNumber(value)) {
if (value !== "" && !isNaN(value as unknown as number)) {
config[key] = parseFloat(value);
} else if (value.toLowerCase() === "true" || value.toLowerCase() === "false") {
config[key] = value === "true";
} else if (key === "newLeafURLs") {
config[key] = [value];
} else if (key === "requestValidatorRules") {
config[key] = JSON.parse(value) ?? [];
} else {
config[key] = value;
}
}
}
}
}
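The hunk above replaces lodash's `isNumber` check (which never matches, since `process.env` values are always strings) with an explicit string test. The coercion logic, extracted into a standalone sketch (the function name is ours, not the codebase's):

```typescript
// Coerce an environment-variable string the way the config loader above
// does: non-empty numeric strings become numbers, "true"/"false" become
// booleans, anything else stays a string. The empty-string guard matters
// because isNaN("") is false, so "" would otherwise parse as a number.
function coerceEnvValue(value: string): number | boolean | string {
    if (value !== "" && !isNaN(value as unknown as number)) {
        return parseFloat(value);
    } else if (value.toLowerCase() === "true" || value.toLowerCase() === "false") {
        return value === "true";
    }
    return value;
}
```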

View File

@@ -3,7 +3,6 @@ import { CronJob } from "cron";
import { config as serverConfig } from "../config";
import { Logger } from "../utils/logger";
import { db } from "../databases/databases";
import { DBSegment } from "../types/segments.model";
const jobConfig = serverConfig?.crons?.downvoteSegmentArchive;
@@ -14,18 +13,18 @@ export const archiveDownvoteSegment = async (dayLimit: number, voteLimit: number
Logger.info(`DownvoteSegmentArchiveJob starts at ${timeNow}`);
try {
// insert into archive sponsorTime
await db.prepare(
"run",
`INSERT INTO "archivedSponsorTimes"
SELECT *
FROM "sponsorTimes"
WHERE "votes" < ? AND (? - "timeSubmitted") > ?`,
[
voteLimit,
timeNow,
threshold
]
) as DBSegment[];
await db.prepare(
"run",
`INSERT INTO "archivedSponsorTimes"
SELECT *
FROM "sponsorTimes"
WHERE "votes" < ? AND (? - "timeSubmitted") > ?`,
[
voteLimit,
timeNow,
threshold
]
);
} catch (err) {
Logger.error("Exception when inserting segment into archivedSponsorTimes");
@@ -35,15 +34,15 @@ export const archiveDownvoteSegment = async (dayLimit: number, voteLimit: number
// remove from sponsorTime
try {
await db.prepare(
"run",
'DELETE FROM "sponsorTimes" WHERE "votes" < ? AND (? - "timeSubmitted") > ?',
[
voteLimit,
timeNow,
threshold
]
) as DBSegment[];
await db.prepare(
"run",
'DELETE FROM "sponsorTimes" WHERE "votes" < ? AND (? - "timeSubmitted") > ?',
[
voteLimit,
timeNow,
threshold
]
);
} catch (err) {
Logger.error("Exception when deleting segment in sponsorTimes");

View File

@@ -6,9 +6,14 @@ export interface QueryOption {
export interface IDatabase {
init(): Promise<void>;
prepare(type: QueryType, query: string, params?: any[], options?: QueryOption): Promise<any | any[] | void>;
prepare(type: "run", query: string, params?: any[], options?: QueryOption): Promise<void>;
prepare(type: "get", query: string, params?: any[], options?: QueryOption): Promise<any>;
prepare(type: "all", query: string, params?: any[], options?: QueryOption): Promise<any[]>;
prepare(type: QueryType, query: string, params?: any[], options?: QueryOption): Promise<any>;
highLoad(): boolean;
shouldUseRedisTimeout(): boolean;
}
export type QueryType = "get" | "all" | "run";
export type QueryType = "get" | "all" | "run";

View File

@@ -1,39 +0,0 @@
import { Logger } from "../utils/logger";
import { IDatabase, QueryType } from "./IDatabase";
// eslint-disable-next-line @typescript-eslint/ban-ts-comment
// @ts-ignore
import MysqlInterface from "sync-mysql";
export class Mysql implements IDatabase {
private connection: any;
constructor(private config: unknown) {
}
// eslint-disable-next-line require-await
async init(): Promise<void> {
this.connection = new MysqlInterface(this.config);
}
prepare(type: QueryType, query: string, params?: any[]): Promise<any[]> {
Logger.debug(`prepare (mysql): type: ${type}, query: ${query}, params: ${params}`);
const queryResult = this.connection.query(query, params);
switch (type) {
case "get": {
return queryResult[0];
}
case "all": {
return queryResult;
}
case "run": {
break;
}
}
}
highLoad() {
return false;
}
}

View File

@@ -109,7 +109,7 @@ export class Postgres implements IDatabase {
}
}
async prepare(type: QueryType, query: string, params?: any[], options: QueryOption = {}): Promise<any[]> {
async prepare(type: QueryType, query: string, params?: any[], options: QueryOption = {}): Promise<any> {
// Convert query to use numbered parameters
let count = 1;
for (let char = 0; char < query.length; char++) {
@@ -283,4 +283,8 @@ export class Postgres implements IDatabase {
highLoad() {
return this.activePostgresRequests > this.config.postgres.highLoadThreshold;
}
shouldUseRedisTimeout() {
return this.activePostgresRequests < this.config.postgres.redisTimeoutThreshold;
}
}
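`shouldUseRedisTimeout` returning true only while active Postgres requests sit below `redisTimeoutThreshold` suggests the intended use: bail out of a slow Redis call quickly only while Postgres can absorb the fallback traffic. A hypothetical call-site sketch (the wiring and names here are ours, not the repo's):

```typescript
// Bound the Redis wait only while the database reports it can take the
// fallback load; under database load, waiting longer on the cache is
// cheaper than adding queries to an already-busy Postgres.
function pickRedisTimeout(
    db: { shouldUseRedisTimeout(): boolean },
    timeoutMs: number
): number | null {
    return db.shouldUseRedisTimeout() ? timeoutMs : null;
}
```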

View File

@@ -13,7 +13,7 @@ export class Sqlite implements IDatabase {
}
// eslint-disable-next-line require-await
async prepare(type: QueryType, query: string, params: any[] = []): Promise<any[]> {
async prepare(type: QueryType, query: string, params: any[] = []): Promise<any> {
// Logger.debug(`prepare (sqlite): type: ${type}, query: ${query}, params: ${params}`);
const preparedQuery = this.db.prepare(Sqlite.processQuery(query));
@@ -72,6 +72,15 @@ export class Sqlite implements IDatabase {
}
private static processQuery(query: string): string {
if (query.includes("DISTINCT ON")) {
const column = query.match(/DISTINCT ON \((.*)\) (.*)/)[1];
query = query.replace(/DISTINCT ON \((.*)\)/g, "");
const parts = query.split("ORDER BY");
query = `${parts[0]} GROUP BY ${column} ORDER BY ${parts[1]}`;
}
return query.replace(/ ~\* /g, " REGEXP ");
}
@@ -93,12 +102,18 @@ export class Sqlite implements IDatabase {
}
private static processUpgradeQuery(query: string): string {
return query.replace(/^.*--!sqlite-ignore/gm, "");
return query
.replace(/SERIAL PRIMARY KEY/gi, "INTEGER PRIMARY KEY AUTOINCREMENT")
.replace(/^.*--!sqlite-ignore/gm, "");
}
highLoad() {
return false;
}
shouldUseRedisTimeout() {
return false;
}
}
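The two Sqlite rewrites above can be exercised in isolation; this sketch copies the logic from the diff into standalone functions instead of class methods:

```typescript
// Translate Postgres-only constructs the server emits into SQLite
// equivalents: DISTINCT ON (col) becomes a GROUP BY, and the
// case-insensitive regex operator ~* becomes REGEXP.
function processQuery(query: string): string {
    if (query.includes("DISTINCT ON")) {
        const column = query.match(/DISTINCT ON \((.*)\) (.*)/)![1];
        query = query.replace(/DISTINCT ON \((.*)\)/g, "");
        const parts = query.split("ORDER BY");
        query = `${parts[0]} GROUP BY ${column} ORDER BY ${parts[1]}`;
    }
    return query.replace(/ ~\* /g, " REGEXP ");
}

// Upgrade scripts are shared between backends: SERIAL keys are rewritten
// to SQLite's autoincrement form, and lines tagged --!sqlite-ignore are
// blanked out entirely.
function processUpgradeQuery(query: string): string {
    return query
        .replace(/SERIAL PRIMARY KEY/gi, "INTEGER PRIMARY KEY AUTOINCREMENT")
        .replace(/^.*--!sqlite-ignore/gm, "");
}
```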
export interface SqliteConfig {

View File

@@ -1,15 +1,11 @@
import { config } from "../config";
import { Sqlite } from "./Sqlite";
import { Mysql } from "./Mysql";
import { Postgres } from "./Postgres";
import { IDatabase } from "./IDatabase";
let db: IDatabase;
let privateDB: IDatabase;
if (config.mysql) {
db = new Mysql(config.mysql);
privateDB = new Mysql(config.privateMysql);
} else if (config.postgres?.enabled) {
if (config.postgres?.enabled) {
db = new Postgres({
dbSchemaFileName: config.dbSchema,
dbSchemaFolder: config.schemaFolder,
@@ -34,6 +30,7 @@ if (config.mysql) {
createDbIfNotExists: config.createDatabaseIfNotExist,
postgres: {
...config.postgres,
max: config.postgresPrivateMax ?? config.postgres.max,
database: "privateDB"
},
postgresReadOnly: config.postgresReadOnly ? {

View File

@@ -10,7 +10,11 @@ async function init() {
process.on("unhandledRejection", (error: any) => {
// eslint-disable-next-line no-console
console.dir(error?.stack);
process.exit(1);
});
process.on("uncaughtExceptions", (error: any) => {
// eslint-disable-next-line no-console
console.dir(error?.stack);
});
try {

View File

@@ -3,6 +3,6 @@ import { NextFunction, Request, Response } from "express";
export function corsMiddleware(req: Request, res: Response, next: NextFunction): void {
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Methods", "GET, POST, OPTIONS, DELETE");
res.header("Access-Control-Allow-Headers", "Content-Type, If-None-Match");
res.header("Access-Control-Allow-Headers", "Content-Type, If-None-Match, x-client-name");
next();
}

View File

@@ -1,10 +1,10 @@
import { NextFunction, Request, Response } from "express";
import { VideoID, VideoIDHash, Service } from "../types/segments.model";
import { QueryCacher } from "../utils/queryCacher";
import { skipSegmentsHashKey, skipSegmentsKey, videoLabelsHashKey, videoLabelsKey } from "../utils/redisKeys";
import { brandingHashKey, brandingKey, skipSegmentsHashKey, skipSegmentsKey, skipSegmentsLargerHashKey, videoLabelsHashKey, videoLabelsKey, videoLabelsLargerHashKey } from "../utils/redisKeys";
type hashType = "skipSegments" | "skipSegmentsHash" | "videoLabel" | "videoLabelHash";
type ETag = `${hashType};${VideoIDHash};${Service};${number}`;
type hashType = "skipSegments" | "skipSegmentsHash" | "skipSegmentsLargerHash" | "videoLabel" | "videoLabelHash" | "videoLabelsLargerHash" | "branding" | "brandingHash";
type ETag = `"${hashType};${VideoIDHash};${Service};${number}"`;
type hashKey = string | VideoID | VideoIDHash;
export function cacheMiddlware(req: Request, res: Response, next: NextFunction): void {
@@ -12,13 +12,13 @@ export function cacheMiddlware(req: Request, res: Response, next: NextFunction):
// if weak etag, do not handle
if (!reqEtag || reqEtag.startsWith("W/")) return next();
// split into components
const [hashType, hashKey, service, lastModified] = reqEtag.split(";");
const [hashType, hashKey, service, lastModified] = reqEtag.replace(/^"|"$/g, "").split(";");
// fetch last-modified
getLastModified(hashType as hashType, hashKey as VideoIDHash, service as Service)
.then(redisLastModified => {
if (redisLastModified <= new Date(Number(lastModified) + 1000)) {
// match cache, generate etag
const etag = `${hashType};${hashKey};${service};${redisLastModified.getTime()}` as ETag;
const etag = `"${hashType};${hashKey};${service};${redisLastModified.getTime()}"` as ETag;
res.status(304).set("etag", etag).send();
}
else next();
@@ -30,15 +30,19 @@ function getLastModified(hashType: hashType, hashKey: hashKey, service: Service)
let redisKey: string | null;
if (hashType === "skipSegments") redisKey = skipSegmentsKey(hashKey as VideoID, service);
else if (hashType === "skipSegmentsHash") redisKey = skipSegmentsHashKey(hashKey as VideoIDHash, service);
else if (hashType === "skipSegmentsLargerHash") redisKey = skipSegmentsLargerHashKey(hashKey as VideoIDHash, service);
else if (hashType === "videoLabel") redisKey = videoLabelsKey(hashKey as VideoID, service);
else if (hashType === "videoLabelHash") redisKey = videoLabelsHashKey(hashKey as VideoIDHash, service);
else if (hashType === "videoLabelsLargerHash") redisKey = videoLabelsLargerHashKey(hashKey as VideoIDHash, service);
else if (hashType === "branding") redisKey = brandingKey(hashKey as VideoID, service);
else if (hashType === "brandingHash") redisKey = brandingHashKey(hashKey as VideoIDHash, service);
else return Promise.reject();
return QueryCacher.getKeyLastModified(redisKey);
}
export async function getEtag(hashType: hashType, hashKey: hashKey, service: Service): Promise<ETag> {
const lastModified = await getLastModified(hashType, hashKey, service);
return `${hashType};${hashKey};${service};${lastModified.getTime()}` as ETag;
return `"${hashType};${hashKey};${service};${lastModified.getTime()}"` as ETag;
}
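This change wraps the ETag value in double quotes, making it a syntactically valid strong validator, and strips those quotes back off when parsing the incoming `If-None-Match` header. A minimal standalone sketch of the quoted format (hypothetical helper names, not the server's exports):

```typescript
// Build and parse the quoted strong-ETag format `"type;hash;service;mtime"`.
type EtagParts = { hashType: string; hashKey: string; service: string; lastModified: number };

function buildEtag(p: EtagParts): string {
    return `"${p.hashType};${p.hashKey};${p.service};${p.lastModified}"`;
}

function parseEtag(etag: string): EtagParts | null {
    // Weak validators (W/"...") are not handled, matching the middleware.
    if (etag.startsWith("W/")) return null;
    const [hashType, hashKey, service, lastModified] = etag.replace(/^"|"$/g, "").split(";");
    if (!hashType || !hashKey || !service || !lastModified) return null;
    return { hashType, hashKey, service, lastModified: Number(lastModified) };
}

const tag = buildEtag({ hashType: "skipSegments", hashKey: "abc123", service: "YouTube", lastModified: 1700000000000 });
// → "skipSegments;abc123;YouTube;1700000000000"
const parsed = parseEtag(tag);
```

The round trip is lossless because none of the components may contain a semicolon.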
/* example usage


@@ -6,6 +6,7 @@ import { isUserVIP } from "../utils/isUserVIP";
import { Feature, HashedUserID, UserID } from "../types/user.model";
import { Logger } from "../utils/logger";
import { QueryCacher } from "../utils/queryCacher";
import { getVerificationValue, verifyOldSubmissions } from "./postBranding";
interface AddFeatureRequest extends Request {
body: {
@@ -19,11 +20,13 @@ interface AddFeatureRequest extends Request {
const allowedFeatures = {
vip: [
Feature.ChapterSubmitter,
Feature.FillerSubmitter
Feature.FillerSubmitter,
Feature.DeArrowTitleSubmitter,
],
admin: [
Feature.ChapterSubmitter,
Feature.FillerSubmitter
Feature.FillerSubmitter,
Feature.DeArrowTitleSubmitter,
]
};
@@ -56,6 +59,10 @@ export async function addFeature(req: AddFeatureRequest, res: Response): Promise
await db.prepare("run", 'INSERT INTO "userFeatures" ("userID", "feature", "issuerUserID", "timeSubmitted") VALUES(?, ?, ?, ?)'
, [userID, feature, adminUserID, Date.now()]);
}
if (feature === Feature.DeArrowTitleSubmitter) {
await verifyOldSubmissions(userID, await getVerificationValue(userID, false));
}
} else {
await db.prepare("run", 'DELETE FROM "userFeatures" WHERE "userID" = ? AND "feature" = ?', [userID, feature]);
}


@@ -4,6 +4,7 @@ import { config } from "../config";
import { Request, Response } from "express";
import { isUserVIP } from "../utils/isUserVIP";
import { HashedUserID } from "../types/user.model";
import { Logger } from "../utils/logger";
interface AddUserAsVIPRequest extends Request {
query: {
@@ -34,15 +35,21 @@ export async function addUserAsVIP(req: AddUserAsVIPRequest, res: Response): Pro
// check to see if this user is already a vip
const userIsVIP = await isUserVIP(userID);
if (enabled && !userIsVIP) {
// add them to the vip list
await db.prepare("run", 'INSERT INTO "vipUsers" VALUES(?)', [userID]);
try {
if (enabled && !userIsVIP) {
// add them to the vip list
await db.prepare("run", 'INSERT INTO "vipUsers" VALUES(?)', [userID]);
}
if (!enabled && userIsVIP) {
//remove them from the shadow ban list
await db.prepare("run", 'DELETE FROM "vipUsers" WHERE "userID" = ?', [userID]);
}
return res.sendStatus(200);
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
if (!enabled && userIsVIP) {
//remove them from the shadow ban list
await db.prepare("run", 'DELETE FROM "vipUsers" WHERE "userID" = ?', [userID]);
}
return res.sendStatus(200);
}


@@ -6,6 +6,7 @@ import { ActionType, Category, Service, VideoID } from "../types/segments.model"
import { UserID } from "../types/user.model";
import { getService } from "../utils/getService";
import { config } from "../config";
import { Logger } from "../utils/logger";
interface DeleteLockCategoriesRequest extends Request {
body: {
@@ -53,7 +54,12 @@ export async function deleteLockCategoriesEndpoint(req: DeleteLockCategoriesRequ
});
}
await deleteLockCategories(videoID, categories, actionTypes, getService(service));
try {
await deleteLockCategories(videoID, categories, actionTypes, getService(service));
} catch (e) {
Logger.error(e as string);
return res.status(500);
}
return res.status(200).json({ message: `Removed lock categories entries for video ${videoID}` });
}


@@ -96,10 +96,12 @@ function removeOutdatedDumps(exportPath: string): Promise<void> {
for (const tableName in tableFiles) {
const files = tableFiles[tableName].sort((a, b) => b.timestamp - a.timestamp);
for (let i = 2; i < files.length; i++) {
// remove old file
await unlink(files[i].file).catch((error: any) => {
Logger.error(`[dumpDatabase] Garbage collection failed ${error}`);
});
if (!latestDumpFiles.some((file) => file.fileName === files[i].file.match(/[^/]+$/)[0])) {
// remove old file
await unlink(files[i].file).catch((error: any) => {
Logger.error(`[dumpDatabase] Garbage collection failed ${error}`);
});
}
}
}
resolve();
@@ -164,18 +166,23 @@ export default async function dumpDatabase(req: Request, res: Response, showPage
<hr/>
${updateQueued ? `Update queued.` : ``} Last updated: ${lastUpdate ? new Date(lastUpdate).toUTCString() : `Unknown`}`);
} else {
res.send({
dbVersion: await getDbVersion(),
lastUpdated: lastUpdate,
updateQueued,
links: latestDumpFiles.map((item:any) => {
return {
table: item.tableName,
url: `/database/${item.tableName}.csv`,
size: item.fileSize,
};
}),
});
try {
res.send({
dbVersion: await getDbVersion(),
lastUpdated: lastUpdate,
updateQueued,
links: latestDumpFiles.map((item:any) => {
return {
table: item.tableName,
url: `/database/${item.tableName}.csv`,
size: item.fileSize,
};
}),
});
} catch (e) {
Logger.error(e as string);
res.sendStatus(500);
}
}
await queueDump();
@@ -229,11 +236,11 @@ async function queueDump(): Promise<void> {
const fileName = `${table.name}_${startTime}.csv`;
const file = `${appExportPath}/${fileName}`;
await new Promise<string>((resolve) => {
await new Promise<string>((resolve, reject) => {
exec(`psql -c "\\copy (SELECT * FROM \\"${table.name}\\"${table.order ? ` ORDER BY \\"${table.order}\\"` : ``})`
+ ` TO '${file}' WITH (FORMAT CSV, HEADER true);"`, credentials, (error, stdout, stderr) => {
if (error) {
Logger.error(`[dumpDatabase] Failed to dump ${table.name} to ${file} due to ${stderr}`);
reject(`[dumpDatabase] Failed to dump ${table.name} to ${file} due to ${stderr}`);
}
resolve(error ? stderr : stdout);
@@ -248,10 +255,10 @@ async function queueDump(): Promise<void> {
latestDumpFiles = [...dumpFiles];
lastUpdate = startTime;
updateQueued = false;
} catch(e) {
Logger.error(e as string);
} finally {
updateQueued = false;
updateRunning = false;
}
}
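The dump's `exec` wrapper previously resolved no matter what, so the `catch`/`finally` around it never saw failures. A simplified sketch of the fixed pattern (command and helper names are illustrative, not the repo's):

```typescript
import { exec } from "child_process";

// Reject on failure instead of always resolving, so the caller's catch can
// log the error and its finally block can clear the queued/running flags.
function run(command: string): Promise<string> {
    return new Promise<string>((resolve, reject) => {
        exec(command, (error, stdout, stderr) => {
            if (error) reject(new Error(stderr || error.message));
            else resolve(stdout);
        });
    });
}
```

A caller would then use `try { await run(...) } catch (e) { Logger.error(...) } finally { updateQueued = false; updateRunning = false; }`, mirroring the diff above.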


@@ -7,6 +7,8 @@ interface GenerateTokenRequest extends Request {
query: {
code: string;
adminUserID?: string;
total?: string;
key?: string;
},
params: {
type: TokenType;
@@ -14,31 +16,45 @@ interface GenerateTokenRequest extends Request {
}
export async function generateTokenRequest(req: GenerateTokenRequest, res: Response): Promise<Response> {
const { query: { code, adminUserID }, params: { type } } = req;
const { query: { code, adminUserID, total, key }, params: { type } } = req;
const adminUserIDHash = adminUserID ? (await getHashCache(adminUserID)) : null;
if (!code || !type) {
if (!type || (!code && type === TokenType.patreon)) {
return res.status(400).send("Invalid request");
}
if (type === TokenType.patreon || (type === TokenType.local && adminUserIDHash === config.adminUserID)) {
const licenseKey = await createAndSaveToken(type, code);
if (type === TokenType.free && (!key || Math.abs(Date.now() - parseInt(key)) > 1000 * 60 * 60 * 24)) {
return res.status(400).send("Invalid request");
}
if (type === TokenType.patreon
|| ([TokenType.local, TokenType.gift].includes(type) && adminUserIDHash === config.adminUserID)
|| type === TokenType.free) {
const licenseKeys = await createAndSaveToken(type, code, adminUserIDHash === config.adminUserID ? parseInt(total) : 1);
/* istanbul ignore else */
if (licenseKey) {
return res.status(200).send(`
<h1>
Your license key:
</h1>
<p>
<b>
${licenseKey}
</b>
</p>
<p>
Copy this into the textbox in the other tab
</p>
`);
if (licenseKeys) {
if (type === TokenType.patreon) {
return res.status(200).send(`
<h1>
Your license key:
</h1>
<p>
<b>
${licenseKeys[0]}
</b>
</p>
<p>
Copy this into the textbox in the other tab
</p>
`);
} else if (type === TokenType.free) {
return res.status(200).send({
licenseKey: licenseKeys[0]
});
} else {
return res.status(200).send(licenseKeys.join("<br/>"));
}
} else {
return res.status(401).send(`
<h1>


@@ -3,7 +3,7 @@ import { isEmpty } from "lodash";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
import { Postgres } from "../databases/Postgres";
import { BrandingDBSubmission, BrandingHashDBResult, BrandingResult, ThumbnailDBResult, ThumbnailResult, TitleDBResult, TitleResult } from "../types/branding.model";
import { BrandingDBSubmission, BrandingDBSubmissionData, BrandingHashDBResult, BrandingResult, BrandingSegmentDBResult, BrandingSegmentHashDBResult, CasualVoteDBResult, CasualVoteHashDBResult, ThumbnailDBResult, ThumbnailResult, TitleDBResult, TitleResult } from "../types/branding.model";
import { HashedIP, IPAddress, Service, VideoID, VideoIDHash, Visibility } from "../types/segments.model";
import { shuffleArray } from "../utils/array";
import { getHashCache } from "../utils/getHashCache";
@@ -14,35 +14,70 @@ import { Logger } from "../utils/logger";
import { promiseOrTimeout } from "../utils/promise";
import { QueryCacher } from "../utils/queryCacher";
import { brandingHashKey, brandingIPKey, brandingKey } from "../utils/redisKeys";
import * as SeedRandom from "seedrandom";
import { getEtag } from "../middleware/etag";
enum BrandingSubmissionType {
Title = "title",
Thumbnail = "thumbnail"
}
export async function getVideoBranding(res: Response, videoID: VideoID, service: Service, ip: IPAddress): Promise<BrandingResult> {
export async function getVideoBranding(res: Response, videoID: VideoID, service: Service, ip: IPAddress, returnUserID: boolean, fetchAll: boolean): Promise<BrandingResult> {
const getTitles = () => db.prepare(
"all",
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID"
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."downvotes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID", "titleVotes"."verification", "titles"."userID"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."videoID" = ? AND "titles"."service" = ? AND "titleVotes"."votes" > -2`,
WHERE "titles"."videoID" = ? AND "titles"."service" = ? AND "titleVotes"."votes" > -1 AND "titleVotes"."votes" - "titleVotes"."downvotes" > -2 AND "titleVotes"."removed" = 0`,
[videoID, service],
{ useReplica: true }
) as Promise<TitleDBResult[]>;
const getThumbnails = () => db.prepare(
"all",
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID"
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."downvotes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID", "thumbnails"."userID"
FROM "thumbnails" LEFT JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" LEFT JOIN "thumbnailTimestamps" ON "thumbnails"."UUID" = "thumbnailTimestamps"."UUID"
WHERE "thumbnails"."videoID" = ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" > -2`,
WHERE "thumbnails"."videoID" = ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" - "thumbnailVotes"."downvotes" > -2 AND "thumbnailVotes"."removed" = 0
ORDER BY "thumbnails"."timeSubmitted" ASC`,
[videoID, service],
{ useReplica: true }
) as Promise<ThumbnailDBResult[]>;
const getBranding = async () => ({
titles: await getTitles(),
thumbnails: await getThumbnails()
});
const getSegments = () => db.prepare(
"all",
`SELECT "startTime", "endTime", "category", "videoDuration" FROM "sponsorTimes"
WHERE "votes" > -2 AND "shadowHidden" = 0 AND "hidden" = 0 AND "actionType" = 'skip' AND "videoID" = ? AND "service" = ?
ORDER BY "timeSubmitted" ASC`,
[videoID, service],
{ useReplica: true }
) as Promise<BrandingSegmentDBResult[]>;
const getCasualVotes = () => db.prepare(
"all",
`SELECT "casualVotes"."category", "casualVotes"."upvotes", "casualVoteTitles"."title"
FROM "casualVotes" LEFT JOIN "casualVoteTitles" ON "casualVotes"."videoID" = "casualVoteTitles"."videoID" AND "casualVotes"."service" = "casualVoteTitles"."service" AND "casualVotes"."titleID" = "casualVoteTitles"."id"
WHERE "casualVotes"."videoID" = ? AND "casualVotes"."service" = ?
ORDER BY "casualVotes"."timeSubmitted" ASC`,
[videoID, service],
{ useReplica: true }
) as Promise<CasualVoteDBResult[]>;
const getBranding = async () => {
const titles = getTitles();
const thumbnails = getThumbnails();
const segments = getSegments();
const casualVotes = getCasualVotes();
for (const title of await titles) {
title.title = title.title.replaceAll("<", "");
}
return {
titles: await titles,
thumbnails: await thumbnails,
segments: await segments,
casualVotes: await casualVotes
};
};
const brandingTrace = await QueryCacher.getTraced(getBranding, brandingKey(videoID, service));
const branding = brandingTrace.data;
@@ -62,52 +97,89 @@ export async function getVideoBranding(res: Response, videoID: VideoID, service:
currentIP: null as Promise<HashedIP> | null
};
return filterAndSortBranding(branding.titles, branding.thumbnails, ip, cache);
return filterAndSortBranding(videoID, returnUserID, fetchAll, branding.titles,
branding.thumbnails, branding.segments, branding.casualVotes, ip, cache);
}
export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, service: Service, ip: IPAddress): Promise<Record<VideoID, BrandingResult>> {
export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, service: Service, ip: IPAddress, returnUserID: boolean, fetchAll: boolean): Promise<Record<VideoID, BrandingResult>> {
const getTitles = () => db.prepare(
"all",
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID"
`SELECT "titles"."title", "titles"."original", "titleVotes"."votes", "titleVotes"."downvotes", "titleVotes"."locked", "titleVotes"."shadowHidden", "titles"."UUID", "titles"."videoID", "titles"."hashedVideoID", "titleVotes"."verification"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."hashedVideoID" LIKE ? AND "titles"."service" = ? AND "titleVotes"."votes" > -2`,
WHERE "titles"."hashedVideoID" LIKE ? AND "titles"."service" = ? AND "titleVotes"."votes" > -1 AND "titleVotes"."votes" - "titleVotes"."downvotes" > -2 AND "titleVotes"."removed" = 0`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
) as Promise<TitleDBResult[]>;
const getThumbnails = () => db.prepare(
"all",
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID"
`SELECT "thumbnailTimestamps"."timestamp", "thumbnails"."original", "thumbnailVotes"."votes", "thumbnailVotes"."downvotes", "thumbnailVotes"."locked", "thumbnailVotes"."shadowHidden", "thumbnails"."UUID", "thumbnails"."videoID", "thumbnails"."hashedVideoID"
FROM "thumbnails" LEFT JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" LEFT JOIN "thumbnailTimestamps" ON "thumbnails"."UUID" = "thumbnailTimestamps"."UUID"
WHERE "thumbnails"."hashedVideoID" LIKE ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" > -2`,
WHERE "thumbnails"."hashedVideoID" LIKE ? AND "thumbnails"."service" = ? AND "thumbnailVotes"."votes" - "thumbnailVotes"."downvotes" > -2 AND "thumbnailVotes"."removed" = 0
ORDER BY "thumbnails"."timeSubmitted" ASC`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
) as Promise<ThumbnailDBResult[]>;
const getSegments = () => db.prepare(
"all",
`SELECT "videoID", "startTime", "endTime", "category", "videoDuration" FROM "sponsorTimes"
WHERE "votes" > -2 AND "shadowHidden" = 0 AND "hidden" = 0 AND "actionType" = 'skip' AND "hashedVideoID" LIKE ? AND "service" = ?
ORDER BY "timeSubmitted" ASC`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
) as Promise<BrandingSegmentHashDBResult[]>;
const getCasualVotes = () => db.prepare(
"all",
`SELECT "casualVotes"."videoID", "casualVotes"."category", "casualVotes"."upvotes", "casualVoteTitles"."title"
FROM "casualVotes" LEFT JOIN "casualVoteTitles" ON "casualVotes"."videoID" = "casualVoteTitles"."videoID" AND "casualVotes"."service" = "casualVoteTitles"."service" AND "casualVotes"."titleID" = "casualVoteTitles"."id"
WHERE "casualVotes"."hashedVideoID" LIKE ? AND "casualVotes"."service" = ?
ORDER BY "casualVotes"."timeSubmitted" ASC`,
[`${videoHashPrefix}%`, service],
{ useReplica: true }
) as Promise<CasualVoteHashDBResult[]>;
const branding = await QueryCacher.get(async () => {
// Make sure they are both called in parallel
const branding = {
titles: getTitles(),
thumbnails: getThumbnails()
thumbnails: getThumbnails(),
segments: getSegments(),
casualVotes: getCasualVotes()
};
const dbResult: Record<VideoID, BrandingHashDBResult> = {};
const initResult = (submission: BrandingDBSubmission) => {
const initResult = (submission: BrandingDBSubmissionData) => {
dbResult[submission.videoID] = dbResult[submission.videoID] || {
titles: [],
thumbnails: []
thumbnails: [],
segments: [],
casualVotes: []
};
};
(await branding.titles).map((title) => {
(await branding.titles).forEach((title) => {
title.title = title.title.replaceAll("<", "");
initResult(title);
dbResult[title.videoID].titles.push(title);
});
(await branding.thumbnails).map((thumbnail) => {
(await branding.thumbnails).forEach((thumbnail) => {
initResult(thumbnail);
dbResult[thumbnail.videoID].thumbnails.push(thumbnail);
});
(await branding.segments).forEach((segment) => {
initResult(segment);
dbResult[segment.videoID].segments.push(segment);
});
(await branding.casualVotes).forEach((casualVote) => {
initResult(casualVote);
dbResult[casualVote.videoID].casualVotes.push(casualVote);
});
return dbResult;
}, brandingHashKey(videoHashPrefix, service));
@@ -119,41 +191,62 @@ export async function getVideoBrandingByHash(videoHashPrefix: VideoIDHash, servi
const processedResult: Record<VideoID, BrandingResult> = {};
await Promise.all(Object.keys(branding).map(async (key) => {
const castedKey = key as VideoID;
processedResult[castedKey] = await filterAndSortBranding(branding[castedKey].titles, branding[castedKey].thumbnails, ip, cache);
processedResult[castedKey] = await filterAndSortBranding(castedKey, returnUserID, fetchAll, branding[castedKey].titles,
branding[castedKey].thumbnails, branding[castedKey].segments, branding[castedKey].casualVotes, ip, cache);
}));
return processedResult;
}
async function filterAndSortBranding(dbTitles: TitleDBResult[], dbThumbnails: ThumbnailDBResult[], ip: IPAddress, cache: { currentIP: Promise<HashedIP> | null }): Promise<BrandingResult> {
async function filterAndSortBranding(videoID: VideoID, returnUserID: boolean, fetchAll: boolean, dbTitles: TitleDBResult[],
dbThumbnails: ThumbnailDBResult[], dbSegments: BrandingSegmentDBResult[], dbCasualVotes: CasualVoteDBResult[],
ip: IPAddress, cache: { currentIP: Promise<HashedIP> | null }): Promise<BrandingResult> {
const shouldKeepTitles = shouldKeepSubmission(dbTitles, BrandingSubmissionType.Title, ip, cache);
const shouldKeepThumbnails = shouldKeepSubmission(dbThumbnails, BrandingSubmissionType.Thumbnail, ip, cache);
const titles = shuffleArray(dbTitles.filter(await shouldKeepTitles))
.sort((a, b) => b.votes - a.votes)
.sort((a, b) => b.locked - a.locked)
.map((r) => ({
title: r.title,
original: r.original === 1,
votes: r.votes,
votes: r.votes + r.verification - r.downvotes,
locked: r.locked === 1,
UUID: r.UUID,
})) as TitleResult[];
userID: returnUserID ? r.userID : undefined
}))
.filter((a) => fetchAll || a.votes >= 0 || a.locked)
.sort((a, b) => b.votes - a.votes)
.sort((a, b) => +b.locked - +a.locked) as TitleResult[];
const thumbnails = shuffleArray(dbThumbnails.filter(await shouldKeepThumbnails))
const thumbnails = dbThumbnails.filter(await shouldKeepThumbnails)
.sort((a, b) => +a.original - +b.original)
.sort((a, b) => b.votes - a.votes)
.sort((a, b) => b.locked - a.locked)
.map((r) => ({
timestamp: r.timestamp,
original: r.original === 1,
votes: r.votes,
votes: r.votes - r.downvotes,
locked: r.locked === 1,
UUID: r.UUID
})) as ThumbnailResult[];
UUID: r.UUID,
userID: returnUserID ? r.userID : undefined
}))
.filter((a) => (fetchAll && !a.original) || a.votes >= 1 || (a.votes >= 0 && !a.original) || a.locked) as ThumbnailResult[];
const casualDownvotes = dbCasualVotes.filter((r) => r.category === "downvote")[0];
const casualVotes = dbCasualVotes.filter((r) => r.category !== "downvote").map((r) => ({
id: r.category,
count: r.upvotes - (casualDownvotes?.upvotes ?? 0),
title: r.title
})).filter((a) => a.count > 0);
const videoDuration = dbSegments.filter(s => s.videoDuration !== 0)[0]?.videoDuration ?? null;
return {
titles,
thumbnails
thumbnails,
casualVotes,
randomTime: findRandomTime(videoID, dbSegments, videoDuration),
videoDuration: videoDuration,
};
}
@@ -161,7 +254,7 @@ async function shouldKeepSubmission(submissions: BrandingDBSubmission[], type: B
cache: { currentIP: Promise<HashedIP> | null }): Promise<(_: unknown, index: number) => boolean> {
const shouldKeep = await Promise.all(submissions.map(async (s) => {
if (s.shadowHidden != Visibility.HIDDEN) return true;
if (s.shadowHidden === Visibility.VISIBLE) return true;
const table = type === BrandingSubmissionType.Title ? "titleVotes" : "thumbnailVotes";
const fetchData = () => privateDB.prepare("get", `SELECT "hashedIP" FROM "${table}" WHERE "UUID" = ?`,
[s.UUID], { useReplica: true }) as Promise<{ hashedIP: HashedIP }>;
@@ -170,9 +263,11 @@ async function shouldKeepSubmission(submissions: BrandingDBSubmission[], type: B
if (cache.currentIP === null) cache.currentIP = getHashCache((ip + config.globalSalt) as IPAddress);
const hashedIP = await cache.currentIP;
return submitterIP.hashedIP !== hashedIP;
return submitterIP?.hashedIP === hashedIP;
} catch (e) {
// give up on shadow hide for now
Logger.error(`getBranding: Error while trying to find IP: ${e}`);
return false;
}
}));
@@ -180,9 +275,59 @@ async function shouldKeepSubmission(submissions: BrandingDBSubmission[], type: B
return (_, index) => shouldKeep[index];
}
export function findRandomTime(videoID: VideoID, segments: BrandingSegmentDBResult[], videoDuration: number): number {
let randomTime = SeedRandom.alea(videoID)();
// Don't allow random times past 90% of the video if no endcard
if (!segments.some((s) => s.category === "outro") && randomTime > 0.9) {
randomTime -= 0.9;
}
if (segments.length === 0) return randomTime;
videoDuration ||= Math.max(...segments.map((s) => s.endTime)); // use highest end time as a fallback here
// There are segments, treat this as a relative time in the chopped up video
const sorted = segments.sort((a, b) => a.startTime - b.startTime);
const emptySegments: [number, number][] = [];
let totalTime = 0;
let nextEndTime = 0;
for (const segment of sorted) {
if (segment.startTime > nextEndTime) {
emptySegments.push([nextEndTime, segment.startTime]);
totalTime += segment.startTime - nextEndTime;
}
nextEndTime = Math.max(segment.endTime, nextEndTime);
}
if (nextEndTime < videoDuration) {
emptySegments.push([nextEndTime, videoDuration]);
totalTime += videoDuration - nextEndTime;
}
let cursor = 0;
for (const segment of emptySegments) {
const duration = segment[1] - segment[0];
if (cursor + duration >= randomTime * totalTime) {
// Found it
return (segment[0] + (randomTime * totalTime - cursor)) / videoDuration;
}
cursor += duration;
}
// Fallback to just the random time
return randomTime;
}
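The core of `findRandomTime` maps a uniform random fraction into the portions of the video not covered by skip segments, then normalizes the result back to a fraction of the full duration. A standalone sketch of that gap-mapping step, taking the random fraction as a parameter instead of deriving it from the seeded RNG:

```typescript
interface Seg { startTime: number; endTime: number }

// Map a fraction in [0, 1) of the *uncovered* time onto a fraction of the
// full video duration, skipping over the covered segments.
function mapToUncovered(randomTime: number, segments: Seg[], videoDuration: number): number {
    const sorted = [...segments].sort((a, b) => a.startTime - b.startTime);
    const gaps: [number, number][] = [];
    let totalTime = 0;
    let nextEndTime = 0;
    for (const s of sorted) {
        if (s.startTime > nextEndTime) {
            gaps.push([nextEndTime, s.startTime]);
            totalTime += s.startTime - nextEndTime;
        }
        nextEndTime = Math.max(s.endTime, nextEndTime);
    }
    if (nextEndTime < videoDuration) {
        gaps.push([nextEndTime, videoDuration]);
        totalTime += videoDuration - nextEndTime;
    }
    let cursor = 0;
    for (const [start, end] of gaps) {
        const duration = end - start;
        if (cursor + duration >= randomTime * totalTime) {
            return (start + (randomTime * totalTime - cursor)) / videoDuration;
        }
        cursor += duration;
    }
    return randomTime; // fallback, as in the original
}
```

For a 100s video with one segment at 40–60s, the uncovered time is 80s, so a fraction of 0.75 lands 60s into the uncovered timeline, i.e. 80s of real time, or 0.8 of the duration.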
export async function getBranding(req: Request, res: Response) {
const videoID: VideoID = req.query.videoID as VideoID;
const service: Service = getService(req.query.service as string);
const returnUserID = req.query.returnUserID === "true";
const fetchAll = req.query.fetchAll === "true";
if (!videoID) {
return res.status(400).send("Missing parameter: videoID");
@@ -190,9 +335,13 @@ export async function getBranding(req: Request, res: Response) {
const ip = getIP(req);
try {
const result = await getVideoBranding(res, videoID, service, ip);
const result = await getVideoBranding(res, videoID, service, ip, returnUserID, fetchAll);
const status = result.titles.length > 0 || result.thumbnails.length > 0 ? 200 : 404;
await getEtag("branding", (videoID as string), service)
.then(etag => res.set("ETag", etag))
.catch(() => null);
const status = result.titles.length > 0 || result.thumbnails.length > 0 || result.casualVotes.length > 0 ? 200 : 404;
return res.status(status).json(result);
} catch (e) {
Logger.error(e as string);
@@ -209,9 +358,15 @@ export async function getBrandingByHashEndpoint(req: Request, res: Response) {
const service: Service = getService(req.query.service as string);
const ip = getIP(req);
const returnUserID = req.query.returnUserID === "true";
const fetchAll = req.query.fetchAll === "true";
try {
const result = await getVideoBrandingByHash(hashPrefix, service, ip);
const result = await getVideoBrandingByHash(hashPrefix, service, ip, returnUserID, fetchAll);
await getEtag("brandingHash", (hashPrefix as string), service)
.then(etag => res.set("ETag", etag))
.catch(() => null);
const status = !isEmpty(result) ? 200 : 404;
return res.status(status).json(result);
@@ -219,4 +374,4 @@ export async function getBrandingByHashEndpoint(req: Request, res: Response) {
Logger.error(e as string);
return res.status(500).send([]);
}
}
}


@@ -0,0 +1,82 @@
/* istanbul ignore file */
import { db } from "../databases/databases";
import { Request, Response } from "express";
import axios from "axios";
import { Logger } from "../utils/logger";
import { getCWSUsers, getChromeUsers } from "../utils/getCWSUsers";
// A cache of the number of chrome web store users
let chromeUsersCache = 30000;
let firefoxUsersCache = 0;
interface DBStatsData {
userCount: number,
titles: number,
thumbnails: number,
}
let lastFetch: DBStatsData = {
userCount: 0,
titles: 0,
thumbnails: 0
};
updateExtensionUsers();
export async function getBrandingStats(req: Request, res: Response): Promise<void> {
try {
const row = await getStats();
lastFetch = row;
/* istanbul ignore if */
if (!row) res.sendStatus(500);
const extensionUsers = chromeUsersCache + firefoxUsersCache;
//send this result
res.send({
userCount: row.userCount ?? 0,
activeUsers: extensionUsers,
titles: row.titles,
thumbnails: row.thumbnails,
});
} catch (e) {
Logger.error(e as string);
res.sendStatus(500);
}
}
async function getStats(): Promise<DBStatsData> {
if (db.highLoad()) {
return Promise.resolve(lastFetch);
} else {
const userCount = (await db.prepare("get", `SELECT COUNT(DISTINCT "userID") as "userCount" FROM titles`, []))?.userCount;
const titles = (await db.prepare("get", `SELECT COUNT(*) as "titles" FROM titles`, []))?.titles;
const thumbnails = (await db.prepare("get", `SELECT COUNT(*) as "thumbnails" FROM thumbnails`, []))?.thumbnails;
return {
userCount: userCount ?? 0,
titles: titles ?? 0,
thumbnails: thumbnails ?? 0
};
}
}
function updateExtensionUsers() {
const mozillaAddonsUrl = "https://addons.mozilla.org/api/v3/addons/addon/dearrow/";
const chromeExtensionUrl = "https://chromewebstore.google.com/detail/dearrow-better-titles-and/enamippconapkdmgfgjchkhakpfinmaj";
const chromeExtId = "enamippconapkdmgfgjchkhakpfinmaj";
axios.get(mozillaAddonsUrl)
.then(res => firefoxUsersCache = res.data.average_daily_users )
.catch( /* istanbul ignore next */ () => {
Logger.debug(`Failing to connect to ${mozillaAddonsUrl}`);
return 0;
});
getCWSUsers(chromeExtId)
.then(res => chromeUsersCache = res)
.catch(/* istanbul ignore next */ () =>
getChromeUsers(chromeExtensionUrl)
.then(res => chromeUsersCache = res)
);
}
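`updateExtensionUsers` above only overwrites the module-level caches when a fetch succeeds, so a failed AMO or CWS request keeps serving the last known count (or the seeded default). A small sketch of that fallback pattern, with `fetchCount` as a stand-in for the real HTTP calls:

```typescript
let cachedUsers = 30000; // seeded default, as in the route

// Only overwrite the cache on success; on failure, keep the previous value.
async function refreshUsers(fetchCount: () => Promise<number>): Promise<number> {
    try {
        cachedUsers = await fetchCount();
    } catch {
        // network failure: serve the last known count
    }
    return cachedUsers;
}
```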


@@ -22,15 +22,16 @@ export async function getChapterNames(req: Request, res: Response): Promise<Resp
const descriptions = await db.prepare("all", `
SELECT "description"
FROM "sponsorTimes"
WHERE ("locked" = 1 OR "votes" > 0 OR ("views" > 25 AND "votes" >= 0)) AND "videoID" IN (
WHERE ("locked" = 1 OR "votes" >= 0) AND "videoID" IN (
SELECT "videoID"
FROM "videoInfo"
WHERE "channelID" = ?
) AND "description" != ''
AND similarity("description", ?) >= 0.1
GROUP BY "description"
ORDER BY SUM("votes"), similarity("description", ?) DESC
LIMIT 5;`
, [channelID, description]) as { description: string }[];
, [channelID, description, description]) as { description: string }[];
if (descriptions?.length > 0) {
return res.status(200).json(descriptions.map(d => ({

src/routes/getConfig.ts Normal file

@@ -0,0 +1,35 @@
import { getHashCache } from "../utils/getHashCache";
import { Request, Response } from "express";
import { isUserVIP } from "../utils/isUserVIP";
import { UserID } from "../types/user.model";
import { Logger } from "../utils/logger";
import { getServerConfig } from "../utils/serverConfig";
export async function getConfigEndpoint(req: Request, res: Response): Promise<Response> {
const userID = req.query.userID as string;
const key = req.query.key as string;
if (!userID || !key) {
// invalid request
return res.sendStatus(400);
}
// hash the userID
const hashedUserID = await getHashCache(userID as UserID);
const isVIP = (await isUserVIP(hashedUserID));
if (!isVIP) {
// not authorized
return res.sendStatus(403);
}
try {
return res.status(200).json({
value: await getServerConfig(key)
});
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
}


@@ -1,17 +1,23 @@
import { db } from "../databases/databases";
import { Request, Response } from "express";
import { Logger } from "../utils/logger";
export async function getDaysSavedFormatted(req: Request, res: Response): Promise<Response> {
const row = await db.prepare("get", 'SELECT SUM(("endTime" - "startTime") / 60 / 60 / 24 * "views") as "daysSaved" from "sponsorTimes" where "shadowHidden" != 1', []);
try {
const row = await db.prepare("get", 'SELECT SUM(("endTime" - "startTime") / 60 / 60 / 24 * "views") as "daysSaved" from "sponsorTimes" where "shadowHidden" != 1', []);
if (row !== undefined) {
//send this result
return res.send({
daysSaved: row.daysSaved?.toFixed(2) ?? "0",
});
} else {
return res.send({
daysSaved: 0
});
if (row !== undefined) {
//send this result
return res.send({
daysSaved: row.daysSaved?.toFixed(2) ?? "0",
});
} else {
return res.send({
daysSaved: 0
});
}
} catch (err) {
Logger.error(err as string);
return res.sendStatus(500);
}
}


@@ -0,0 +1,15 @@
import { config } from "../config";
import { Request, Response } from "express";
export function getFeatureFlag(req: Request, res: Response): Response {
const { params: { name } } = req;
switch (name) {
case "deArrowPaywall":
return res.status(200).json({
enabled: config.deArrowPaywall,
});
}
return res.status(404).json();
}

src/routes/getMetrics.ts Normal file

@@ -0,0 +1,106 @@
import { db, privateDB } from "../databases/databases";
import { Request, Response } from "express";
import os from "os";
import redis, { getRedisStats } from "../utils/redis";
import { Postgres } from "../databases/Postgres";
import { Server } from "http";
export async function getMetrics(req: Request, res: Response, server: Server): Promise<Response> {
const redisStats = getRedisStats();
return res.type("text").send([
`# HELP sb_uptime Uptime of this instance`,
`# TYPE sb_uptime counter`,
`sb_uptime ${process.uptime()}`,
`# HELP sb_db_version The version of the database`,
`# TYPE sb_db_version counter`,
`sb_db_version ${await db.prepare("get", "SELECT key, value FROM config where key = ?", ["version"]).then(e => e.value).catch(() => -1)}`,
`# HELP sb_start_time The time this instance was started`,
`# TYPE sb_start_time gauge`,
`sb_start_time ${Date.now()}`,
`# HELP sb_loadavg_5 The 5 minute load average of the system`,
`# TYPE sb_loadavg_5 gauge`,
`sb_loadavg_5 ${os.loadavg()[1]}`,
`# HELP sb_loadavg_15 The 15 minute load average of the system`,
`# TYPE sb_loadavg_15 gauge`,
`sb_loadavg_15 ${os.loadavg()[2]}`,
`# HELP sb_connections The number of connections to this instance`,
`# TYPE sb_connections gauge`,
`sb_connections ${await new Promise((resolve) => server.getConnections((_, count) => resolve(count)) as any)}`,
`# HELP sb_status_requests The number of status requests made to this instance`,
`# TYPE sb_status_requests gauge`,
`sb_status_requests ${await redis.increment("statusRequest").then(e => e[0]).catch(() => -1)}`,
`# HELP sb_postgres_active_requests The number of active requests to the postgres database`,
`# TYPE sb_postgres_active_requests gauge`,
`sb_postgres_active_requests ${(db as Postgres)?.getStats?.()?.activeRequests ?? -1}`,
`# HELP sb_postgres_avg_read_time The average read time of the postgres database`,
`# TYPE sb_postgres_avg_read_time gauge`,
`sb_postgres_avg_read_time ${(db as Postgres)?.getStats?.()?.avgReadTime ?? -1}`,
`# HELP sb_postgres_avg_write_time The average write time of the postgres database`,
`# TYPE sb_postgres_avg_write_time gauge`,
`sb_postgres_avg_write_time ${(db as Postgres)?.getStats?.()?.avgWriteTime ?? -1}`,
`# HELP sb_postgres_avg_failed_time The average failed time of the postgres database`,
`# TYPE sb_postgres_avg_failed_time gauge`,
`sb_postgres_avg_failed_time ${(db as Postgres)?.getStats?.()?.avgFailedTime ?? -1}`,
`# HELP sb_postgres_pool_total The total number of connections in the postgres pool`,
`# TYPE sb_postgres_pool_total gauge`,
`sb_postgres_pool_total ${(db as Postgres)?.getStats?.()?.pool?.total ?? -1}`,
`# HELP sb_postgres_pool_idle The number of idle connections in the postgres pool`,
`# TYPE sb_postgres_pool_idle gauge`,
`sb_postgres_pool_idle ${(db as Postgres)?.getStats?.()?.pool?.idle ?? -1}`,
`# HELP sb_postgres_pool_waiting The number of connections waiting in the postgres pool`,
`# TYPE sb_postgres_pool_waiting gauge`,
`sb_postgres_pool_waiting ${(db as Postgres)?.getStats?.()?.pool?.waiting ?? -1}`,
`# HELP sb_postgres_private_active_requests The number of active requests to the private postgres database`,
`# TYPE sb_postgres_private_active_requests gauge`,
`sb_postgres_private_active_requests ${(privateDB as Postgres)?.getStats?.()?.activeRequests ?? -1}`,
`# HELP sb_postgres_private_avg_read_time The average read time of the private postgres database`,
`# TYPE sb_postgres_private_avg_read_time gauge`,
`sb_postgres_private_avg_read_time ${(privateDB as Postgres)?.getStats?.()?.avgReadTime ?? -1}`,
`# HELP sb_postgres_private_avg_write_time The average write time of the private postgres database`,
`# TYPE sb_postgres_private_avg_write_time gauge`,
`sb_postgres_private_avg_write_time ${(privateDB as Postgres)?.getStats?.()?.avgWriteTime ?? -1}`,
`# HELP sb_postgres_private_avg_failed_time The average failed time of the private postgres database`,
`# TYPE sb_postgres_private_avg_failed_time gauge`,
`sb_postgres_private_avg_failed_time ${(privateDB as Postgres)?.getStats?.()?.avgFailedTime ?? -1}`,
`# HELP sb_postgres_private_pool_total The total number of connections in the private postgres pool`,
`# TYPE sb_postgres_private_pool_total gauge`,
`sb_postgres_private_pool_total ${(privateDB as Postgres)?.getStats?.()?.pool?.total ?? -1}`,
`# HELP sb_postgres_private_pool_idle The number of idle connections in the private postgres pool`,
`# TYPE sb_postgres_private_pool_idle gauge`,
`sb_postgres_private_pool_idle ${(privateDB as Postgres)?.getStats?.()?.pool?.idle ?? -1}`,
`# HELP sb_postgres_private_pool_waiting The number of connections waiting in the private postgres pool`,
`# TYPE sb_postgres_private_pool_waiting gauge`,
`sb_postgres_private_pool_waiting ${(privateDB as Postgres)?.getStats?.()?.pool?.waiting ?? -1}`,
`# HELP sb_redis_active_requests The number of active requests to redis`,
`# TYPE sb_redis_active_requests gauge`,
`sb_redis_active_requests ${redisStats.activeRequests}`,
`# HELP sb_redis_write_requests The number of write requests to redis`,
`# TYPE sb_redis_write_requests gauge`,
`sb_redis_write_requests ${redisStats.writeRequests}`,
`# HELP sb_redis_avg_read_time The average read time of redis`,
`# TYPE sb_redis_avg_read_time gauge`,
`sb_redis_avg_read_time ${redisStats?.avgReadTime}`,
`# HELP sb_redis_avg_write_time The average write time of redis`,
`# TYPE sb_redis_avg_write_time gauge`,
`sb_redis_avg_write_time ${redisStats.avgWriteTime}`,
`# HELP sb_redis_memory_cache_hits The cache hit ratio in redis`,
`# TYPE sb_redis_memory_cache_hits gauge`,
`sb_redis_memory_cache_hits ${redisStats.memoryCacheHits}`,
`# HELP sb_redis_memory_cache_total_hits The cache hit ratio in redis including uncached items`,
`# TYPE sb_redis_memory_cache_total_hits gauge`,
`sb_redis_memory_cache_total_hits ${redisStats.memoryCacheTotalHits}`,
`# HELP sb_redis_memory_cache_length The length of the memory cache in redis`,
`# TYPE sb_redis_memory_cache_length gauge`,
`sb_redis_memory_cache_length ${redisStats.memoryCacheLength}`,
`# HELP sb_redis_memory_cache_size The size of the memory cache in redis`,
`# TYPE sb_redis_memory_cache_size gauge`,
`sb_redis_memory_cache_size ${redisStats.memoryCacheSize}`,
`# HELP sb_redis_last_invalidation The time of the last successful invalidation in redis`,
`# TYPE sb_redis_last_invalidation gauge`,
`sb_redis_last_invalidation ${redisStats.lastInvalidation}`,
`# HELP sb_redis_last_invalidation_message The time of the last invalidation message in redis`,
`# TYPE sb_redis_last_invalidation_message gauge`,
`sb_redis_last_invalidation_message ${redisStats.lastInvalidationMessage}`,
].join("\n"));
}
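Each metric in the route follows the Prometheus text exposition format: a `# HELP` line, a `# TYPE` line, then the sample itself, all joined with newlines. A sketch of that triple (`formatMetric` is a hypothetical helper, not part of the codebase):

```typescript
// Builds one metric in Prometheus text exposition format.
function formatMetric(name: string, help: string, type: "counter" | "gauge", value: number): string {
    return [
        `# HELP ${name} ${help}`,
        `# TYPE ${name} ${type}`,
        `${name} ${value}`,
    ].join("\n");
}
```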

src/routes/getReady.ts Normal file

@@ -0,0 +1,26 @@
import { Request, Response } from "express";
import { Server } from "http";
import { config } from "../config";
import { getRedisStats } from "../utils/redis";
import { Postgres } from "../databases/Postgres";
import { db } from "../databases/databases";
export async function getReady(req: Request, res: Response, server: Server): Promise<Response> {
const connections = await new Promise((resolve) => server.getConnections((_, count) => resolve(count))) as number;
const redisStats = getRedisStats();
const postgresStats = (db as Postgres).getStats?.();
if (!connections
|| (connections < config.maxConnections
&& (!config.redis || redisStats.activeRequests < config.redis.maxConnections * 0.8)
&& (!config.redis || redisStats.activeRequests < 1 || redisStats.avgReadTime < config.maxResponseTime
|| (redisStats.memoryCacheSize < config.redis.clientCacheSize * 0.8 && redisStats.avgReadTime < config.maxResponseTimeWhileLoadingCache))
&& (!config.postgres || postgresStats.activeRequests < config.postgres.maxActiveRequests * 0.8)
&& (!config.postgres || postgresStats.avgReadTime < config.maxResponseTime
|| (redisStats.memoryCacheSize < config.redis.clientCacheSize * 0.8 && postgresStats.avgReadTime < config.maxResponseTimeWhileLoadingCache)))) {
return res.sendStatus(200);
} else {
return res.sendStatus(500);
}
}
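The readiness condition above combines several load signals. Reduced to its first branch, the shape is: report ready when there are no connections yet (startup), or when the count stays under the configured ceiling. An illustrative sketch under that assumption (names and thresholds are not from the codebase):

```typescript
// Simplified connection-count branch of the readiness check.
function isReady(connections: number, maxConnections: number): boolean {
    // no connections yet (e.g. just started) counts as ready
    return !connections || connections < maxConnections;
}
```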


@@ -0,0 +1,22 @@
import { db } from "../databases/databases";
import { Request, Response } from "express";
import { getService } from "../utils/getService";
export async function getSegmentID(req: Request, res: Response): Promise<Response> {
const partialUUID = req.query?.UUID;
const videoID = req.query?.videoID;
const service = getService(req.query?.service as string);
if (!partialUUID || !videoID) {
//invalid request
return res.sendStatus(400);
}
const data = await db.prepare("get", `SELECT "UUID" from "sponsorTimes" WHERE "UUID" LIKE ? AND "videoID" = ? AND "service" = ?`, [`${partialUUID}%`, videoID, service]);
if (data) {
return res.status(200).send(data.UUID);
} else {
return res.sendStatus(404);
}
}
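The lookup resolves a partial UUID with a SQL `LIKE 'prefix%'` match scoped to the video and service. The same prefix semantics over an in-memory list, for illustration:

```typescript
// In-memory analogue of the LIKE-prefix query; returns the first match or undefined (404).
function findByPartialUUID(uuids: string[], partial: string): string | undefined {
    return uuids.find(uuid => uuid.startsWith(partial));
}
```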


@@ -2,7 +2,7 @@ import { Request, Response } from "express";
import { partition } from "lodash";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
import { skipSegmentsHashKey, skipSegmentsKey, skipSegmentGroupsKey, shadowHiddenIPKey } from "../utils/redisKeys";
import { skipSegmentsHashKey, skipSegmentsKey, skipSegmentGroupsKey, shadowHiddenIPKey, skipSegmentsLargerHashKey } from "../utils/redisKeys";
import { SBRecord } from "../types/lib.model";
import { ActionType, Category, DBSegment, HashedIP, IPAddress, OverlappingSegmentGroup, Segment, SegmentCache, SegmentUUID, Service, VideoData, VideoID, VideoIDHash, Visibility, VotableObject } from "../types/segments.model";
import { getHashCache } from "../utils/getHashCache";
@@ -14,6 +14,9 @@ import { getService } from "../utils/getService";
import { promiseOrTimeout } from "../utils/promise";
import { parseSkipSegments } from "../utils/parseSkipSegments";
import { getEtag } from "../middleware/etag";
import { shuffleArray } from "../utils/array";
import { Postgres } from "../databases/Postgres";
import { getRedisStats } from "../utils/redis";
async function prepareCategorySegments(req: Request, videoID: VideoID, service: Service, segments: DBSegment[], cache: SegmentCache = { shadowHiddenSegmentIPs: {} }, useCache: boolean): Promise<Segment[]> {
const shouldFilter: boolean[] = await Promise.all(segments.map(async (segment) => {
@@ -21,7 +24,9 @@ async function prepareCategorySegments(req: Request, videoID: VideoID, service:
return true; //required - always send
}
if (segment.hidden || segment.votes < -1) {
if (segment.hidden
|| segment.votes < -1
|| segment.shadowHidden === Visibility.MORE_HIDDEN) {
return false; //too untrustworthy, just ignore it
}
@@ -41,20 +46,41 @@ async function prepareCategorySegments(req: Request, videoID: VideoID, service:
             const fetchData = () => privateDB.prepare("all", 'SELECT "hashedIP" FROM "sponsorTimes" WHERE "videoID" = ? AND "timeSubmitted" = ? AND "service" = ?',
                 [videoID, segment.timeSubmitted, service], { useReplica: true }) as Promise<{ hashedIP: HashedIP }[]>;

             try {
-                cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted] = await promiseOrTimeout(QueryCacher.get(fetchData, shadowHiddenIPKey(videoID, segment.timeSubmitted, service)), 150);
+                if (db.highLoad() || privateDB.highLoad()) {
+                    Logger.error("High load, not handling shadowhide");
+                    if (db instanceof Postgres && privateDB instanceof Postgres) {
+                        Logger.error(`Postgres stats: ${JSON.stringify(db.getStats())}`);
+                        Logger.error(`Postgres private stats: ${JSON.stringify(privateDB.getStats())}`);
+                    }
+                    Logger.error(`Redis stats: ${JSON.stringify(getRedisStats())}`);
+
+                    return false;
+                }
+
+                cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted] = promiseOrTimeout(QueryCacher.get(fetchData, shadowHiddenIPKey(videoID, segment.timeSubmitted, service)), 150);
             } catch (e) {
                 // give up on shadowhide for now
                 cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted] = null;
             }
         }

-        const ipList = cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted];
+        let ipList = [];
+        try {
+            ipList = await cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted];
+        } catch (e) {
+            Logger.error(`skipSegments: Error while trying to find IP: ${e}`);
+            if (db instanceof Postgres && privateDB instanceof Postgres) {
+                Logger.error(`Postgres stats: ${JSON.stringify(db.getStats())}`);
+                Logger.error(`Postgres private stats: ${JSON.stringify(privateDB.getStats())}`);
+            }
+
+            return false;
+        }

         if (ipList?.length > 0 && cache.userHashedIP === undefined) {
             cache.userHashedIP = await cache.userHashedIPPromise;
         }

         //if this isn't their ip, don't send it to them
-        const shouldShadowHide = cache.shadowHiddenSegmentIPs[videoID][segment.timeSubmitted]?.some(
+        const shouldShadowHide = ipList?.some(
             (shadowHiddenSegment) => shadowHiddenSegment.hashedIP === cache.userHashedIP) ?? false;

         if (shouldShadowHide) useCache = false;
@@ -124,7 +150,7 @@ async function getSegmentsByVideoID(req: Request, videoID: VideoID, categories:
}
async function getSegmentsByHash(req: Request, hashedVideoIDPrefix: VideoIDHash, categories: Category[],
actionTypes: ActionType[], requiredSegments: SegmentUUID[], service: Service): Promise<SBRecord<VideoID, VideoData>> {
actionTypes: ActionType[], trimUUIDs: number, requiredSegments: SegmentUUID[], service: Service): Promise<SBRecord<VideoID, VideoData>> {
const cache: SegmentCache = { shadowHiddenSegmentIPs: {} };
const segments: SBRecord<VideoID, VideoData> = {};
@@ -156,13 +182,32 @@ async function getSegmentsByHash(req: Request, hashedVideoIDPrefix: VideoIDHash,
         };

         const canUseCache = requiredSegments.length === 0;
-        data.segments = (await prepareCategorySegments(req, videoID as VideoID, service, videoData.segments, cache, canUseCache))
-            .filter((segment: Segment) => categories.includes(segment?.category) && actionTypes.includes(segment?.actionType))
+        const filteredSegments = (await prepareCategorySegments(req, videoID as VideoID, service, videoData.segments, cache, canUseCache))
+            .filter((segment: Segment) => categories.includes(segment?.category) && actionTypes.includes(segment?.actionType));
+
+        // Make sure no hash duplicates exist
+        if (trimUUIDs) {
+            const seen = new Set<string>();
+            for (const segment of filteredSegments) {
+                const shortUUID = segment.UUID.substring(0, trimUUIDs);
+                if (seen.has(shortUUID)) {
+                    // Duplicate found, disable trimming
+                    trimUUIDs = undefined;
+                    break;
+                }
+                seen.add(shortUUID);
+            }
+            seen.clear();
+        }
+
+        data.segments = filteredSegments
             .map((segment) => ({
                 category: segment.category,
                 actionType: segment.actionType,
                 segment: segment.segment,
-                UUID: segment.UUID,
+                UUID: trimUUIDs ? segment.UUID.substring(0, trimUUIDs) as SegmentUUID : segment.UUID,
                 videoDuration: segment.videoDuration,
                 locked: segment.locked,
                 votes: segment.votes,
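The trimming logic added in this hunk shortens UUIDs to `trimUUIDs` characters, but backs out entirely if two segments would collide after trimming. A self-contained sketch of that behaviour (`trimUUIDsSafely` is an illustrative name, not from the codebase):

```typescript
// Trim each UUID to trimTo characters unless any two would collide,
// in which case trimming is disabled and full UUIDs are returned.
function trimUUIDsSafely(uuids: string[], trimTo: number | undefined): string[] {
    if (trimTo) {
        const seen = new Set<string>();
        for (const uuid of uuids) {
            const short = uuid.substring(0, trimTo);
            if (seen.has(short)) {
                trimTo = undefined; // duplicate found, disable trimming
                break;
            }
            seen.add(short);
        }
    }
    return uuids.map(uuid => trimTo ? uuid.substring(0, trimTo) : uuid);
}
```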
@@ -183,7 +228,7 @@ async function getSegmentsByHash(req: Request, hashedVideoIDPrefix: VideoIDHash,
return segments;
} catch (err) /* istanbul ignore next */ {
Logger.error(err as string);
Logger.error(`get segments by hash error: ${err}`);
return null;
}
}
@@ -200,6 +245,8 @@ async function getSegmentsFromDBByHash(hashedVideoIDPrefix: VideoIDHash, service
if (hashedVideoIDPrefix.length === 4) {
return await QueryCacher.get(fetchFromDB, skipSegmentsHashKey(hashedVideoIDPrefix, service));
} else if (hashedVideoIDPrefix.length === 5) {
return await QueryCacher.get(fetchFromDB, skipSegmentsLargerHashKey(hashedVideoIDPrefix, service));
}
return await fetchFromDB();
@@ -218,11 +265,11 @@ async function getSegmentsFromDBByVideoID(videoID: VideoID, service: Service): P
return await QueryCacher.get(fetchFromDB, skipSegmentsKey(videoID, service));
}
// Gets a weighted random choice from the choices array based on their `votes` property.
// Gets the best choice from the choices array based on their `votes` property.
// amountOfChoices specifies the maximum amount of choices to return, 1 or more.
// Choices are unique
// If a predicate is given, it will only filter choices following it, and will leave the rest in the list
function getWeightedRandomChoice<T extends VotableObject>(choices: T[], amountOfChoices: number, filterLocked = false, predicate?: (choice: T) => void): T[] {
function getBestChoice<T extends VotableObject>(choices: T[], amountOfChoices: number, filterLocked = false, predicate?: (choice: T) => void): T[] {
//trivial case: no need to go through the whole process
if (amountOfChoices >= choices.length) {
return choices;
@@ -245,39 +292,22 @@ function getWeightedRandomChoice<T extends VotableObject>(choices: T[], amountOf
 }

 //assign a weight to each choice
-let totalWeight = 0;
-const choicesWithWeights: TWithWeight[] = filteredChoices.map(choice => {
-    const boost = Math.min(choice.reputation, 4);
-
-    //The 3 makes -2 the minimum votes before being ignored completely
-    //this can be changed if this system increases in popularity.
-    const repFactor = choice.votes > 0 ? Math.max(1, choice.reputation + 1) : 1;
-    const weight = Math.exp(choice.votes * repFactor + 3 + boost);
-    totalWeight += Math.max(weight, 0);
-
+const choicesWithWeights: TWithWeight[] = shuffleArray(filteredChoices.map(choice => {
+    const boost = choice.reputation;
+    const weight = choice.votes + boost;
     return { ...choice, weight };
-});
+})).sort((a, b) => b.weight - a.weight);

 // Nothing to filter for
 if (amountOfChoices >= choicesWithWeights.length) {
     return [...forceIncludedChoices, ...filteredChoices];
 }

-//iterate and find amountOfChoices choices
+// Pick the top options
 const chosen = [...forceIncludedChoices];
-while (amountOfChoices-- > 0) {
-    //weighted random draw of one element of choices
-    const randomNumber = Math.random() * totalWeight;
-    let stackWeight = choicesWithWeights[0].weight;
-    let i = 0;
-    while (stackWeight < randomNumber) {
-        stackWeight += choicesWithWeights[++i].weight;
-    }
-
-    //add it to the chosen ones and remove it from the choices before the next iteration
+for (let i = 0; i < amountOfChoices; i++) {
     chosen.push(choicesWithWeights[i]);
-    totalWeight -= choicesWithWeights[i].weight;
-    choicesWithWeights.splice(i, 1);
 }

 return chosen;
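The rewrite drops weighted random draws: options are scored as `votes + reputation`, shuffled so equal scores tie-break randomly, sorted descending, and the top N taken. A compact sketch of that selection, with a Fisher-Yates shuffle standing in for the repo's `shuffleArray` utility:

```typescript
interface Votable { votes: number; reputation: number; }

// Fisher-Yates shuffle; stand-in for the repo's shuffleArray.
function shuffle<T>(arr: T[]): T[] {
    const out = [...arr];
    for (let i = out.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1));
        [out[i], out[j]] = [out[j], out[i]];
    }
    return out;
}

// Score, shuffle (random tie-break), sort descending, take the top N.
function bestChoices<T extends Votable>(choices: T[], amount: number): T[] {
    if (amount >= choices.length) return choices; // trivial case, as in the route
    return shuffle(choices)
        .map(choice => ({ choice, weight: choice.votes + choice.reputation }))
        .sort((a, b) => b.weight - a.weight)
        .slice(0, amount)
        .map(entry => entry.choice);
}
```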
@@ -286,20 +316,20 @@ function getWeightedRandomChoice<T extends VotableObject>(choices: T[], amountOf
async function chooseSegments(videoID: VideoID, service: Service, segments: DBSegment[], useCache: boolean): Promise<DBSegment[]> {
const fetchData = async () => await buildSegmentGroups(segments);
const groups = useCache
const groups = useCache && config.useCacheForSegmentGroups
? await QueryCacher.get(fetchData, skipSegmentGroupsKey(videoID, service))
: await fetchData();
// Filter for only 1 item for POI categories and Full video
let chosenGroups = getWeightedRandomChoice(groups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Full);
chosenGroups = getWeightedRandomChoice(chosenGroups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Poi);
return chosenGroups.map(//randomly choose 1 good segment per group and return them
group => getWeightedRandomChoice(group.segments, 1)[0]
let chosenGroups = getBestChoice(groups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Full);
chosenGroups = getBestChoice(chosenGroups, 1, true, (choice) => choice.segments[0].actionType === ActionType.Poi);
return chosenGroups.map(// choose 1 good segment per group and return them
group => getBestChoice(group.segments, 1)[0]
);
}
//This function will find segments that are contained inside of each other, called similar segments
//Only one similar time will be returned, randomly generated based on the sqrt of votes.
//Only one similar time will be returned, based on its score
//This allows new less voted items to still sometimes appear to give them a chance at getting votes.
//Segments with less than -1 votes are already ignored before this function is called
async function buildSegmentGroups(segments: DBSegment[]): Promise<OverlappingSegmentGroup[]> {
@@ -413,7 +443,7 @@ async function getSkipSegments(req: Request, res: Response): Promise<Response> {
await getEtag("skipSegments", (videoID as string), service)
.then(etag => res.set("ETag", etag))
.catch(() => null);
.catch(() => ({}));
return res.send(segments);
}


@@ -17,13 +17,14 @@ export async function getSkipSegmentsByHash(req: Request, res: Response): Promis
if (parseResult.errors.length > 0) {
return res.status(400).send(parseResult.errors);
}
const { categories, actionTypes, requiredSegments, service } = parseResult;
const { categories, actionTypes, trimUUIDs, requiredSegments, service } = parseResult;
// Get all video id's that match hash prefix
const segments = await getSegmentsByHash(req, hashPrefix, categories, actionTypes, requiredSegments, service);
const segments = await getSegmentsByHash(req, hashPrefix, categories, actionTypes, trimUUIDs, requiredSegments, service);
try {
await getEtag("skipSegmentsHash", hashPrefix, service)
const hashKey = hashPrefix.length === 4 ? "skipSegmentsHash" : "skipSegmentsLargerHash";
await getEtag(hashKey, hashPrefix, service)
.then(etag => res.set("ETag", etag))
.catch(/* istanbul ignore next */ () => null);
const output = Object.entries(segments).map(([videoID, data]) => ({
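The ETag cache key now depends on prefix length: 4-character prefixes keep the original key space, while the newly supported 5-character prefixes get a separate "larger hash" key space. As a one-line helper (illustrative name):

```typescript
// Mirrors the hashKey selection in getSkipSegmentsByHash.
function etagKeyForPrefix(hashPrefix: string): string {
    return hashPrefix.length === 4 ? "skipSegmentsHash" : "skipSegmentsLargerHash";
}
```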


@@ -1,19 +1,20 @@
import { db } from "../databases/databases";
import { db, privateDB } from "../databases/databases";
import { Logger } from "../utils/logger";
import { Request, Response } from "express";
import os from "os";
import redis, { getRedisStats } from "../utils/redis";
import { promiseOrTimeout } from "../utils/promise";
import { Postgres } from "../databases/Postgres";
import { Server } from "http";
export async function getStatus(req: Request, res: Response): Promise<Response> {
export async function getStatus(req: Request, res: Response, server: Server): Promise<Response> {
const startTime = Date.now();
let value = req.params.value as string[] | string;
value = Array.isArray(value) ? value[0] : value;
let processTime, redisProcessTime = -1;
try {
const dbStartTime = Date.now();
const dbVersion = await promiseOrTimeout(db.prepare("get", "SELECT key, value FROM config where key = ?", ["version"]), 5000)
const dbVersion = await promiseOrTimeout(db.prepare("get", "SELECT key, value FROM config where key = ?", ["version"]), 1000)
.then(e => {
processTime = Date.now() - dbStartTime;
return e.value;
@@ -24,12 +25,12 @@ export async function getStatus(req: Request, res: Response): Promise<Response>
});
let statusRequests: unknown = 0;
const redisStartTime = Date.now();
const numberRequests = await promiseOrTimeout(redis.increment("statusRequest"), 5000)
const numberRequests = await promiseOrTimeout(redis.increment("statusRequest"), 1000)
.then(e => {
redisProcessTime = Date.now() - redisStartTime;
return e;
}).catch(e => /* istanbul ignore next */ {
Logger.error(`status: redis increment timed out ${e}`);
Logger.error(`status: redis increment timed out ${e}\nload: ${os.loadavg().slice(1)} with ${JSON.stringify(getRedisStats())}\n${JSON.stringify((db as Postgres)?.getStats?.())}`);
return [-1];
});
statusRequests = numberRequests?.[0];
@@ -42,9 +43,11 @@ export async function getStatus(req: Request, res: Response): Promise<Response>
processTime,
redisProcessTime,
loadavg: os.loadavg().slice(1), // only return 5 & 15 minute load average
connections: await new Promise((resolve) => server.getConnections((_, count) => resolve(count))),
statusRequests,
hostname: os.hostname(),
postgresStats: (db as Postgres)?.getStats?.(),
postgresPrivateStats: (privateDB as Postgres)?.getStats?.(),
redisStats: getRedisStats(),
};
return value ? res.send(JSON.stringify(statusValues[value])) : res.send(statusValues);


@@ -0,0 +1,49 @@
import { db } from "../databases/databases";
import { Request, Response } from "express";
import { Logger } from "../utils/logger";
async function generateTopUsersStats(sortBy: string) {
const rows = await db.prepare("all", `SELECT COUNT(distinct "titles"."UUID") as "titleCount", COUNT(distinct "thumbnails"."UUID") as "thumbnailCount", COALESCE("userName", "titles"."userID") as "userName"
FROM "titles"
LEFT JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
LEFT JOIN "userNames" ON "titles"."userID"="userNames"."userID"
LEFT JOIN "thumbnails" ON "titles"."userID" = "thumbnails"."userID"
LEFT JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID"
WHERE "titleVotes"."votes" > -1 AND "titleVotes"."shadowHidden" != 1
GROUP BY COALESCE("userName", "titles"."userID") HAVING SUM("titleVotes"."votes") > 2 OR SUM("thumbnailVotes"."votes") > 2
ORDER BY "${sortBy}" DESC LIMIT 100`, []) as { titleCount: number, thumbnailCount: number, userName: string }[];
return rows.map((row) => ({
userName: row.userName,
titles: row.titleCount,
thumbnails: row.thumbnailCount
}));
}
export async function getTopBrandingUsers(req: Request, res: Response): Promise<Response> {
const sortType = parseInt(req.query.sortType as string);
let sortBy = "";
if (sortType == 0) {
sortBy = "titleCount";
} else if (sortType == 1) {
sortBy = "thumbnailCount";
} else {
//invalid request
return res.sendStatus(400);
}
if (db.highLoad()) {
return res.status(503).send("Disabled for load reasons");
}
try {
const stats = await generateTopUsersStats(sortBy);
//send this result
return res.send(stats);
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
}
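The `sortType` query parameter maps to a whitelisted column name before being interpolated into the SQL, which is what keeps the `ORDER BY "${sortBy}"` interpolation safe. Sketched as a standalone mapping (hypothetical helper name):

```typescript
// Whitelist mapping from sortType to a sortable column; null means 400.
function sortColumnFor(sortType: number): "titleCount" | "thumbnailCount" | null {
    if (sortType === 0) return "titleCount";
    if (sortType === 1) return "thumbnailCount";
    return null; // anything else is an invalid request
}
```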


@@ -3,6 +3,7 @@ import { createMemoryCache } from "../utils/createMemoryCache";
import { config } from "../config";
import { Request, Response } from "express";
import { validateCategories } from "../utils/parseParams";
import { Logger } from "../utils/logger";
const MILLISECONDS_IN_MINUTE = 60000;
// eslint-disable-next-line @typescript-eslint/no-misused-promises
@@ -74,8 +75,13 @@ export async function getTopCategoryUsers(req: Request, res: Response): Promise<
         return res.sendStatus(400);
     }

-    const stats = await getTopCategoryUsersWithCache(sortBy, category);
-
-    //send this result
-    return res.send(stats);
+    try {
+        const stats = await getTopCategoryUsersWithCache(sortBy, category);
+
+        //send this result
+        return res.send(stats);
+    } catch (e) {
+        Logger.error(e as string);
+        return res.sendStatus(500);
+    }
 }


@@ -2,6 +2,7 @@ import { db } from "../databases/databases";
import { createMemoryCache } from "../utils/createMemoryCache";
import { config } from "../config";
import { Request, Response } from "express";
import { Logger } from "../utils/logger";
const MILLISECONDS_IN_MINUTE = 60000;
// eslint-disable-next-line @typescript-eslint/no-misused-promises
@@ -92,8 +93,13 @@ export async function getTopUsers(req: Request, res: Response): Promise<Response
         return res.status(503).send("Disabled for load reasons");
     }

-    const stats = await getTopUsersWithCache(sortBy, categoryStatsEnabled);
-
-    //send this result
-    return res.send(stats);
+    try {
+        const stats = await getTopUsersWithCache(sortBy, categoryStatsEnabled);
+
+        //send this result
+        return res.send(stats);
+    } catch (e) {
+        Logger.error(e as string);
+        return res.sendStatus(500);
+    }
 }


@@ -3,7 +3,7 @@ import { config } from "../config";
import { Request, Response } from "express";
import axios from "axios";
import { Logger } from "../utils/logger";
import { getCWSUsers } from "../utils/getCWSUsers";
import { getCWSUsers, getChromeUsers } from "../utils/getCWSUsers";
// A cache of the number of chrome web store users
let chromeUsersCache = 0;
@@ -30,30 +30,35 @@ let lastFetch: DBStatsData = {
 updateExtensionUsers();

 export async function getTotalStats(req: Request, res: Response): Promise<void> {
-    const countContributingUsers = Boolean(req.query?.countContributingUsers == "true");
-    const row = await getStats(countContributingUsers);
-    lastFetch = row;
-
-    /* istanbul ignore if */
-    if (!row) res.sendStatus(500);
-    const extensionUsers = chromeUsersCache + firefoxUsersCache;
-
-    //send this result
-    res.send({
-        userCount: row.userCount ?? 0,
-        activeUsers: extensionUsers,
-        apiUsers: Math.max(apiUsersCache, extensionUsers),
-        viewCount: row.viewCount,
-        totalSubmissions: row.totalSubmissions,
-        minutesSaved: row.minutesSaved,
-    });
-
-    // Check if the cache should be updated (every ~14 hours)
-    const now = Date.now();
-    if (now - lastUserCountCheck > 5000000) {
-        lastUserCountCheck = now;
-
-        updateExtensionUsers();
+    try {
+        const countContributingUsers = Boolean(req.query?.countContributingUsers == "true");
+        const row = await getStats(countContributingUsers);
+        lastFetch = row;
+
+        /* istanbul ignore if */
+        if (!row) res.sendStatus(500);
+        const extensionUsers = chromeUsersCache + firefoxUsersCache;
+
+        //send this result
+        res.send({
+            userCount: row.userCount ?? 0,
+            activeUsers: extensionUsers,
+            apiUsers: Math.max(apiUsersCache, extensionUsers),
+            viewCount: row.viewCount,
+            totalSubmissions: row.totalSubmissions,
+            minutesSaved: row.minutesSaved,
+        });
+
+        // Check if the cache should be updated (every ~14 hours)
+        const now = Date.now();
+        if (now - lastUserCountCheck > 5000000) {
+            lastUserCountCheck = now;
+
+            updateExtensionUsers();
+        }
+    } catch (e) {
+        Logger.error(e as string);
+        res.sendStatus(500);
     }
 }
@@ -92,29 +97,4 @@ function updateExtensionUsers() {
         getChromeUsers(chromeExtensionUrl)
             .then(res => chromeUsersCache = res)
     );
 }
-
-/* istanbul ignore next */
-function getChromeUsers(chromeExtensionUrl: string): Promise<number> {
-    return axios.get(chromeExtensionUrl)
-        .then(res => {
-            const body = res.data;
-            // 2021-01-05
-            // [...]<span><meta itemprop="interactionCount" content="UserDownloads:100.000+"/><meta itemprop="opera[...]
-            const matchingString = '"UserDownloads:';
-            const matchingStringLen = matchingString.length;
-            const userDownloadsStartIndex = body.indexOf(matchingString);
-
-            /* istanbul ignore else */
-            if (userDownloadsStartIndex >= 0) {
-                const closingQuoteIndex = body.indexOf('"', userDownloadsStartIndex + matchingStringLen);
-                const userDownloadsStr = body.substr(userDownloadsStartIndex + matchingStringLen, closingQuoteIndex - userDownloadsStartIndex).replace(",", "").replace(".", "");
-                return parseInt(userDownloadsStr);
-            } else {
-                lastUserCountCheck = 0;
-            }
-        })
-        .catch(/* istanbul ignore next */ () => {
-            Logger.debug(`Failing to connect to ${chromeExtensionUrl}`);
-            return 0;
-        });
-}


@@ -1,6 +1,7 @@
import { db } from "../databases/databases";
import { Request, Response } from "express";
import { UserID } from "../types/user.model";
import { Logger } from "../utils/logger";
function getFuzzyUserID(userName: string): Promise<{userName: string, userID: UserID }[]> {
// escape [_ % \] to avoid ReDOS
@@ -37,16 +38,22 @@ export async function getUserID(req: Request, res: Response): Promise<Response>
         // invalid request
         return res.sendStatus(400);
     }

-    const results = exactSearch
-        ? await getExactUserID(userName)
-        : await getFuzzyUserID(userName);
-
-    if (results === undefined || results === null) {
-        /* istanbul ignore next */
-        return res.sendStatus(500);
-    } else if (results.length === 0) {
-        return res.sendStatus(404);
-    } else {
-        return res.send(results);
+    try {
+        const results = exactSearch
+            ? await getExactUserID(userName)
+            : await getFuzzyUserID(userName);
+
+        if (results === undefined || results === null) {
+            /* istanbul ignore next */
+            return res.sendStatus(500);
+        } else if (results.length === 0) {
+            return res.sendStatus(404);
+        } else {
+            return res.send(results);
+        }
+    } catch (e) {
+        Logger.error(e as string);
+        return res.sendStatus(500);
     }
 }


@@ -1,4 +1,4 @@
import { db } from "../databases/databases";
import { db, privateDB } from "../databases/databases";
import { getHashCache } from "../utils/getHashCache";
import { isUserVIP } from "../utils/isUserVIP";
import { Request, Response } from "express";
@@ -8,14 +8,16 @@ import { getReputation } from "../utils/reputation";
import { Category, SegmentUUID } from "../types/segments.model";
import { config } from "../config";
import { canSubmit } from "../utils/permissions";
import { isUserBanned } from "../utils/checkBan";
const maxRewardTime = config.maxRewardTimePerSegmentInSeconds;
async function dbGetSubmittedSegmentSummary(userID: HashedUserID): Promise<{ minutesSaved: number, segmentCount: number }> {
try {
const countShadowHidden = await isUserBanned(userID) ? 2 : 1; // if shadowbanned, count shadowhidden as well
const row = await db.prepare("get",
`SELECT SUM(CASE WHEN "actionType" = 'chapter' THEN 0 ELSE ((CASE WHEN "endTime" - "startTime" > ? THEN ? ELSE "endTime" - "startTime" END) / 60) * "views" END) as "minutesSaved",
count(*) as "segmentCount" FROM "sponsorTimes"
WHERE "userID" = ? AND "votes" > -2 AND "shadowHidden" != 1`, [maxRewardTime, maxRewardTime, userID], { useReplica: true });
WHERE "userID" = ? AND "votes" > -2 AND "shadowHidden" != ?`, [maxRewardTime, maxRewardTime, userID, countShadowHidden], { useReplica: true });
if (row.minutesSaved != null) {
return {
minutesSaved: row.minutesSaved,
@@ -70,7 +72,7 @@ async function dbGetIgnoredViewsForUser(userID: HashedUserID) {
async function dbGetWarningsForUser(userID: HashedUserID): Promise<number> {
try {
const row = await db.prepare("get", `SELECT COUNT(*) as total FROM "warnings" WHERE "userID" = ? AND "enabled" = 1`, [userID], { useReplica: true });
const row = await db.prepare("get", `SELECT COUNT(*) as total FROM "warnings" WHERE "userID" = ? AND "enabled" = 1 AND "type" = 0`, [userID], { useReplica: true });
return row?.total ?? 0;
} catch (err) /* istanbul ignore next */ {
Logger.error(`Couldn't get warnings for user ${userID}. returning 0`);
@@ -78,6 +80,16 @@ async function dbGetWarningsForUser(userID: HashedUserID): Promise<number> {
}
}
async function dbGetDeArrowWarningReasonForUser(userID: HashedUserID): Promise<number> {
try {
const row = await db.prepare("get", `SELECT reason FROM "warnings" WHERE "userID" = ? AND "enabled" = 1 AND "type" = 1`, [userID], { useReplica: true });
return row?.reason ?? 0;
} catch (err) /* istanbul ignore next */ {
Logger.error(`Couldn't get warnings for user ${userID}. returning 0`);
return 0;
}
}
async function dbGetLastSegmentForUser(userID: HashedUserID): Promise<SegmentUUID> {
try {
const row = await db.prepare("get", `SELECT "UUID" FROM "sponsorTimes" WHERE "userID" = ? ORDER BY "timeSubmitted" DESC LIMIT 1`, [userID], { useReplica: true });
@@ -89,7 +101,7 @@ async function dbGetLastSegmentForUser(userID: HashedUserID): Promise<SegmentUUI
async function dbGetActiveWarningReasonForUser(userID: HashedUserID): Promise<string> {
try {
const row = await db.prepare("get", `SELECT reason FROM "warnings" WHERE "userID" = ? AND "enabled" = 1 ORDER BY "issueTime" DESC LIMIT 1`, [userID], { useReplica: true });
const row = await db.prepare("get", `SELECT reason FROM "warnings" WHERE "userID" = ? AND "enabled" = 1 AND "type" = 0 ORDER BY "issueTime" DESC LIMIT 1`, [userID], { useReplica: true });
return row?.reason ?? "";
} catch (err) /* istanbul ignore next */ {
Logger.error(`Couldn't get reason for user ${userID}. returning blank`);
@@ -99,8 +111,7 @@ async function dbGetActiveWarningReasonForUser(userID: HashedUserID): Promise<st
async function dbGetBanned(userID: HashedUserID): Promise<boolean> {
try {
const row = await db.prepare("get", `SELECT count(*) as "userCount" FROM "shadowBannedUsers" WHERE "userID" = ? LIMIT 1`, [userID], { useReplica: true });
return row?.userCount > 0 ?? false;
return await isUserBanned(userID);
} catch (err) /* istanbul ignore next */ {
return false;
}
@@ -115,6 +126,34 @@ async function getPermissions(userID: HashedUserID): Promise<Record<string, bool
return result;
}
async function getTitleSubmissionCount(userID: HashedUserID): Promise<number> {
try {
const row = await db.prepare("get", `SELECT COUNT(*) as "titleSubmissionCount" FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "titles"."userID" = ? AND "titleVotes"."votes" >= 0`, [userID], { useReplica: true });
return row?.titleSubmissionCount ?? 0;
} catch (err) /* istanbul ignore next */ {
return null;
}
}
async function getThumbnailSubmissionCount(userID: HashedUserID): Promise<number> {
try {
const row = await db.prepare("get", `SELECT COUNT(*) as "thumbnailSubmissionCount" FROM "thumbnails" JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" WHERE "thumbnails"."userID" = ? AND "thumbnailVotes"."votes" >= 0`, [userID], { useReplica: true });
return row?.thumbnailSubmissionCount ?? 0;
} catch (err) /* istanbul ignore next */ {
return null;
}
}
async function getCasualSubmissionCount(userID: HashedUserID): Promise<number> {
try {
const row = await privateDB.prepare("get", `SELECT COUNT(DISTINCT "videoID") as "casualSubmissionCount" FROM "casualVotes" WHERE "userID" = ?`, [userID], { useReplica: true });
return row?.casualSubmissionCount ?? 0;
} catch (err) /* istanbul ignore next */ {
return null;
}
}
type cases = Record<string, any>
const executeIfFunction = (f: any) =>
@@ -135,12 +174,16 @@ const dbGetValue = (userID: HashedUserID, property: string): Promise<string|Segm
ignoredViewCount: () => dbGetIgnoredViewsForUser(userID),
warnings: () => dbGetWarningsForUser(userID),
warningReason: () => dbGetActiveWarningReasonForUser(userID),
deArrowWarningReason: () => dbGetDeArrowWarningReasonForUser(userID),
banned: () => dbGetBanned(userID),
reputation: () => getReputation(userID),
vip: () => isUserVIP(userID),
lastSegmentID: () => dbGetLastSegmentForUser(userID),
permissions: () => getPermissions(userID),
freeChaptersAccess: () => true
freeChaptersAccess: () => true,
titleSubmissionCount: () => getTitleSubmissionCount(userID),
thumbnailSubmissionCount: () => getThumbnailSubmissionCount(userID),
casualSubmissionCount: () => getCasualSubmissionCount(userID),
})("")(property);
};
@@ -150,7 +193,8 @@ async function getUserInfo(req: Request, res: Response): Promise<Response> {
const defaultProperties: string[] = ["userID", "userName", "minutesSaved", "segmentCount", "ignoredSegmentCount",
"viewCount", "ignoredViewCount", "warnings", "warningReason", "reputation",
"vip", "lastSegmentID"];
const allProperties: string[] = [...defaultProperties, "banned", "permissions", "freeChaptersAccess"];
const allProperties: string[] = [...defaultProperties, "banned", "permissions", "freeChaptersAccess",
"ignoredSegmentCount", "titleSubmissionCount", "thumbnailSubmissionCount", "casualSubmissionCount", "deArrowWarningReason"];
let paramValues: string[] = req.query.values
? JSON.parse(req.query.values as string)
: req.query.value
@@ -173,15 +217,24 @@ async function getUserInfo(req: Request, res: Response): Promise<Response> {
return res.status(400).send("Invalid userID or publicUserID parameter");
}
const segmentsSummary = await dbGetSubmittedSegmentSummary(hashedUserID);
const responseObj = {} as Record<string, string|SegmentUUID|number>;
for (const property of paramValues) {
responseObj[property] = await dbGetValue(hashedUserID, property);
try {
const responseObj = {} as Record<string, string|SegmentUUID|number>;
for (const property of paramValues) {
responseObj[property] = await dbGetValue(hashedUserID, property);
}
// add minutesSaved and segmentCount after to avoid getting overwritten
if (paramValues.includes("minutesSaved") || paramValues.includes("segmentCount")) {
const segmentsSummary = await dbGetSubmittedSegmentSummary(hashedUserID);
responseObj["minutesSaved"] = segmentsSummary.minutesSaved;
responseObj["segmentCount"] = segmentsSummary.segmentCount;
}
return res.send(responseObj);
} catch (err) {
Logger.error(err as string);
return res.sendStatus(500);
}
// add minutesSaved and segmentCount after to avoid getting overwritten
if (paramValues.includes("minutesSaved")) responseObj["minutesSaved"] = segmentsSummary.minutesSaved;
if (paramValues.includes("segmentCount")) responseObj["segmentCount"] = segmentsSummary.segmentCount;
return res.send(responseObj);
}
export async function endpoint(req: Request, res: Response): Promise<Response> {
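The `dbGetValue` call above selects one lazy getter per requested property from a table of thunks, so only the queried value touches the database. A simplified synchronous sketch of that dispatch shape; the `dispatch` helper and sample getters are hypothetical:

```typescript
type Getter = () => string | number;

// Look up a lazy getter by property name; unknown properties fall back to a default.
function dispatch(cases: Record<string, Getter>, defaultValue: string | number, property: string): string | number {
    const entry = cases[property];
    return entry ? entry() : defaultValue;
}

// Hypothetical getters standing in for the real database lookups.
const userCases: Record<string, Getter> = {
    warnings: () => 2,
    warningReason: () => "spam",
    freeChaptersAccess: () => 1,
};
```

For example, `dispatch(userCases, "", "warnings")` evaluates only the `warnings` thunk, and an unrecognized property returns the default without running any getter.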


@@ -4,6 +4,7 @@ import { Request, Response } from "express";
import { HashedUserID, UserID } from "../types/user.model";
import { config } from "../config";
import { Logger } from "../utils/logger";
import { isUserBanned } from "../utils/checkBan";
type nestedObj = Record<string, Record<string, number>>;
const maxRewardTimePerSegmentInSeconds = config.maxRewardTimePerSegmentInSeconds ?? 86400;
@@ -34,13 +35,14 @@ async function dbGetUserSummary(userID: HashedUserID, fetchCategoryStats: boolea
`;
}
try {
const countShadowHidden = await isUserBanned(userID) ? 2 : 1; // if shadowbanned, count shadowhidden as well
const row = await db.prepare("get", `
SELECT SUM(CASE WHEN "actionType" = 'chapter' THEN 0 ELSE ((CASE WHEN "endTime" - "startTime" > ? THEN ? ELSE "endTime" - "startTime" END) / 60) * "views" END) as "minutesSaved",
${additionalQuery}
count(*) as "segmentCount"
FROM "sponsorTimes"
WHERE "userID" = ? AND "votes" > -2 AND "shadowHidden" != 1`,
[maxRewardTimePerSegmentInSeconds, maxRewardTimePerSegmentInSeconds, userID]);
WHERE "userID" = ? AND "votes" > -2 AND "shadowHidden" != ?`,
[maxRewardTimePerSegmentInSeconds, maxRewardTimePerSegmentInSeconds, userID, countShadowHidden]);
const source = (row.minutesSaved != null) ? row : {};
const handler = { get: (target: Record<string, any>, name: string) => target?.[name] || 0 };
const proxy = new Proxy(source, handler);
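The `Proxy` at the end of `dbGetUserSummary` makes every field of a possibly-empty row read as a number, defaulting to 0 when the row is missing or a column is null. The trick in isolation; the `withZeroDefaults` helper name is hypothetical:

```typescript
// Wrap a possibly-empty row so any property read yields its value, or 0 when absent/null.
function withZeroDefaults(source: Record<string, any>): Record<string, number> {
    const handler = {
        get: (target: Record<string, any>, name: string | symbol) => target?.[name as string] || 0,
    };
    return new Proxy(source, handler) as Record<string, number>;
}
```

This avoids writing an explicit `?? 0` at every one of the summary's call sites.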


@@ -1,27 +1,28 @@
import { Request, Response } from "express";
import { db } from "../databases/databases";
import { videoLabelsHashKey, videoLabelsKey } from "../utils/redisKeys";
import { videoLabelsHashKey, videoLabelsKey, videoLabelsLargerHashKey } from "../utils/redisKeys";
import { SBRecord } from "../types/lib.model";
import { DBSegment, Segment, Service, VideoData, VideoID, VideoIDHash } from "../types/segments.model";
import { ActionType, Category, DBSegment, Service, VideoID, VideoIDHash } from "../types/segments.model";
import { Logger } from "../utils/logger";
import { QueryCacher } from "../utils/queryCacher";
import { getService } from "../utils/getService";
function transformDBSegments(segments: DBSegment[]): Segment[] {
interface FullVideoSegment {
category: Category;
}
interface FullVideoSegmentVideoData {
segments: FullVideoSegment[];
hasStartSegment: boolean;
}
function transformDBSegments(segments: DBSegment[]): FullVideoSegment[] {
return segments.map((chosenSegment) => ({
category: chosenSegment.category,
actionType: chosenSegment.actionType,
segment: [chosenSegment.startTime, chosenSegment.endTime],
UUID: chosenSegment.UUID,
locked: chosenSegment.locked,
votes: chosenSegment.votes,
videoDuration: chosenSegment.videoDuration,
userID: chosenSegment.userID,
description: chosenSegment.description
category: chosenSegment.category
}));
}
async function getLabelsByVideoID(videoID: VideoID, service: Service): Promise<Segment[]> {
async function getLabelsByVideoID(videoID: VideoID, service: Service): Promise<FullVideoSegmentVideoData> {
try {
const segments: DBSegment[] = await getSegmentsFromDBByVideoID(videoID, service);
return chooseSegment(segments);
@@ -33,8 +34,8 @@ async function getLabelsByVideoID(videoID: VideoID, service: Service): Promise<S
}
}
async function getLabelsByHash(hashedVideoIDPrefix: VideoIDHash, service: Service): Promise<SBRecord<VideoID, VideoData>> {
const segments: SBRecord<VideoID, VideoData> = {};
async function getLabelsByHash(hashedVideoIDPrefix: VideoIDHash, service: Service, checkHasStartSegment: boolean): Promise<SBRecord<VideoID, FullVideoSegmentVideoData>> {
const segments: SBRecord<VideoID, FullVideoSegmentVideoData> = {};
try {
type SegmentWithHashPerVideoID = SBRecord<VideoID, { hash: VideoIDHash, segments: DBSegment[] }>;
@@ -53,11 +54,13 @@ async function getLabelsByHash(hashedVideoIDPrefix: VideoIDHash, service: Servic
}, {});
for (const [videoID, videoData] of Object.entries(segmentPerVideoID)) {
const data: VideoData = {
segments: chooseSegment(videoData.segments),
const result = chooseSegment(videoData.segments);
const data: FullVideoSegmentVideoData = {
segments: result.segments,
hasStartSegment: checkHasStartSegment ? result.hasStartSegment : undefined
};
if (data.segments.length > 0) {
if (data.segments.length > 0 || (data.hasStartSegment && checkHasStartSegment)) {
segments[videoID] = data;
}
}
@@ -74,12 +77,14 @@ async function getSegmentsFromDBByHash(hashedVideoIDPrefix: VideoIDHash, service
.prepare(
"all",
`SELECT "startTime", "endTime", "videoID", "votes", "locked", "UUID", "userID", "category", "actionType", "hashedVideoID", "description" FROM "sponsorTimes"
WHERE "hashedVideoID" LIKE ? AND "service" = ? AND "actionType" = 'full' AND "hidden" = 0 AND "shadowHidden" = 0`,
WHERE "hashedVideoID" LIKE ? AND "service" = ? AND "hidden" = 0 AND "shadowHidden" = 0`,
[`${hashedVideoIDPrefix}%`, service]
) as Promise<DBSegment[]>;
if (hashedVideoIDPrefix.length === 3) {
return await QueryCacher.get(fetchFromDB, videoLabelsHashKey(hashedVideoIDPrefix, service));
} else if (hashedVideoIDPrefix.length === 4) {
return await QueryCacher.get(fetchFromDB, videoLabelsLargerHashKey(hashedVideoIDPrefix, service));
}
return await fetchFromDB();
@@ -90,22 +95,34 @@ async function getSegmentsFromDBByVideoID(videoID: VideoID, service: Service): P
.prepare(
"all",
`SELECT "startTime", "endTime", "votes", "locked", "UUID", "userID", "category", "actionType", "description" FROM "sponsorTimes"
WHERE "videoID" = ? AND "service" = ? AND "actionType" = 'full' AND "hidden" = 0 AND "shadowHidden" = 0`,
WHERE "videoID" = ? AND "service" = ? AND "hidden" = 0 AND "shadowHidden" = 0`,
[videoID, service]
) as Promise<DBSegment[]>;
return await QueryCacher.get(fetchFromDB, videoLabelsKey(videoID, service));
}
function chooseSegment<T extends DBSegment>(choices: T[]): Segment[] {
function chooseSegment<T extends DBSegment>(choices: T[]): FullVideoSegmentVideoData {
// filter out -2 segments
choices = choices.filter((segment) => segment.votes > -2);
const hasStartSegment = !!choices.some((segment) => segment.startTime < 5
&& (segment.actionType === ActionType.Skip || segment.actionType === ActionType.Mute));
choices = choices.filter((segment) => segment.actionType === ActionType.Full);
const results = [];
// trivial decisions
if (choices.length === 0) {
return [];
return {
segments: [],
hasStartSegment
};
} else if (choices.length === 1) {
return transformDBSegments(choices);
return {
segments: transformDBSegments(choices),
hasStartSegment
};
}
// if locked, only choose from locked
const locked = choices.filter((segment) => segment.locked);
@@ -114,7 +131,10 @@ function chooseSegment<T extends DBSegment>(choices: T[]): Segment[] {
}
//no need to filter, just one label
if (choices.length === 1) {
return transformDBSegments(choices);
return {
segments: transformDBSegments(choices),
hasStartSegment
};
}
// sponsor > exclusive > selfpromo
const findCategory = (category: string) => choices.find((segment) => segment.category === category);
@@ -122,25 +142,36 @@ function chooseSegment<T extends DBSegment>(choices: T[]): Segment[] {
const categoryResult = findCategory("sponsor") ?? findCategory("exclusive_access") ?? findCategory("selfpromo");
if (categoryResult) results.push(categoryResult);
return transformDBSegments(results);
return {
segments: transformDBSegments(results),
hasStartSegment
};
}
async function handleGetLabel(req: Request, res: Response): Promise<Segment[] | false> {
async function handleGetLabel(req: Request, res: Response): Promise<FullVideoSegmentVideoData | FullVideoSegment[] | false> {
const videoID = req.query.videoID as VideoID;
if (!videoID) {
res.status(400).send("videoID not specified");
return false;
}
const hasStartSegment = req.query.hasStartSegment === "true";
const service = getService(req.query.service, req.body.service);
const segments = await getLabelsByVideoID(videoID, service);
const segmentData = await getLabelsByVideoID(videoID, service);
const segments = segmentData.segments;
if (!segments || segments.length === 0) {
res.sendStatus(404);
return false;
}
return segments;
if (hasStartSegment) {
return segmentData;
} else {
return segments;
}
}
async function endpoint(req: Request, res: Response): Promise<Response> {
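The selection logic in `chooseSegment` above (drop heavily-downvoted segments, prefer locked ones, then pick by category priority sponsor > exclusive_access > selfpromo) can be sketched on its own. The simplified `SimpleSegment` shape and `pickLabel` name are hypothetical:

```typescript
interface SimpleSegment { category: string; locked: boolean; votes: number; }

// Pick at most one full-video label: filter out votes <= -2, prefer locked, then by category priority.
function pickLabel(choices: SimpleSegment[]): SimpleSegment | null {
    choices = choices.filter((s) => s.votes > -2);
    const locked = choices.filter((s) => s.locked);
    if (locked.length > 0) choices = locked;
    if (choices.length <= 1) return choices[0] ?? null;
    const find = (category: string) => choices.find((s) => s.category === category);
    return find("sponsor") ?? find("exclusive_access") ?? find("selfpromo") ?? null;
}
```

Note that the locked filter runs before the category priority, so a locked selfpromo label beats an unlocked sponsor one.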


@@ -11,16 +11,19 @@ export async function getVideoLabelsByHash(req: Request, res: Response): Promise
}
hashPrefix = hashPrefix.toLowerCase() as VideoIDHash;
const checkHasStartSegment = req.query.hasStartSegment === "true";
const service: Service = getService(req.query.service, req.body.service);
// Get all video id's that match hash prefix
const segments = await getLabelsByHash(hashPrefix, service);
const segments = await getLabelsByHash(hashPrefix, service, checkHasStartSegment);
if (!segments) return res.status(404).json([]);
const output = Object.entries(segments).map(([videoID, data]) => ({
videoID,
segments: data.segments,
hasStartSegment: data.hasStartSegment
}));
return res.status(output.length === 0 ? 404 : 200).json(output);
}


@@ -2,9 +2,9 @@ import { Request, Response } from "express";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
import { BrandingSubmission, BrandingUUID, TimeThumbnailSubmission } from "../types/branding.model";
import { BrandingSubmission, BrandingUUID, TimeThumbnailSubmission, TitleSubmission } from "../types/branding.model";
import { HashedIP, IPAddress, VideoID } from "../types/segments.model";
import { HashedUserID } from "../types/user.model";
import { Feature, HashedUserID } from "../types/user.model";
import { getHashCache } from "../utils/getHashCache";
import { getIP } from "../utils/getIP";
import { getService } from "../utils/getService";
@@ -12,12 +12,26 @@ import { isUserVIP } from "../utils/isUserVIP";
import { Logger } from "../utils/logger";
import crypto from "crypto";
import { QueryCacher } from "../utils/queryCacher";
import { acquireLock } from "../utils/redisLock";
import { hasFeature } from "../utils/features";
import { checkBanStatus } from "../utils/checkBan";
import axios from "axios";
import { getMaxResThumbnail } from "../utils/youtubeApi";
import { getVideoDetails } from "../utils/getVideoDetails";
import { canSubmitDeArrow } from "../utils/permissions";
import { parseUserAgent } from "../utils/userAgent";
import { isRequestInvalid } from "../utils/requestValidator";
enum BrandingType {
Title,
Thumbnail
}
enum BrandingVoteType {
Upvote = 1,
Downvote = 2
}
interface ExistingVote {
UUID: BrandingUUID;
type: number;
@@ -25,125 +39,454 @@ interface ExistingVote {
}
export async function postBranding(req: Request, res: Response) {
const { videoID, userID, title, thumbnail } = req.body as BrandingSubmission;
const { videoID, userID, title, thumbnail, autoLock, downvote, videoDuration, wasWarned, casualMode } = req.body as BrandingSubmission;
const service = getService(req.body.service);
const userAgent = req.body.userAgent ?? parseUserAgent(req.get("user-agent")) ?? "";
if (!videoID || !userID || userID.length < 30 || !service
|| ((!title || !title.title)
&& (!thumbnail || thumbnail.original == null
|| (!thumbnail.original && !(thumbnail as TimeThumbnailSubmission).timestamp)))) {
|| (!thumbnail.original && (thumbnail as TimeThumbnailSubmission).timestamp) == null))) {
res.status(400).send("Bad Request");
return;
}
try {
const hashedUserID = await getHashCache(userID);
// const isVip = await isUserVIP(hashedUserID);
const isVip = false; // TODO: In future, reenable locks
const isVip = await isUserVIP(hashedUserID);
const shouldLock = isVip && autoLock !== false;
const hashedVideoID = await getHashCache(videoID, 1);
const hashedIP = await getHashCache(getIP(req) + config.globalSalt as IPAddress);
const isBanned = await checkBanStatus(hashedUserID, hashedIP);
const matchedRule = isRequestInvalid({
userAgent,
userAgentHeader: req.headers["user-agent"],
videoDuration,
videoID,
userID,
service,
dearrow: {
title,
thumbnail,
downvote,
},
endpoint: "dearrow-postBranding",
});
if (matchedRule !== null) {
sendNewUserWebhook(config.discordRejectedNewUserWebhookURL, hashedUserID, videoID, userAgent, req, videoDuration, title, `Caught by rule: ${matchedRule}`);
Logger.warn(`Dearrow submission rejected by request validator: ${hashedUserID} ${videoID} ${videoDuration} ${userAgent} ${req.headers["user-agent"]} ${title.title} ${thumbnail.timestamp}`);
res.status(200).send("OK");
return;
}
// treat banned users as existing users who "can submit" for the purposes of these checks
// this is to avoid their titles from being logged and them taking up "new user" slots with every submission
const permission = isBanned ? {
canSubmit: true,
newUser: false,
reason: "",
} : await canSubmitDeArrow(hashedUserID);
if (!permission.canSubmit) {
Logger.warn(`New user trying to submit dearrow: ${hashedUserID} ${videoID} ${videoDuration} ${Object.keys(req.body)} ${userAgent} ${title?.title} ${req.headers["user-agent"]}`);
res.status(403).send(permission.reason);
return;
} else if (permission.newUser) {
sendNewUserWebhook(config.discordNewUserWebhookURL, hashedUserID, videoID, userAgent, req, videoDuration, title, undefined);
}
if (videoDuration && thumbnail && await checkForWrongVideoDuration(videoID, videoDuration)) {
res.status(403).send("YouTube is currently testing a new anti-adblock technique called server-side ad-injection. This causes skips and submissions to be offset by the duration of the ad. It seems that you are affected by this A/B test, so until a fix is developed, we cannot accept submissions from your device due to them potentially being inaccurate.");
return;
}
const lock = await acquireLock(`postBranding:${videoID}.${hashedUserID}`);
if (!lock.status) {
res.status(429).send("Vote already in progress");
return;
}
const now = Date.now();
const voteType = 1;
const voteType: BrandingVoteType = downvote ? BrandingVoteType.Downvote : BrandingVoteType.Upvote;
if (title && !isVip && title.title.length > config.maxTitleLength) {
lock.unlock();
res.status(400).send("Your title is too long. Please keep titles concise.");
return;
}
let errorCode = 0;
await Promise.all([(async () => {
if (title) {
// ignore original submissions from banned users - hiding those would cause issues
if (title.original && isBanned) return;
const existingUUID = (await db.prepare("get", `SELECT "UUID" from "titles" where "videoID" = ? AND "title" = ?`, [videoID, title.title]))?.UUID;
const existingIsLocked = !!existingUUID && (await db.prepare("get", `SELECT "locked" from "titleVotes" where "UUID" = ?`, [existingUUID]))?.locked;
if (existingUUID != undefined && isBanned) return; // ignore votes on existing details from banned users
if (downvote && existingIsLocked && !isVip) {
if (!isBanned) sendWebhooks(videoID, existingUUID, voteType, wasWarned, shouldLock).catch((e) => Logger.error(e));
errorCode = 403;
return;
}
const UUID = existingUUID || crypto.randomUUID();
const existingVote = await handleExistingVotes(BrandingType.Title, videoID, hashedUserID, UUID, hashedIP, voteType);
await handleExistingVotes(BrandingType.Title, videoID, hashedUserID, UUID, hashedIP, voteType);
if (existingUUID) {
await updateVoteTotals(BrandingType.Title, existingVote, UUID, isVip);
await updateVoteTotals(BrandingType.Title, UUID, hashedUserID, shouldLock, !!downvote);
} else {
await db.prepare("run", `INSERT INTO "titles" ("videoID", "title", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID") VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, title.title, title.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID]);
if (downvote) {
throw new Error("Title submission doesn't exist");
}
await db.prepare("run", `INSERT INTO "titleVotes" ("UUID", "votes", "locked", "shadowHidden") VALUES (?, 0, ?, 0);`,
[UUID, isVip ? 1 : 0]);
await db.prepare("run", `INSERT INTO "titles" ("videoID", "title", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID", "casualMode", "userAgent") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, title.title, title.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID, casualMode ? 1 : 0, userAgent]);
const verificationValue = await getVerificationValue(hashedUserID, isVip);
await db.prepare("run", `INSERT INTO "titleVotes" ("UUID", "votes", "locked", "shadowHidden", "verification") VALUES (?, 0, ?, ?, ?);`,
[UUID, shouldLock ? 1 : 0, isBanned ? 1 : 0, verificationValue]);
await verifyOldSubmissions(hashedUserID, verificationValue);
}
if (isVip) {
if (isVip && !downvote && shouldLock) {
// unlock all other titles
await db.prepare("run", `UPDATE "titleVotes" SET "locked" = 0 WHERE "UUID" != ? AND "videoID" = ?`, [UUID, videoID]);
await db.prepare("run", `UPDATE "titleVotes" as tv SET "locked" = 0 FROM "titles" t WHERE tv."UUID" = t."UUID" AND tv."UUID" != ? AND t."videoID" = ?`, [UUID, videoID]);
}
if (!isBanned) sendWebhooks(videoID, UUID, voteType, wasWarned, shouldLock).catch((e) => Logger.error(e));
}
})(), (async () => {
if (thumbnail) {
// ignore original submissions from banned users - hiding those would cause issues
if (thumbnail.original && (isBanned || !await canSubmitOriginal(hashedUserID, isVip))) return;
const existingUUID = thumbnail.original
? (await db.prepare("get", `SELECT "UUID" from "thumbnails" where "videoID" = ? AND "original" = 1`, [videoID]))?.UUID
: (await db.prepare("get", `SELECT "thumbnails"."UUID" from "thumbnailTimestamps" JOIN "thumbnails" ON "thumbnails"."UUID" = "thumbnailTimestamps"."UUID"
WHERE "thumbnailTimestamps"."timestamp" = ? AND "thumbnails"."videoID" = ?`, [(thumbnail as TimeThumbnailSubmission).timestamp, videoID]))?.UUID;
const existingIsLocked = !!existingUUID && (await db.prepare("get", `SELECT "locked" from "thumbnailVotes" where "UUID" = ?`, [existingUUID]))?.locked;
if (existingUUID != undefined && isBanned) return; // ignore votes on existing details from banned users
if (downvote && existingIsLocked && !isVip) {
errorCode = 403;
return;
}
const UUID = existingUUID || crypto.randomUUID();
const existingVote = await handleExistingVotes(BrandingType.Thumbnail, videoID, hashedUserID, UUID, hashedIP, voteType);
await handleExistingVotes(BrandingType.Thumbnail, videoID, hashedUserID, UUID, hashedIP, voteType);
if (existingUUID) {
await updateVoteTotals(BrandingType.Thumbnail, existingVote, UUID, isVip);
await updateVoteTotals(BrandingType.Thumbnail, UUID, hashedUserID, shouldLock, !!downvote);
} else {
await db.prepare("run", `INSERT INTO "thumbnails" ("videoID", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID") VALUES (?, ?, ?, ?, ?, ?, ?)`,
[videoID, thumbnail.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID]);
if (downvote) {
throw new Error("Thumbnail submission doesn't exist");
}
await db.prepare("run", `INSERT INTO "thumbnailVotes" ("UUID", "votes", "locked", "shadowHidden") VALUES (?, 0, ?, 0)`,
[UUID, isVip ? 1 : 0]);
await db.prepare("run", `INSERT INTO "thumbnails" ("videoID", "original", "userID", "service", "hashedVideoID", "timeSubmitted", "UUID", "casualMode", "userAgent") VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, thumbnail.original ? 1 : 0, hashedUserID, service, hashedVideoID, now, UUID, casualMode ? 1 : 0, userAgent]);
await db.prepare("run", `INSERT INTO "thumbnailVotes" ("UUID", "votes", "locked", "shadowHidden") VALUES (?, 0, ?, ?)`,
[UUID, shouldLock ? 1 : 0, isBanned ? 1 : 0]);
if (!thumbnail.original) {
await db.prepare("run", `INSERT INTO "thumbnailTimestamps" ("UUID", "timestamp") VALUES (?, ?)`,
[UUID, (thumbnail as TimeThumbnailSubmission).timestamp]);
}
}
if (isVip) {
// unlock all other thumbnails
await db.prepare("run", `UPDATE "thumbnailVotes" SET "locked" = 0 WHERE "UUID" != ? AND "videoID" = ?`, [UUID, videoID]);
}
if (isVip && !downvote && shouldLock) {
// unlock all other thumbnails
await db.prepare("run", `UPDATE "thumbnailVotes" as tv SET "locked" = 0 FROM "thumbnails" t WHERE tv."UUID" = t."UUID" AND tv."UUID" != ? AND t."videoID" = ?`, [UUID, videoID]);
}
}
})()]);
QueryCacher.clearBrandingCache({ videoID, hashedVideoID, service });
res.status(200).send("OK");
if (errorCode) {
res.status(errorCode).send();
} else {
res.status(200).send("OK");
}
lock.unlock();
} catch (e) {
Logger.error(e as string);
res.status(500).send("Internal Server Error");
}
}
function sendNewUserWebhook(webhookUrl: string, hashedUserID: HashedUserID, videoID: VideoID, userAgent: any, req: Request, videoDuration: number, title: TitleSubmission, footerText: string | undefined) {
if (!webhookUrl) return;
axios.post(webhookUrl, {
"embeds": [{
"title": hashedUserID,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**User Agent**: ${userAgent}\
\n**Sent User Agent**: ${req.body.userAgent}\
\n**Real User Agent**: ${req.headers["user-agent"]}\
\n**Video Duration**: ${videoDuration}\
\n**Title**: ${title?.title}`,
"color": 1184701,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
"footer": footerText === undefined ? null : {
"text": footerText,
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
/**
* Finds an existing vote; if one is found for a different submission, undoes it and points it to the new submission.
* If no existing vote exists, adds one.
*/
async function handleExistingVotes(type: BrandingType, videoID: VideoID,
hashedUserID: HashedUserID, UUID: BrandingUUID, hashedIP: HashedIP, voteType: number): Promise<ExistingVote> {
hashedUserID: HashedUserID, UUID: BrandingUUID, hashedIP: HashedIP, voteType: BrandingVoteType) {
const table = type === BrandingType.Title ? `"titleVotes"` : `"thumbnailVotes"`;
const idsDealtWith: BrandingUUID[] = [];
const existingVote = await privateDB.prepare("get", `SELECT "id", "UUID", "type" from ${table} where "videoID" = ? AND "userID" = ?`, [videoID, hashedUserID]);
if (existingVote && existingVote.UUID !== UUID) {
if (existingVote.type === 1) {
await db.prepare("run", `UPDATE ${table} SET "votes" = "votes" - 1 WHERE "UUID" = ?`, [existingVote.UUID]);
// Either votes of the same type, or on the same submission (undo a downvote)
const existingVotes = await privateDB.prepare("all", `SELECT "id", "UUID", "type" from ${table} where "videoID" = ? AND "userID" = ? AND ("type" = ? OR "UUID" = ?)`, [videoID, hashedUserID, voteType, UUID]) as ExistingVote[];
if (existingVotes.length > 0) {
// Only one upvote per video
for (const existingVote of existingVotes) {
// For downvotes, only undo for this specific submission (multiple downvotes on one submission not allowed)
if (voteType === BrandingVoteType.Downvote && existingVote.UUID !== UUID) continue;
switch (existingVote.type) {
case BrandingVoteType.Upvote:
// Old case where there are duplicate rows in private db
if (!idsDealtWith.includes(existingVote.UUID)) {
idsDealtWith.push(existingVote.UUID);
await db.prepare("run", `UPDATE ${table} SET "votes" = "votes" - 1 WHERE "UUID" = ?`, [existingVote.UUID]);
}
await privateDB.prepare("run", `DELETE FROM ${table} WHERE "id" = ?`, [existingVote.id]);
break;
case BrandingVoteType.Downvote: {
await db.prepare("run", `UPDATE ${table} SET "downvotes" = "downvotes" - 1 WHERE "UUID" = ?`, [existingVote.UUID]);
await privateDB.prepare("run", `DELETE FROM ${table} WHERE "id" = ?`, [existingVote.id]);
break;
}
}
}
await privateDB.prepare("run", `UPDATE ${table} SET "type" = ?, "UUID" = ? WHERE "id" = ?`, [voteType, UUID, existingVote.id]);
} else if (!existingVote) {
await privateDB.prepare("run", `INSERT INTO ${table} ("videoID", "UUID", "userID", "hashedIP", "type") VALUES (?, ?, ?, ?, ?)`,
[videoID, UUID, hashedUserID, hashedIP, voteType]);
}
return existingVote;
await privateDB.prepare("run", `INSERT INTO ${table} ("videoID", "UUID", "userID", "hashedIP", "type") VALUES (?, ?, ?, ?, ?)`,
[videoID, UUID, hashedUserID, hashedIP, voteType]);
}
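When the private table holds duplicate upvote rows (the "old case" noted in the loop above), the public total must still be decremented only once per submission. The guard in isolation, simplified; the real loop also deletes the matching private rows, and the `dedupeVoteUUIDs` name is hypothetical:

```typescript
// Collect each UUID at most once so the public vote total is decremented only once per submission.
function dedupeVoteUUIDs(existingVotes: { UUID: string }[]): string[] {
    const idsDealtWith: string[] = [];
    for (const vote of existingVotes) {
        if (!idsDealtWith.includes(vote.UUID)) {
            idsDealtWith.push(vote.UUID);
            // real code: UPDATE "titleVotes" SET "votes" = "votes" - 1 WHERE "UUID" = ?
        }
    }
    return idsDealtWith;
}
```

Without the guard, two stale private rows for the same UUID would subtract 2 from a total that was only ever incremented by 1.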
/**
* Only called if an existing vote exists.
* Will update public vote totals and locked status.
*/
async function updateVoteTotals(type: BrandingType, existingVote: ExistingVote, UUID: BrandingUUID, isVip: boolean): Promise<void> {
if (!existingVote) return;
async function updateVoteTotals(type: BrandingType, UUID: BrandingUUID, userID: HashedUserID, shouldLock: boolean, downvote: boolean): Promise<void> {
const table = type === BrandingType.Title ? `"titleVotes"` : `"thumbnailVotes"`;
const table2 = type === BrandingType.Title ? `"titles"` : `"thumbnails"`;
// Don't upvote if we vote on the same submission
if (!existingVote || existingVote.UUID !== UUID) {
if (downvote) {
// Only downvote if it is not their submission
const isUsersSubmission = (await db.prepare("get", `SELECT "userID" FROM ${table2} WHERE "UUID" = ?`, [UUID]))?.userID === userID;
if (!isUsersSubmission) {
await db.prepare("run", `UPDATE ${table} SET "downvotes" = "downvotes" + 1 WHERE "UUID" = ?`, [UUID]);
}
} else {
await db.prepare("run", `UPDATE ${table} SET "votes" = "votes" + 1 WHERE "UUID" = ?`, [UUID]);
if (type === BrandingType.Title) {
const votedSubmitterUserID = (await db.prepare("get", `SELECT "userID" FROM ${table2} WHERE "UUID" = ?`, [UUID]))?.userID;
if (votedSubmitterUserID) {
await verifyOldSubmissions(votedSubmitterUserID, await getVerificationValue(votedSubmitterUserID, await isUserVIP(votedSubmitterUserID)));
}
}
}
}
if (shouldLock) {
if (downvote) {
await db.prepare("run", `UPDATE ${table} SET "removed" = 1 WHERE "UUID" = ?`, [UUID]);
} else {
await db.prepare("run", `UPDATE ${table} SET "locked" = 1, "removed" = 0 WHERE "UUID" = ?`, [UUID]);
}
}
}
export async function getVerificationValue(hashedUserID: HashedUserID, isVip: boolean): Promise<number> {
const voteSum = await db.prepare("get", `SELECT SUM("maxVotes") as "voteSum" FROM (SELECT MAX("votes") as "maxVotes" from "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "titles"."userID" = ? GROUP BY "titles"."videoID") t`, [hashedUserID]);
if (voteSum.voteSum >= 1 || isVip || await hasFeature(hashedUserID, Feature.DeArrowTitleSubmitter)) {
return 0;
} else {
return -1;
}
}
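The SQL above sums, per video, the user's highest-voted title. A hypothetical in-memory equivalent of that aggregate (ignoring the VIP and feature-flag overrides, which short-circuit to 0; the function name is illustrative, not part of the codebase):

```typescript
// For each video take the best-voted title, sum those maxima; a total of at
// least 1 marks the user as verified (0), otherwise unverified (-1).
function verificationValueSketch(votesByVideo: Map<string, number[]>): number {
    let voteSum = 0;
    for (const votes of votesByVideo.values()) {
        voteSum += Math.max(...votes);
    }
    return voteSum >= 1 ? 0 : -1;
}
```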
export async function verifyOldSubmissions(hashedUserID: HashedUserID, verification: number): Promise<void> {
if (verification >= 0) {
const unverifiedSubmissions = await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service" FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID" WHERE "titles"."userID" = ? AND "titleVotes"."verification" < ? GROUP BY "videoID", "hashedVideoID", "service"`, [hashedUserID, verification]);
if (unverifiedSubmissions.length > 0) {
for (const submission of unverifiedSubmissions) {
QueryCacher.clearBrandingCache({
videoID: submission.videoID,
hashedVideoID: submission.hashedVideoID,
service: submission.service
});
}
await db.prepare("run", `UPDATE "titleVotes" as tv SET "verification" = ? FROM "titles" WHERE "titles"."UUID" = tv."UUID" AND "titles"."userID" = ? AND tv."verification" < ?`, [verification, hashedUserID, verification]);
}
}
}
async function canSubmitOriginal(hashedUserID: HashedUserID, isVip: boolean): Promise<boolean> {
const upvotedThumbs = (await db.prepare("get", `SELECT count(*) as "upvotedThumbs" FROM "thumbnails" JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" WHERE "thumbnailVotes"."votes" > 0 AND "thumbnails"."original" = 0 AND "thumbnails"."userID" = ?`, [hashedUserID])).upvotedThumbs;
const customThumbs = (await db.prepare("get", `SELECT count(*) as "customThumbs" FROM "thumbnails" JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" WHERE "thumbnailVotes"."votes" >= 0 AND "thumbnails"."original" = 0 AND "thumbnails"."userID" = ?`, [hashedUserID])).customThumbs;
const originalThumbs = (await db.prepare("get", `SELECT count(*) as "originalThumbs" FROM "thumbnails" JOIN "thumbnailVotes" ON "thumbnails"."UUID" = "thumbnailVotes"."UUID" WHERE "thumbnailVotes"."votes" >= 0 AND "thumbnails"."original" = 1 AND "thumbnails"."userID" = ?`, [hashedUserID])).originalThumbs;
return isVip || (upvotedThumbs > 1 && customThumbs > 1 && originalThumbs / customThumbs < 0.4);
}
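The thumbnail permission check above boils down to a ratio heuristic. A minimal standalone sketch of it (the function name is illustrative, not part of the codebase):

```typescript
// Sketch of the original-thumbnail heuristic in canSubmitOriginal: non-VIPs
// need a small track record (more than one upvoted and more than one custom
// thumbnail) and their "original" submissions must stay under 40% of their
// custom ones.
function canSubmitOriginalSketch(isVip: boolean, upvotedThumbs: number, customThumbs: number, originalThumbs: number): boolean {
    return isVip || (upvotedThumbs > 1 && customThumbs > 1 && originalThumbs / customThumbs < 0.4);
}
```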
async function sendWebhooks(videoID: VideoID, UUID: BrandingUUID, voteType: BrandingVoteType, wasWarned: boolean, vipAction: boolean) {
const currentSubmission = await db.prepare(
"get",
`SELECT
"titles"."title",
"titleVotes"."locked",
"titles"."userID",
"titleVotes"."votes"-"titleVotes"."downvotes"+"titleVotes"."verification" AS "score"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."UUID" = ?`,
[UUID]);
if (wasWarned && voteType === BrandingVoteType.Upvote) {
const data = await getVideoDetails(videoID);
axios.post(config.discordDeArrowWarnedWebhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**Submitted title:** ${currentSubmission.title}\
\n\n**Submitted by:** ${currentSubmission.userID}`,
"color": 10813440,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
// Unlocked title getting more upvotes than the locked one
if (voteType === BrandingVoteType.Upvote) {
const lockedSubmission = await db.prepare(
"get",
`SELECT
"titles"."title",
"titles"."userID",
"titleVotes"."votes"-"titleVotes"."downvotes"+"titleVotes"."verification" AS "score"
FROM "titles" JOIN "titleVotes" ON "titles"."UUID" = "titleVotes"."UUID"
WHERE "titles"."videoID" = ?
AND "titles"."UUID" != ?
AND "titleVotes"."locked" = 1`,
[videoID, UUID]);
// Time to warn that there may be an issue
if (lockedSubmission && currentSubmission.score - lockedSubmission.score > 2) {
const usernameRow = await db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ?`, [lockedSubmission.userID]);
const data = await getVideoDetails(videoID);
axios.post(config.discordDeArrowLockedWebhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**${lockedSubmission.score}** score vs **${currentSubmission.score}**\
\n\n**Locked title:** ${lockedSubmission.title}\
\n**New title:** ${currentSubmission.title}\
\n\n**Submitted by:** ${usernameRow?.userName ?? ""}\n${lockedSubmission.userID}`,
"color": 10813440,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
}
// Downvotes on locked title
if (voteType === BrandingVoteType.Downvote && currentSubmission.locked === 1) {
const usernameRow = await db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ?`, [currentSubmission.userID]);
const data = await getVideoDetails(videoID);
axios.post(config.discordDeArrowLockedWebhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `Locked title ${vipAction ? "was removed by a VIP" : `with **${currentSubmission.score}** score received a downvote`}\
\n\n**Locked title:** ${currentSubmission.title}\
\n**Submitted by:** ${usernameRow?.userName ?? ""}\n${currentSubmission.userID}`,
"color": 10813440,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
}
async function checkForWrongVideoDuration(videoID: VideoID, duration: number): Promise<boolean> {
const apiVideoDetails = await getVideoDetails(videoID, true);
const apiDuration = apiVideoDetails?.duration;
return apiDuration && apiDuration > 2 && duration && duration > 2 && Math.abs(apiDuration - duration) > 3;
}
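checkForWrongVideoDuration reduces to a tolerance test; a minimal sketch of that predicate (standalone, not the function above):

```typescript
// Durations are only compared when both are known and non-trivial (> 2s);
// a gap of more than 3 seconds is treated as a mismatch (e.g. server-side
// ad injection shifting the reported length).
function durationsMismatch(apiDuration: number | undefined, duration: number | undefined): boolean {
    return Boolean(apiDuration && apiDuration > 2
        && duration && duration > 2
        && Math.abs(apiDuration - duration) > 3);
}
```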

src/routes/postCasual.ts (new file)

@@ -0,0 +1,153 @@
import { Request, Response } from "express";
import { config } from "../config";
import { db, privateDB } from "../databases/databases";
import { BrandingUUID, CasualCategory, CasualVoteSubmission } from "../types/branding.model";
import { HashedIP, IPAddress, Service, VideoID } from "../types/segments.model";
import { HashedUserID } from "../types/user.model";
import { getHashCache } from "../utils/getHashCache";
import { getIP } from "../utils/getIP";
import { getService } from "../utils/getService";
import { Logger } from "../utils/logger";
import crypto from "crypto";
import { QueryCacher } from "../utils/queryCacher";
import { acquireLock } from "../utils/redisLock";
import { checkBanStatus } from "../utils/checkBan";
import { canSubmitDeArrow } from "../utils/permissions";
import { isRequestInvalid } from "../utils/requestValidator";
import { parseUserAgent } from "../utils/userAgent";
interface ExistingVote {
UUID: BrandingUUID;
type: number;
}
export async function postCasual(req: Request, res: Response) {
const { videoID, userID, downvote } = req.body as CasualVoteSubmission;
const userAgent = req.body.userAgent ?? parseUserAgent(req.get("user-agent")) ?? "";
let categories = req.body.categories as CasualCategory[];
const title = (req.body.title as string)?.toLowerCase();
const service = getService(req.body.service);
if (downvote) {
categories = ["downvote" as CasualCategory];
} else if (!Array.isArray(categories) || !categories.every((c) => config.casualCategoryList.includes(c))) {
return res.status(400).send("Invalid category");
}
if (!videoID || !userID || userID.length < 30 || !service || !categories || !Array.isArray(categories)) {
return res.status(400).send("Bad Request");
}
if (isRequestInvalid({
userID,
videoID,
userAgent,
userAgentHeader: req.headers["user-agent"],
casualCategories: categories,
service,
endpoint: "dearrow-postCasual",
})) {
Logger.warn(`Casual vote rejected by request validator: ${userAgent} ${req.headers["user-agent"]} ${categories} ${service} ${videoID}`);
return res.status(200).send("OK");
}
try {
const hashedUserID = await getHashCache(userID);
const hashedVideoID = await getHashCache(videoID, 1);
const hashedIP = await getHashCache(getIP(req) + config.globalSalt as IPAddress);
const isBanned = await checkBanStatus(hashedUserID, hashedIP);
const permission = await canSubmitDeArrow(hashedUserID);
if (!permission.canSubmit) {
res.status(403).send(permission.reason);
return;
}
const lock = await acquireLock(`postCasual:${videoID}.${hashedUserID}`);
if (!lock.status) {
res.status(429).send("Vote already in progress");
return;
}
if (isBanned) {
lock.unlock();
return res.status(200).send("OK");
}
let titleID = 0;
if (title) {
// See if title needs to be added
const titles = await db.prepare("all", `SELECT "title", "id" from "casualVoteTitles" WHERE "videoID" = ? AND "service" = ? ORDER BY "id"`, [videoID, service]) as { title: string, id: number }[];
if (titles.length > 0) {
const existingTitle = titles.find((t) => t.title === title);
if (existingTitle) {
titleID = existingTitle.id;
} else {
titleID = titles[titles.length - 1].id + 1;
await db.prepare("run", `INSERT INTO "casualVoteTitles" ("videoID", "service", "hashedVideoID", "id", "title") VALUES (?, ?, ?, ?, ?)`, [videoID, service, hashedVideoID, titleID, title]);
}
} else {
await db.prepare("run", `INSERT INTO "casualVoteTitles" ("videoID", "service", "hashedVideoID", "id", "title") VALUES (?, ?, ?, ?, ?)`, [videoID, service, hashedVideoID, titleID, title]);
}
} else {
const titles = await db.prepare("all", `SELECT "title", "id" from "casualVoteTitles" WHERE "videoID" = ? AND "service" = ? ORDER BY "id"`, [videoID, service]) as { title: string, id: number }[];
if (titles.length > 0) {
titleID = titles[titles.length - 1].id;
}
}
const now = Date.now();
for (const category of categories) {
const existingUUID = (await db.prepare("get", `SELECT "UUID" from "casualVotes" where "videoID" = ? AND "service" = ? AND "titleID" = ? AND "category" = ?`, [videoID, service, titleID, category]))?.UUID;
const UUID = existingUUID || crypto.randomUUID();
const alreadyVotedTheSame = await handleExistingVotes(videoID, service, titleID, hashedUserID, hashedIP, category, downvote, now);
if (existingUUID) {
if (!alreadyVotedTheSame) {
await db.prepare("run", `UPDATE "casualVotes" SET "upvotes" = "upvotes" + 1 WHERE "UUID" = ?`, [UUID]);
}
} else {
await db.prepare("run", `INSERT INTO "casualVotes" ("videoID", "service", "titleID", "hashedVideoID", "timeSubmitted", "UUID", "category", "upvotes") VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
[videoID, service, titleID, hashedVideoID, now, UUID, category, 1]);
}
}
QueryCacher.clearBrandingCache({ videoID, hashedVideoID, service });
res.status(200).send("OK");
lock.unlock();
} catch (e) {
Logger.error(e as string);
res.status(500).send("Internal Server Error");
}
}
async function handleExistingVotes(videoID: VideoID, service: Service, titleID: number,
hashedUserID: HashedUserID, hashedIP: HashedIP, category: CasualCategory, downvote: boolean, now: number): Promise<boolean> {
const existingVote = await privateDB.prepare("get", `SELECT "UUID" from "casualVotes" WHERE "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ? AND "category" = ?`, [videoID, service, titleID, hashedUserID, category]) as ExistingVote;
if (existingVote) {
return true;
} else {
if (downvote) {
// Remove upvotes for all categories on this video
const existingUpvotes = await privateDB.prepare("all", `SELECT "category" from "casualVotes" WHERE "category" != 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ?`, [videoID, service, titleID, hashedUserID]);
for (const existingUpvote of existingUpvotes) {
await db.prepare("run", `UPDATE "casualVotes" SET "upvotes" = "upvotes" - 1 WHERE "videoID" = ? AND "service" = ? AND "titleID" = ? AND "category" = ?`, [videoID, service, titleID, existingUpvote.category]);
await privateDB.prepare("run", `DELETE FROM "casualVotes" WHERE "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ? AND "category" = ?`, [videoID, service, titleID, hashedUserID, existingUpvote.category]);
}
} else {
// Undo a downvote if it exists
const existingDownvote = await privateDB.prepare("get", `SELECT "UUID" from "casualVotes" WHERE "category" = 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ?`, [videoID, service, titleID, hashedUserID]) as ExistingVote;
if (existingDownvote) {
await db.prepare("run", `UPDATE "casualVotes" SET "upvotes" = "upvotes" - 1 WHERE "category" = 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ?`, [videoID, service, titleID]);
await privateDB.prepare("run", `DELETE FROM "casualVotes" WHERE "category" = 'downvote' AND "videoID" = ? AND "service" = ? AND "titleID" = ? AND "userID" = ?`, [videoID, service, titleID, hashedUserID]);
}
}
}
await privateDB.prepare("run", `INSERT INTO "casualVotes" ("videoID", "service", "titleID", "userID", "hashedIP", "category", "timeSubmitted") VALUES (?, ?, ?, ?, ?, ?, ?)`,
[videoID, service, titleID, hashedUserID, hashedIP, category, now]);
return false;
}
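handleExistingVotes implements a simple transition rule over a user's votes on one title. A hypothetical pure model of that rule (the real code persists these transitions in the public and private databases):

```typescript
// A user's votes are a set of categories; "downvote" is exclusive with all
// other categories, and re-casting an existing vote is a no-op.
function applyCasualVote(existing: Set<string>, category: string): Set<string> {
    const next = new Set(existing);
    if (next.has(category)) return next;        // already voted the same way
    if (category === "downvote") next.clear();  // downvote removes all upvotes
    else next.delete("downvote");               // an upvote undoes a downvote
    next.add(category);
    return next;
}
```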


@@ -23,7 +23,7 @@ export async function postClearCache(req: Request, res: Response): Promise<Respo
if (invalidFields.length !== 0) {
// invalid request
const fields = invalidFields.join(", ");
return res.status(400).send(`No valid ${fields} field(s) provided`);
}


@@ -9,7 +9,7 @@ import { getIP } from "../utils/getIP";
import { getFormattedTime } from "../utils/getFormattedTime";
import { dispatchEvent } from "../utils/webhookUtils";
import { Request, Response } from "express";
import { ActionType, Category, HashedIP, IncomingSegment, IPAddress, SegmentUUID, Service, VideoDuration, VideoID } from "../types/segments.model";
import { deleteLockCategories } from "./deleteLockCategories";
import { QueryCacher } from "../utils/queryCacher";
import { getReputation } from "../utils/reputation";
@@ -20,10 +20,12 @@ import { parseUserAgent } from "../utils/userAgent";
import { getService } from "../utils/getService";
import axios from "axios";
import { vote } from "./voteOnSponsorTime";
import { canSubmit, canSubmitGlobal } from "../utils/permissions";
import { getVideoDetails, videoDetails } from "../utils/getVideoDetails";
import * as youtubeID from "../utils/youtubeID";
import { banUser } from "./shadowBanUser";
import { acquireLock } from "../utils/redisLock";
import { checkBanStatus } from "../utils/checkBan";
import { isRequestInvalid } from "../utils/requestValidator";
type CheckResult = {
pass: boolean,
@@ -128,14 +130,19 @@ async function autoModerateSubmission(apiVideoDetails: videoDetails,
// return false on undefined or 0
if (!duration) return false;
if (apiDuration && apiDuration > 2 && duration && duration > 2 && Math.abs(apiDuration - duration) > 3) {
// YouTube server-side ad injection might be active, reject
return "YouTube is currently testing a new anti-adblock technique called server-side ad-injection. This causes skips and submissions to be offset by the duration of the ad. It seems that you are affected by this A/B test, so until a fix is developed, we cannot accept submissions from your device due to them potentially being inaccurate.";
}
const segments = submission.segments;
// map all times to float array
const allSegmentTimes = segments.filter((s) => s.actionType !== ActionType.Chapter)
.map(segment => [parseFloat(segment.segment[0]), parseFloat(segment.segment[1])]);
// add previous submissions by this user
const allSubmittedByUser = await db.prepare("all", `SELECT "startTime", "endTime" FROM "sponsorTimes" WHERE "userID" = ? AND "videoID" = ? AND "service" = ? AND "votes" > -1 AND "actionType" != 'chapter' AND "hidden" = 0`
, [submission.userID, submission.videoID, submission.service]) as { startTime: string, endTime: string }[];
if (allSubmittedByUser) {
//add segments the user has previously submitted
@@ -157,28 +164,23 @@ async function autoModerateSubmission(apiVideoDetails: videoDetails,
}
async function checkUserActiveWarning(userID: HashedUserID): Promise<CheckResult> {
const warning = await db.prepare("get",
`SELECT "reason"
FROM warnings
WHERE "userID" = ? AND enabled = 1 AND type = 0
ORDER BY "issueTime" DESC`,
[userID],
) as {reason: string};
if (warning != null) {
const defaultMessage = "Submission rejected due to a tip from a moderator. This means that we noticed you were making some common mistakes"
+ " that are not malicious, and we just want to clarify the rules. "
+ "Could you please send a message in discord.gg/SponsorBlock or matrix.to/#/#sponsor:ajay.app so we can further help you? "
+ `Your userID is ${userID}.`;
return {
pass: false,
errorMessage: defaultMessage + (warning.reason?.length > 0 ? `\n\nTip message: '${warning.reason}'` : ""),
errorCode: 403
};
}
@@ -193,12 +195,17 @@ async function checkInvalidFields(videoID: VideoID, userID: UserID, hashedUserID
if (typeof videoID !== "string" || videoID?.length == 0) {
invalidFields.push("videoID");
}
if (service === Service.YouTube) {
if (config.mode !== "test") {
const sanitizedVideoID = youtubeID.validate(videoID) ? videoID : youtubeID.sanitize(videoID);
if (!youtubeID.validate(sanitizedVideoID)) {
invalidFields.push("videoID");
errors.push("YouTube videoID could not be extracted");
}
}
} else if (service !== Service.Spotify) {
invalidFields.push("service");
errors.push("Service is not supported");
}
const minLength = config.minUserIDLength;
if (typeof userID !== "string" || userID?.length < minLength) {
@@ -236,11 +243,11 @@ async function checkInvalidFields(videoID: VideoID, userID: UserID, hashedUserID
if (invalidFields.length !== 0) {
// invalid request
const formattedFields = invalidFields.join(", ");
const formattedErrors = errors.join(". ");
return {
pass: false,
errorMessage: `No valid ${formattedFields}. ${formattedErrors}`,
errorCode: 400
};
}
@@ -278,7 +285,7 @@ async function checkEachSegmentValid(rawIP: IPAddress, paramUserID: UserID, user
errorMessage:
`Users have voted that all the segments required for this video have already been submitted for the following category: ` +
`'${segments[i].category}'\n` +
`${lockedCategoryList[lockIndex].reason?.length !== 0 ? `\nReason: '${lockedCategoryList[lockIndex].reason}'\n` : ""}` +
`You may need to refresh if you don't see the segments.\n` +
`${(segments[i].category === "sponsor" ? "\nMaybe the segment you are submitting is a different category that you have not enabled and is not a sponsor. " +
"Categories that aren't sponsor, such as self-promotion can be enabled in the options.\n" : "")}` +
@@ -314,7 +321,7 @@ async function checkEachSegmentValid(rawIP: IPAddress, paramUserID: UserID, user
}
if (!(isVIP || isTempVIP) && segments[i].category === "sponsor"
&& segments[i].actionType === ActionType.Skip && (endTime - startTime) < 1) {
// Too short
return { pass: false, errorMessage: "Segments must be longer than 1 second long", errorCode: 400 };
}
@@ -323,23 +330,26 @@ async function checkEachSegmentValid(rawIP: IPAddress, paramUserID: UserID, user
const duplicateCheck2Row = await db.prepare("get", `SELECT "UUID" FROM "sponsorTimes" WHERE "startTime" = ?
and "endTime" = ? and "category" = ? and "actionType" = ? and "description" = ? and "videoID" = ? and "service" = ?`, [startTime, endTime, segments[i].category, segments[i].actionType, segments[i].description, videoID, service]);
if (duplicateCheck2Row) {
if (segments[i].actionType === ActionType.Full) {
// Forward as vote
await vote(rawIP, duplicateCheck2Row.UUID, paramUserID, 1);
segments[i].ignoreSegment = true;
continue;
} else {
return { pass: false, errorMessage: "Segment has already been submitted before.", errorCode: 409 };
}
}
}
if (segments.every((s) => s.ignoreSegment && s.actionType !== ActionType.Full)) {
return { pass: false, errorMessage: "Segment has already been submitted before.", errorCode: 409 };
}
return CHECK_PASS;
}
async function checkByAutoModerator(videoID: VideoID, userID: HashedUserID, segments: IncomingSegment[], service: Service, apiVideoDetails: videoDetails, videoDuration: number): Promise<CheckResult> {
// Auto moderator check
if (service == Service.YouTube && apiVideoDetails) {
const autoModerateResult = await autoModerateSubmission(apiVideoDetails, { videoID, userID, segments, service, videoDuration });
if (autoModerateResult) {
return {
@@ -355,6 +365,15 @@ async function checkByAutoModerator(videoID: VideoID, userID: HashedUserID, segm
async function updateDataIfVideoDurationChange(videoID: VideoID, service: Service, videoDuration: VideoDuration, videoDurationParam: VideoDuration) {
let lockedCategoryList = await db.prepare("all", 'SELECT category, "actionType", reason from "lockCategories" where "videoID" = ? AND "service" = ?', [videoID, service]);
if (service === Service.Spotify) {
// Don't handle changed durations
return {
videoDuration,
apiVideoDetails: null,
lockedCategoryList
};
}
const previousSubmissions = await db.prepare("all",
`SELECT "videoDuration", "UUID"
FROM "sponsorTimes"
@@ -384,9 +403,12 @@ async function updateDataIfVideoDurationChange(videoID: VideoID, service: Servic
// Only treat as difference if both the api duration and submitted duration have changed
if (videoDurationChanged(videoDuration) && (!videoDurationParam || videoDurationChanged(videoDurationParam))) {
// Hide all previous submissions
await db.prepare("run", `UPDATE "sponsorTimes" SET "hidden" = 1
WHERE "videoID" = ? AND "service" = ? AND "videoDuration" != ?
AND "hidden" = 0 AND "shadowHidden" = 0 AND
"actionType" != 'full' AND "votes" > -2`,
[videoID, service, videoDuration]);
lockedCategoryList = [];
deleteLockCategories(videoID, null, null, service).catch((e) => Logger.error(`deleting lock categories: ${e}`));
}
@@ -497,6 +519,22 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
}
const userID: HashedUserID = await getHashCache(paramUserID);
const matchedRule = isRequestInvalid({
userAgent,
userAgentHeader: req.headers["user-agent"],
videoDuration,
videoID,
userID: paramUserID,
service,
segments,
endpoint: "sponsorblock-postSkipSegments"
});
if (matchedRule !== null) {
sendNewUserWebhook(config.discordRejectedNewUserWebhookURL, userID, videoID, userAgent, req, videoDurationParam, matchedRule);
Logger.warn(`Sponsorblock submission rejected by request validator: ${userID} ${videoID} ${videoDurationParam} ${userAgent} ${req.headers["user-agent"]}`);
return res.status(200).send("OK");
}
const invalidCheckResult = await checkInvalidFields(videoID, paramUserID, userID, segments, videoDurationParam, userAgent, service);
if (!invalidCheckResult.pass) {
return res.status(invalidCheckResult.errorCode).send(invalidCheckResult.errorMessage);
@@ -508,36 +546,52 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
return res.status(userWarningCheckResult.errorCode).send(userWarningCheckResult.errorMessage);
}
const lock = await acquireLock(`postSkipSegment:${videoID}.${userID}`);
if (!lock.status) {
res.status(429).send("Submission already in progress");
return;
}
try {
const isVIP = (await isUserVIP(userID));
const isTempVIP = (await isUserTempVIP(userID, videoID));
const rawIP = getIP(req);
const newData = await updateDataIfVideoDurationChange(videoID, service, videoDuration, videoDurationParam);
videoDuration = newData.videoDuration;
const { lockedCategoryList, apiVideoDetails } = newData;
// Check if all submissions are correct
const segmentCheckResult = await checkEachSegmentValid(rawIP, paramUserID, userID, videoID, segments, service, isVIP, isTempVIP, lockedCategoryList);
if (!segmentCheckResult.pass) {
lock.unlock();
return res.status(segmentCheckResult.errorCode).send(segmentCheckResult.errorMessage);
}
if (!(isVIP || isTempVIP)) {
const autoModerateCheckResult = await checkByAutoModerator(videoID, userID, segments, service, apiVideoDetails, videoDurationParam);
if (!autoModerateCheckResult.pass) {
return res.status(autoModerateCheckResult.errorCode).send(autoModerateCheckResult.errorMessage);
}
}
const permission = await canSubmitGlobal(userID);
if (!permission.canSubmit) {
lock.unlock();
Logger.warn(`New user trying to submit: ${userID} ${videoID} ${Object.keys(segments?.[0] ?? {})} ${Object.keys(req.query)} ${videoDurationParam} ${userAgent} ${req.headers["user-agent"]}`);
return res.status(403).send(permission.reason);
} else if (permission.newUser) {
sendNewUserWebhook(config.discordNewUserWebhookURL, userID, videoID, userAgent, req, videoDurationParam, undefined);
}
// Will be filled when submitting
const UUIDs = [];
const newSegments = [];
//hash the ip 5000 times so no one can get it from the database
const hashedIP = await getHashCache(rawIP + config.globalSalt) as HashedIP;
const timeSubmitted = Date.now();
// const rateLimitCheckResult = checkRateLimit(userID, videoID, service, timeSubmitted, hashedIP);
@@ -546,22 +600,14 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
// }
//check to see if this user is shadowbanned
const isBanned = await checkBanStatus(userID, hashedIP);
const startingVotes = 0;
const reputation = await getReputation(userID);
for (const segmentInfo of segments) {
// Full segments are always rejected since there can only be one, so shadow hide wouldn't work
if (segmentInfo.ignoreSegment
|| (isBanned && segmentInfo.actionType === ActionType.Full)) {
continue;
}
@@ -578,17 +624,19 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
("videoID", "startTime", "endTime", "votes", "locked", "UUID", "userID", "timeSubmitted", "views", "category", "actionType", "service", "videoDuration", "reputation", "shadowHidden", "hashedVideoID", "userAgent", "description")
VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`, [
videoID, segmentInfo.segment[0], segmentInfo.segment[1], startingVotes, startingLocked, UUID, userID, timeSubmitted, 0
, segmentInfo.category, segmentInfo.actionType, service, videoDuration, reputation, isBanned ? 1 : 0, hashedVideoID, userAgent, segmentInfo.description
],
);
//add to private db as well
await privateDB.prepare("run", `INSERT INTO "sponsorTimes" VALUES(?, ?, ?, ?)`, [videoID, hashedIP, timeSubmitted, service]);
if (service === Service.YouTube) {
await db.prepare("run", `INSERT INTO "videoInfo" ("videoID", "channelID", "title", "published")
SELECT ?, ?, ?, ?
WHERE NOT EXISTS (SELECT 1 FROM "videoInfo" WHERE "videoID" = ?)`, [
videoID, apiVideoDetails?.authorId || "", apiVideoDetails?.title || "", apiVideoDetails?.published || 0, videoID]);
}
// Clear redis cache for this video
QueryCacher.clearSegmentCache({
@@ -600,6 +648,7 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
} catch (err) {
//a DB change probably occurred
Logger.error(`Error when putting sponsorTime in the DB: ${videoID}, ${segmentInfo.segment[0]}, ${segmentInfo.segment[1]}, ${userID}, ${segmentInfo.category}. ${err}`);
lock.unlock();
return res.sendStatus(500);
}
@@ -610,15 +659,52 @@ export async function postSkipSegments(req: Request, res: Response): Promise<Res
segment: segmentInfo.segment,
});
}
for (let i = 0; i < segments.length; i++) {
sendWebhooks(apiVideoDetails, userID, videoID, UUIDs[i], segments[i], service).catch((e) => Logger.error(`call send webhooks ${e}`));
}
return res.json(newSegments);
} catch (err) {
Logger.error(err as string);
return res.sendStatus(500);
} finally {
lock.unlock();
}
}
for (let i = 0; i < segments.length; i++) {
sendWebhooks(apiVideoDetails, userID, videoID, UUIDs[i], segments[i], service).catch((e) => Logger.error(`call send webhooks ${e}`));
}
return res.json(newSegments);
function sendNewUserWebhook(webhookUrl: string, userID: HashedUserID, videoID: any, userAgent: any, req: Request, videoDurationParam: VideoDuration, ruleName: string | undefined) {
if (!webhookUrl) return;
axios.post(webhookUrl, {
"embeds": [{
"title": userID,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"description": `**User Agent**: ${userAgent}\
\n**Sent User Agent**: ${req.query.userAgent ?? req.body.userAgent}\
\n**Real User Agent**: ${req.headers["user-agent"]}\
\n**Video Duration**: ${videoDurationParam}`,
"color": 10813440,
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
"footer": {
"text": ruleName === undefined ? "Caught by permission check" : `Caught by rule '${ruleName}'`,
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
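The webhook above builds its Discord embed inline inside the HTTP call. As a rough standalone sketch (the helper name, interface, and simplified description are illustrative, not part of the codebase), the payload construction can be separated from the network request so it is testable on its own:

```typescript
// Hypothetical helper: assembles the embed body sent to the Discord webhook.
// All names here are illustrative stand-ins, not from the actual codebase.
interface NewUserEmbed {
    title: string;
    url: string;
    description: string;
    color: number;
    footer: { text: string };
}

function buildNewUserEmbed(
    userID: string,
    videoID: string,
    userAgent: string,
    videoDuration: number,
    ruleName?: string,
): NewUserEmbed {
    return {
        title: userID,
        url: `https://www.youtube.com/watch?v=${videoID}`,
        // Simplified: the real embed also includes the sent and real user agents
        description: `**User Agent**: ${userAgent}\n**Video Duration**: ${videoDuration}`,
        color: 10813440,
        // Mirrors the footer logic above: an undefined rule means the permission check caught it
        footer: {
            text: ruleName === undefined
                ? "Caught by permission check"
                : `Caught by rule '${ruleName}'`,
        },
    };
}
```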
// Takes an array of arrays:


@@ -4,8 +4,8 @@ import { db } from "../databases/databases";
import { isUserVIP } from "../utils/isUserVIP";
import { getHashCache } from "../utils/getHashCache";
import { HashedUserID, UserID } from "../types/user.model";
import { config } from "../config";
import { generateWarningDiscord, warningData, dispatchEvent } from "../utils/webhookUtils";
import { WarningType } from "../types/warning.model";
type warningEntry = {
userID: HashedUserID,
@@ -15,12 +15,7 @@ type warningEntry = {
reason: string
}
function checkExpiredWarning(warning: warningEntry): boolean {
const MILLISECONDS_IN_HOUR = 3600000;
const now = Date.now();
const expiry = Math.floor(now - (config.hoursAfterWarningExpires * MILLISECONDS_IN_HOUR));
return warning.issueTime > expiry && !warning.enabled;
}
const MAX_EDIT_DELAY = 900000; // 15 mins
const getUsername = (userID: HashedUserID) => db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ?`, [userID], { useReplica: true });
@@ -32,6 +27,7 @@ export async function postWarning(req: Request, res: Response): Promise<Response
const issueTime = new Date().getTime();
const enabled: boolean = req.body.enabled ?? true;
const reason: string = req.body.reason ?? "";
const type: WarningType = req.body.type ?? WarningType.SponsorBlock;
if ((!issuerUserID && enabled) || (issuerUserID && !await isUserVIP(issuerUserID))) {
Logger.warn(`Permission violation: User ${issuerUserID} attempted to warn user ${userID}.`);
@@ -40,53 +36,61 @@ export async function postWarning(req: Request, res: Response): Promise<Response
let resultStatus = "";
if (enabled) {
const previousWarning = await db.prepare("get", 'SELECT * FROM "warnings" WHERE "userID" = ? AND "issuerUserID" = ?', [userID, issuerUserID]) as warningEntry;
if (!previousWarning) {
await db.prepare(
"run",
'INSERT INTO "warnings" ("userID", "issueTime", "issuerUserID", "enabled", "reason") VALUES (?, ?, ?, 1, ?)',
[userID, issueTime, issuerUserID, reason]
);
resultStatus = "issued to";
// check if warning is still within issue time and warning is not enabled
} else if (checkExpiredWarning(previousWarning) ) {
await db.prepare(
"run", 'UPDATE "warnings" SET "enabled" = 1, "reason" = ? WHERE "userID" = ? AND "issueTime" = ?',
[reason, userID, previousWarning.issueTime]
);
resultStatus = "re-enabled";
} else {
return res.sendStatus(409);
}
} else {
await db.prepare("run", 'UPDATE "warnings" SET "enabled" = 0 WHERE "userID" = ?', [userID]);
resultStatus = "removed from";
}
const targetUsername = await getUsername(userID) ?? null;
const issuerUsername = await getUsername(issuerUserID) ?? null;
const webhookData = {
target: {
userID,
username: targetUsername
},
issuer: {
userID: issuerUserID,
username: issuerUsername
},
reason
} as warningData;
try {
const warning = generateWarningDiscord(webhookData);
dispatchEvent("warning", warning);
} catch /* istanbul ignore next */ (err) {
Logger.error(`Error sending warning to Discord ${err}`);
}
if (enabled) {
if (!reason) {
return res.status(400).json({ "message": "Missing warning reason" });
}
const previousWarning = await db.prepare("get", 'SELECT * FROM "warnings" WHERE "userID" = ? AND "type" = ? AND "enabled" = 1', [userID, type]) as warningEntry;
return res.status(200).json({
message: `Warning ${resultStatus} user '${userID}'.`,
});
if (!previousWarning) {
await db.prepare(
"run",
'INSERT INTO "warnings" ("userID", "issueTime", "issuerUserID", "enabled", "reason", "type") VALUES (?, ?, ?, 1, ?, ?)',
[userID, issueTime, issuerUserID, reason, type]
);
resultStatus = "issued to";
// Allow a warning to be edited by the same VIP within 15 minutes of issuing
} else if (issuerUserID === previousWarning.issuerUserID && (Date.now() - MAX_EDIT_DELAY) < previousWarning.issueTime) {
await db.prepare(
"run", 'UPDATE "warnings" SET "reason" = ? WHERE "userID" = ? AND "issueTime" = ?',
[reason, userID, previousWarning.issueTime]
);
resultStatus = "edited for";
} else {
return res.sendStatus(409);
}
} else {
await db.prepare("run", 'UPDATE "warnings" SET "enabled" = 0, "disableTime" = ? WHERE "userID" = ? AND "type" = ? AND "enabled" = 1', [issueTime, userID, type]);
resultStatus = "removed from";
}
const targetUsername = await getUsername(userID) ?? null;
const issuerUsername = await getUsername(issuerUserID) ?? null;
const webhookData = {
target: {
userID,
username: targetUsername
},
issuer: {
userID: issuerUserID,
username: issuerUsername
},
reason
} as warningData;
try {
const warning = generateWarningDiscord(webhookData);
dispatchEvent("warning", warning);
} catch /* istanbul ignore next */ (err) {
Logger.error(`Error sending warning to Discord ${err}`);
}
return res.status(200).json({
message: `Warning ${resultStatus} user '${userID}'.`,
});
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
}
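Two time windows drive the control flow in this diff: the removed `checkExpiredWarning` re-enable window and the new 15-minute edit window for the same VIP. A minimal standalone sketch of both checks (function and parameter names are illustrative; `now` is passed explicitly instead of calling `Date.now()` so the logic is testable):

```typescript
const MILLISECONDS_IN_HOUR = 3600000;
const MAX_EDIT_DELAY = 900000; // 15 mins

// Old behaviour: a disabled warning could be re-enabled while still inside the expiry window.
function canReEnable(issueTime: number, enabled: boolean, hoursAfterExpiry: number, now: number): boolean {
    const expiry = now - hoursAfterExpiry * MILLISECONDS_IN_HOUR;
    return issueTime > expiry && !enabled;
}

// New behaviour: the same VIP may edit the warning reason within 15 minutes of issuing.
function canEdit(issuerUserID: string, previousIssuerUserID: string, issueTime: number, now: number): boolean {
    return issuerUserID === previousIssuerUserID && (now - MAX_EDIT_DELAY) < issueTime;
}
```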

src/routes/setConfig.ts Normal file

@@ -0,0 +1,48 @@
import { getHashCache } from "../utils/getHashCache";
import { db } from "../databases/databases";
import { Request, Response } from "express";
import { isUserVIP } from "../utils/isUserVIP";
import { UserID } from "../types/user.model";
import { Logger } from "../utils/logger";
interface SetConfigRequest extends Request {
body: {
userID: UserID;
key: string;
value: string;
}
}
const allowedConfigs = [
"old-submitter-block-date",
"max-users-per-minute",
"max-users-per-minute-dearrow"
];
export async function setConfig(req: SetConfigRequest, res: Response): Promise<Response> {
const { body: { userID, key, value } } = req;
if (!userID || !allowedConfigs.includes(key)) {
// invalid request
return res.sendStatus(400);
}
// hash the userID
const hashedUserID = await getHashCache(userID as UserID);
const isVIP = (await isUserVIP(hashedUserID));
if (!isVIP) {
// not authorized
return res.sendStatus(403);
}
try {
await db.prepare("run", `INSERT INTO "config" ("key", "value") VALUES(?, ?) ON CONFLICT ("key") DO UPDATE SET "value" = ?`, [key, value, value]);
return res.sendStatus(200);
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
}
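`setConfig` rejects keys outside an allowlist before touching the database, then upserts via `ON CONFLICT`. A rough in-memory sketch of the same semantics (the `Map` stands in for the `config` table; the helper name is hypothetical):

```typescript
// Server-defined allowlist of configurable keys, as in the route above.
const allowedConfigs = new Set([
    "old-submitter-block-date",
    "max-users-per-minute",
    "max-users-per-minute-dearrow",
]);

// Stand-in for: INSERT INTO "config" ("key", "value") VALUES(?, ?)
//               ON CONFLICT ("key") DO UPDATE SET "value" = ?
function upsertConfig(store: Map<string, string>, key: string, value: string): boolean {
    if (!allowedConfigs.has(key)) return false; // the route would respond 400 here
    store.set(key, value); // insert-or-overwrite, like the SQL upsert
    return true;
}
```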


@@ -3,8 +3,11 @@ import { Logger } from "../utils/logger";
import { db, privateDB } from "../databases/databases";
import { getHashCache } from "../utils/getHashCache";
import { Request, Response } from "express";
import { isUserBanned } from "../utils/checkBan";
import { HashedUserID } from "../types/user.model";
import { isRequestInvalid } from "../utils/requestValidator";
function logUserNameChange(userID: string, newUserName: string, oldUserName: string, updatedByAdmin: boolean): Promise<Response> {
function logUserNameChange(userID: string, newUserName: string, oldUserName: string, updatedByAdmin: boolean): Promise<void> {
return privateDB.prepare("run",
`INSERT INTO "userNameLogs"("userID", "newUserName", "oldUserName", "updatedByAdmin", "updatedAt") VALUES(?, ?, ?, ?, ?)`,
[userID, newUserName, oldUserName, + updatedByAdmin, new Date().getTime()]
@@ -12,12 +15,12 @@ function logUserNameChange(userID: string, newUserName: string, oldUserName: str
}
export async function setUsername(req: Request, res: Response): Promise<Response> {
let userID = req.query.userID as string;
const userIDInput = req.query.userID as string;
const adminUserIDInput = req.query.adminUserID as string | undefined;
let userName = req.query.username as string;
let hashedUserID: HashedUserID;
let adminUserIDInput = req.query.adminUserID as string;
if (userID == undefined || userName == undefined || userID === "undefined" || userName.length > 64) {
if (userIDInput == undefined || userName == undefined || userIDInput === "undefined" || userName.length > 64) {
//invalid request
return res.sendStatus(400);
}
@@ -32,33 +35,41 @@ export async function setUsername(req: Request, res: Response): Promise<Response
// eslint-disable-next-line no-control-regex
userName = userName.replace(/[\u0000-\u001F\u007F-\u009F]/g, "");
// check privateID against publicID
if (!await checkPrivateUsername(userName, userID)) {
return res.sendStatus(400);
}
if (adminUserIDInput != undefined) {
//this is the admin controlling the other user's account, don't hash the controlling account's ID
adminUserIDInput = await getHashCache(adminUserIDInput);
if (adminUserIDInput != config.adminUserID) {
//they aren't the admin
return res.sendStatus(403);
}
} else {
//hash the userID
userID = await getHashCache(userID);
if (isRequestInvalid({
userAgentHeader: req.headers["user-agent"],
userID: adminUserIDInput ?? userIDInput,
newUsername: userName,
endpoint: "setUsername",
})) {
Logger.warn(`Username change rejected by request validator: ${userName} ${req.headers["user-agent"]}`);
return res.sendStatus(200);
}
try {
const row = await db.prepare("get", `SELECT count(*) as "userCount" FROM "userNames" WHERE "userID" = ? AND "locked" = 1`, [userID]);
if (adminUserIDInput === undefined && row.userCount > 0) {
return res.sendStatus(200);
}
if (adminUserIDInput != undefined) {
//this is the admin controlling the other user's account, don't hash the controlling account's ID
hashedUserID = userIDInput as HashedUserID;
const shadowBanRow = await db.prepare("get", `SELECT count(*) as "userCount" FROM "shadowBannedUsers" WHERE "userID" = ? LIMIT 1`, [userID]);
if (adminUserIDInput === undefined && shadowBanRow.userCount > 0) {
return res.sendStatus(200);
if (await getHashCache(adminUserIDInput) != config.adminUserID) {
//they aren't the admin
return res.sendStatus(403);
}
} else {
// check privateID against publicID
if (!await checkPrivateUsername(userName, userIDInput)) {
return res.sendStatus(400);
}
//hash the userID
hashedUserID = await getHashCache(userIDInput) as HashedUserID;
const row = await db.prepare("get", `SELECT count(*) as "userCount" FROM "userNames" WHERE "userID" = ? AND "locked" = 1`, [hashedUserID]);
if (row.userCount > 0) {
return res.sendStatus(200);
}
if (await isUserBanned(hashedUserID)) {
return res.sendStatus(200);
}
}
}
catch (error) /* istanbul ignore next */ {
@@ -68,24 +79,26 @@ export async function setUsername(req: Request, res: Response): Promise<Response
try {
//check if username is already set
const row = await db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ? LIMIT 1`, [userID]);
const row = await db.prepare("get", `SELECT "userName" FROM "userNames" WHERE "userID" = ? LIMIT 1`, [hashedUserID]);
const locked = adminUserIDInput === undefined ? 0 : 1;
let oldUserName = "";
if (row?.userName !== undefined) {
//already exists, update this row
oldUserName = row.userName;
if (userName == userID && !locked) {
await db.prepare("run", `DELETE FROM "userNames" WHERE "userID" = ?`, [userID]);
if (userName == hashedUserID && !locked) {
await db.prepare("run", `DELETE FROM "userNames" WHERE "userID" = ?`, [hashedUserID]);
} else {
await db.prepare("run", `UPDATE "userNames" SET "userName" = ?, "locked" = ? WHERE "userID" = ?`, [userName, locked, userID]);
await db.prepare("run", `UPDATE "userNames" SET "userName" = ?, "locked" = ? WHERE "userID" = ?`, [userName, locked, hashedUserID]);
}
} else if (userName === hashedUserID) {
return res.sendStatus(200);
} else {
//add to the db
await db.prepare("run", `INSERT INTO "userNames"("userID", "userName", "locked") VALUES(?, ?, ?)`, [userID, userName, locked]);
await db.prepare("run", `INSERT INTO "userNames"("userID", "userName", "locked") VALUES(?, ?, ?)`, [hashedUserID, userName, locked]);
}
await logUserNameChange(userID, userName, oldUserName, adminUserIDInput !== undefined);
await logUserNameChange(hashedUserID, userName, oldUserName, adminUserIDInput !== undefined);
return res.sendStatus(200);
} catch (err) /* istanbul ignore next */ {
@@ -101,4 +114,4 @@ async function checkPrivateUsername(username: string, userID: string): Promise<b
const userNameRow = await db.prepare("get", `SELECT "userID" FROM "userNames" WHERE "userID" = ? LIMIT 1`, [userNameHash]);
if (userNameRow?.userID) return false;
return true;
}
}
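The sanitization step in `setUsername` strips ASCII control characters (the C0 range, DEL, and the C1 range) from the requested username before any further checks. In isolation:

```typescript
// Removes C0 (U+0000–U+001F), DEL (U+007F) and C1 (U+0080–U+009F) control characters,
// matching the regex used in setUsername. Printable non-ASCII characters are kept.
// eslint-disable-next-line no-control-regex
const stripControlChars = (userName: string): string =>
    userName.replace(/[\u0000-\u001F\u007F-\u009F]/g, "");
```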


@@ -1,16 +1,16 @@
import { db, privateDB } from "../databases/databases";
import { db } from "../databases/databases";
import { getHashCache } from "../utils/getHashCache";
import { Request, Response } from "express";
import { config } from "../config";
import { Category, HashedIP, Service, VideoID, VideoIDHash } from "../types/segments.model";
import { Category, DeArrowType, Service, VideoID, VideoIDHash } from "../types/segments.model";
import { UserID } from "../types/user.model";
import { QueryCacher } from "../utils/queryCacher";
import { isUserVIP } from "../utils/isUserVIP";
import { parseCategories } from "../utils/parseParams";
import { parseCategories, parseDeArrowTypes } from "../utils/parseParams";
import { Logger } from "../utils/logger";
export async function shadowBanUser(req: Request, res: Response): Promise<Response> {
const userID = req.query.userID as UserID;
const hashedIP = req.query.hashedIP as HashedIP;
const adminUserIDInput = req.query.adminUserID as UserID;
const type = Number.parseInt(req.query.type as string ?? "1");
if (isNaN(type)) {
@@ -20,60 +20,41 @@ export async function shadowBanUser(req: Request, res: Response): Promise<Respon
const enabled = req.query.enabled === undefined
? true
: req.query.enabled === "true";
const lookForIPs = req.query.lookForIPs === "true";
const banUsers = req.query.banUsers === undefined
? true
: req.query.banUsers === "true";
//if enabled is false and the old submissions should be made visible again
const unHideOldSubmissions = req.query.unHideOldSubmissions !== "false";
const categories: Category[] = parseCategories(req, config.categoryList as Category[]);
const deArrowTypes: DeArrowType[] = parseDeArrowTypes(req, config.deArrowTypes);
if (adminUserIDInput == undefined || (userID == undefined && hashedIP == undefined || type <= 0)) {
if (adminUserIDInput == undefined || (userID == undefined || type <= 0)) {
//invalid request
return res.sendStatus(400);
}
//hash the userID
const adminUserID = await getHashCache(adminUserIDInput);
try {
//hash the userID
const adminUserID = await getHashCache(adminUserIDInput);
const isVIP = await isUserVIP(adminUserID);
if (!isVIP) {
//not authorized
return res.sendStatus(403);
}
if (userID) {
const result = await banUser(userID, enabled, unHideOldSubmissions, type, categories);
if (enabled && lookForIPs) {
const ipLoggingFixedTime = 1675295716000;
const timeSubmitted = (await db.prepare("all", `SELECT "timeSubmitted" FROM "sponsorTimes" WHERE "timeSubmitted" > ? AND "userID" = ?`, [ipLoggingFixedTime, userID])) as { timeSubmitted: number }[];
const ips = (await Promise.all(timeSubmitted.map((s) => {
return privateDB.prepare("all", `SELECT "hashedIP" FROM "sponsorTimes" WHERE "timeSubmitted" = ?`, [s.timeSubmitted]) as Promise<{ hashedIP: HashedIP }[]>;
}))).flat();
await Promise.all([...new Set(ips.map((ip) => ip.hashedIP))].map((ip) => {
return banIP(ip, enabled, unHideOldSubmissions, type, categories, true);
}));
const isVIP = await isUserVIP(adminUserID);
if (!isVIP) {
//not authorized
return res.sendStatus(403);
}
if (result) {
res.sendStatus(result);
return;
}
} else if (hashedIP) {
const result = await banIP(hashedIP, enabled, unHideOldSubmissions, type, categories, banUsers);
const result = await banUser(userID, enabled, unHideOldSubmissions, type, categories, deArrowTypes);
if (result) {
res.sendStatus(result);
return;
}
return res.sendStatus(200);
} catch (e) {
Logger.error(e as string);
return res.sendStatus(500);
}
return res.sendStatus(200);
}
export async function banUser(userID: UserID, enabled: boolean, unHideOldSubmissions: boolean, type: number, categories: Category[]): Promise<number> {
export async function banUser(userID: UserID, enabled: boolean, unHideOldSubmissions: boolean,
type: number, categories: Category[], deArrowTypes: DeArrowType[]): Promise<number> {
//check to see if this user is already shadowbanned
const row = await db.prepare("get", `SELECT count(*) as "userCount" FROM "shadowBannedUsers" WHERE "userID" = ?`, [userID]);
@@ -85,12 +66,12 @@ export async function banUser(userID: UserID, enabled: boolean, unHideOldSubmiss
//find all previous submissions and hide them
if (unHideOldSubmissions) {
await unHideSubmissionsByUser(categories, userID, type);
await unHideSubmissionsByUser(categories, deArrowTypes, userID, type);
}
} else if (enabled && row.userCount > 0) {
// apply unHideOldSubmissions if applicable
if (unHideOldSubmissions) {
await unHideSubmissionsByUser(categories, userID, type);
await unHideSubmissionsByUser(categories, deArrowTypes, userID, type);
} else {
// otherwise ban already exists, send 409
return 409;
@@ -98,7 +79,7 @@ export async function banUser(userID: UserID, enabled: boolean, unHideOldSubmiss
} else if (!enabled && row.userCount > 0) {
//find all previous submissions and unhide them
if (unHideOldSubmissions) {
await unHideSubmissionsByUser(categories, userID, 0);
await unHideSubmissionsByUser(categories, deArrowTypes, userID, 0);
}
//remove them from the shadow ban list
@@ -107,75 +88,40 @@ export async function banUser(userID: UserID, enabled: boolean, unHideOldSubmiss
// already not shadowbanned
return 400;
}
return 200;
}
export async function banIP(hashedIP: HashedIP, enabled: boolean, unHideOldSubmissions: boolean, type: number, categories: Category[], banUsers: boolean): Promise<number> {
//check to see if this user is already shadowbanned
const row = await db.prepare("get", `SELECT count(*) as "userCount" FROM "shadowBannedIPs" WHERE "hashedIP" = ?`, [hashedIP]);
async function unHideSubmissionsByUser(categories: string[], deArrowTypes: DeArrowType[],
userID: UserID, type = 1) {
if (enabled) {
if (row.userCount == 0) {
await db.prepare("run", `INSERT INTO "shadowBannedIPs" VALUES(?)`, [hashedIP]);
}
//find all previous submissions and hide them
if (unHideOldSubmissions) {
const users = await unHideSubmissionsByIP(categories, hashedIP, type);
if (banUsers) {
await Promise.all([...users].map((user) => {
return banUser(user, enabled, unHideOldSubmissions, type, categories);
}));
}
} else if (row.userCount > 0) {
// Nothing to do, and already added
return 409;
}
} else if (!enabled) {
if (row.userCount > 0) {
//remove them from the shadow ban list
await db.prepare("run", `DELETE FROM "shadowBannedIPs" WHERE "hashedIP" = ?`, [hashedIP]);
}
//find all previous submissions and unhide them
if (unHideOldSubmissions) {
await unHideSubmissionsByIP(categories, hashedIP, 0);
}
if (categories.length) {
await db.prepare("run", `UPDATE "sponsorTimes" SET "shadowHidden" = '${type}' WHERE "userID" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})
AND NOT EXISTS ( SELECT "videoID", "category" FROM "lockCategories" WHERE
"sponsorTimes"."videoID" = "lockCategories"."videoID" AND "sponsorTimes"."service" = "lockCategories"."service" AND "sponsorTimes"."category" = "lockCategories"."category")`, [userID]);
}
return 200;
}
async function unHideSubmissionsByUser(categories: string[], userID: UserID, type = 1) {
await db.prepare("run", `UPDATE "sponsorTimes" SET "shadowHidden" = '${type}' WHERE "userID" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})
AND NOT EXISTS ( SELECT "videoID", "category" FROM "lockCategories" WHERE
"sponsorTimes"."videoID" = "lockCategories"."videoID" AND "sponsorTimes"."service" = "lockCategories"."service" AND "sponsorTimes"."category" = "lockCategories"."category")`, [userID]);
// clear cache for all old videos
(await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service", "votes", "views" FROM "sponsorTimes" WHERE "userID" = ?`, [userID]))
(await db.prepare("all", `SELECT "category", "videoID", "hashedVideoID", "service", "userID" FROM "sponsorTimes" WHERE "userID" = ?`, [userID]))
.forEach((videoInfo: { category: Category; videoID: VideoID; hashedVideoID: VideoIDHash; service: Service; userID: UserID; }) => {
QueryCacher.clearSegmentCache(videoInfo);
});
}
async function unHideSubmissionsByIP(categories: string[], hashedIP: HashedIP, type = 1): Promise<Set<UserID>> {
const submissions = await privateDB.prepare("all", `SELECT "timeSubmitted" FROM "sponsorTimes" WHERE "hashedIP" = ?`, [hashedIP]) as { timeSubmitted: number }[];
if (deArrowTypes.includes("title")) {
await db.prepare("run", `UPDATE "titleVotes" as tv SET "shadowHidden" = ${type} FROM "titles" t WHERE tv."UUID" = t."UUID" AND t."userID" = ?`,
[userID]);
}
const users: Set<UserID> = new Set();
await Promise.all(submissions.map(async (submission) => {
(await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service", "votes", "views", "userID" FROM "sponsorTimes" WHERE "timeSubmitted" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})`, [submission.timeSubmitted]))
.forEach((videoInfo: { category: Category, videoID: VideoID, hashedVideoID: VideoIDHash, service: Service, userID: UserID }) => {
QueryCacher.clearSegmentCache(videoInfo);
users.add(videoInfo.userID);
}
);
if (deArrowTypes.includes("thumbnail")) {
await db.prepare("run", `UPDATE "thumbnailVotes" as tv SET "shadowHidden" = ${type} FROM "thumbnails" t WHERE tv."UUID" = t."UUID" AND t."userID" = ?`,
[userID]);
}
await db.prepare("run", `UPDATE "sponsorTimes" SET "shadowHidden" = ${type} WHERE "timeSubmitted" = ? AND "category" in (${categories.map((c) => `'${c}'`).join(",")})
AND NOT EXISTS ( SELECT "videoID", "category" FROM "lockCategories" WHERE
"sponsorTimes"."videoID" = "lockCategories"."videoID" AND "sponsorTimes"."service" = "lockCategories"."service" AND "sponsorTimes"."category" = "lockCategories"."category")`, [submission.timeSubmitted]);
}));
return users;
}
(await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service" FROM "titles" WHERE "userID" = ?`, [userID]))
.forEach((videoInfo: { videoID: VideoID; hashedVideoID: VideoIDHash; service: Service; }) => {
QueryCacher.clearBrandingCache(videoInfo);
});
(await db.prepare("all", `SELECT "videoID", "hashedVideoID", "service" FROM "thumbnails" WHERE "userID" = ?`, [userID]))
.forEach((videoInfo: { videoID: VideoID; hashedVideoID: VideoIDHash; service: Service; }) => {
QueryCacher.clearBrandingCache(videoInfo);
});
}
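The hide/unhide queries above interpolate the category list directly into the SQL string, which is safe only because the categories come from the server's own `config.categoryList` rather than user input. A sketch of the clause builder (the helper name is hypothetical; the real code inlines this expression):

```typescript
// Builds the `"category" in ('a','b')` fragment used by unHideSubmissionsByUser.
// Safe only for trusted, server-defined category names; never use with raw user input,
// since the values are quoted by string concatenation, not bound parameters.
function buildCategoryInClause(categories: string[]): string {
    return `"category" in (${categories.map((c) => `'${c}'`).join(",")})`;
}
```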


@@ -4,6 +4,7 @@ import { config } from "../config";
import { privateDB } from "../databases/databases";
import { Logger } from "../utils/logger";
import { getPatreonIdentity, PatronStatus, refreshToken, TokenType } from "../utils/tokenUtils";
import { getHash } from "../utils/getHash";
interface VerifyTokenRequest extends Request {
query: {
@@ -12,55 +13,74 @@ interface VerifyTokenRequest extends Request {
}
export const validateLicenseKeyRegex = (token: string) =>
new RegExp(/[A-Za-z0-9]{40}|[A-Za-z0-9-]{35}/).test(token);
new RegExp(/[A-Za-z0-9]{40}|[A-Za-z0-9-]{35}|[A-Za-z0-9-]{5}-[A-Za-z0-9-]{5}/).test(token);
const isLocalLicenseKey = (token: string) => /[A-Za-z0-9]{5}-[A-Za-z0-9]{5}/.test(token);
export async function verifyTokenRequest(req: VerifyTokenRequest, res: Response): Promise<Response> {
const { query: { licenseKey } } = req;
if (!licenseKey) {
return res.status(400).send("Invalid request");
} else if (!validateLicenseKeyRegex(licenseKey)) {
// fast check for invalid license key
return res.status(200).send({
allowed: false
});
}
const tokens = (await privateDB.prepare("get", `SELECT "accessToken", "refreshToken", "expiresIn" from "oauthLicenseKeys" WHERE "licenseKey" = ?`
, [licenseKey])) as {accessToken: string, refreshToken: string, expiresIn: number};
if (tokens) {
const identity = await getPatreonIdentity(tokens.accessToken);
if (tokens.expiresIn < 15 * 24 * 60 * 60) {
refreshToken(TokenType.patreon, licenseKey, tokens.refreshToken).catch((e) => Logger.error(`refresh token: ${e}`));
try {
if (!licenseKey) {
return res.status(400).send("Invalid request");
} else if (!validateLicenseKeyRegex(licenseKey)) {
// fast check for invalid license key
return res.status(200).send({
allowed: false
});
}
/* istanbul ignore else */
if (identity) {
const membership = identity.included?.[0]?.attributes;
const allowed = !!membership && ((membership.patron_status === PatronStatus.active && membership.currently_entitled_amount_cents > 0)
|| (membership.patron_status === PatronStatus.former && membership.campaign_lifetime_support_cents > 300));
if (isLocalLicenseKey(licenseKey) && !licenseKey.startsWith("P")) {
const parts = licenseKey.split("-");
const code = parts[0];
const givenResult = parts[1];
return res.status(200).send({
allowed
});
if (getHash(config.tokenSeed + code, 1).startsWith(givenResult)) {
return res.status(200).send({
allowed: true
});
}
}
const tokens = (await privateDB.prepare("get", `SELECT "accessToken", "refreshToken", "expiresIn" from "oauthLicenseKeys" WHERE "licenseKey" = ?`
, [licenseKey])) as {accessToken: string, refreshToken: string, expiresIn: number};
if (tokens) {
const identity = await getPatreonIdentity(tokens.accessToken);
if (tokens.expiresIn < 15 * 24 * 60 * 60) {
refreshToken(TokenType.patreon, licenseKey, tokens.refreshToken).catch((e) => Logger.error(`refresh token: ${e}`));
}
/* istanbul ignore else */
if (identity) {
const membership = identity.included?.[0]?.attributes;
const allowed = !!membership && ((membership.patron_status === PatronStatus.active && membership.currently_entitled_amount_cents > 0)
|| (membership.patron_status === PatronStatus.former && membership.campaign_lifetime_support_cents > 300));
return res.status(200).send({
allowed
});
} else {
return res.status(500);
}
} else {
return res.status(500);
}
} else {
// Check Local
const result = await privateDB.prepare("get", `SELECT "licenseKey" from "licenseKeys" WHERE "licenseKey" = ?`, [licenseKey]);
if (result) {
return res.status(200).send({
allowed: true
});
} else {
// Gumroad
return res.status(200).send({
allowed: await checkAllGumroadProducts(licenseKey)
});
}
// Check Local
const result = await privateDB.prepare("get", `SELECT "licenseKey" from "licenseKeys" WHERE "licenseKey" = ?`, [licenseKey]);
if (result) {
return res.status(200).send({
allowed: true
});
} else {
// Gumroad
return res.status(200).send({
allowed: await checkAllGumroadProducts(licenseKey)
});
}
}
} catch (e) {
Logger.error(e as string);
return res.status(500);
}
}
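The new branch accepts short local keys of the form `code-result`, where `result` must be a prefix of a seeded hash of `code`. A standalone sketch of both checks, using Node's SHA-256 as a stand-in for the project's `getHash` (the real hash function and its iteration count may differ):

```typescript
import { createHash } from "node:crypto";

// Matches 40-char tokens, 35-char tokens, or the new short local `xxxxx-xxxxx` form.
const validateLicenseKeyRegex = (token: string): boolean =>
    /[A-Za-z0-9]{40}|[A-Za-z0-9-]{35}|[A-Za-z0-9-]{5}-[A-Za-z0-9-]{5}/.test(token);

// Sketch of the local-key check: hash(seed + code) must start with the given result part.
// SHA-256 here is an assumption; the server's getHash may use a different algorithm.
function isValidLocalKey(licenseKey: string, tokenSeed: string): boolean {
    const [code, givenResult] = licenseKey.split("-");
    if (!code || !givenResult) return false;
    const hash = createHash("sha256").update(tokenSeed + code).digest("hex");
    return hash.startsWith(givenResult);
}
```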


@@ -3,14 +3,18 @@ import { Request, Response } from "express";
export async function viewedVideoSponsorTime(req: Request, res: Response): Promise<Response> {
const UUID = req.query?.UUID;
const videoID = req.query?.videoID;
if (!UUID) {
//invalid request
return res.sendStatus(400);
}
//up the view count by one
await db.prepare("run", `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" = ?`, [UUID]);
if (!videoID) {
await db.prepare("run", `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" = ?`, [UUID]);
} else {
await db.prepare("run", `UPDATE "sponsorTimes" SET views = views + 1 WHERE "UUID" LIKE ? AND "videoID" = ?`, [`${UUID}%`, videoID]);
}
return res.sendStatus(200);
}
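The new branch counts a view even when the client sends only a prefix of the UUID (`"UUID" LIKE ?` bound to `${UUID}%`), with the `videoID` guard preventing cross-video prefix collisions. For UUIDs, which contain no SQL wildcard characters, the `LIKE` prefix match is equivalent to a plain `startsWith` check (the helper and row shape below are illustrative; the real matching happens in SQL):

```typescript
// Equivalent of: "UUID" LIKE 'submitted%' AND "videoID" = ?
function matchesTrimmedUUID(
    row: { UUID: string; videoID: string },
    submittedUUID: string,
    videoID: string,
): boolean {
    return row.UUID.startsWith(submittedUUID) && row.videoID === videoID;
}
```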


@@ -2,19 +2,21 @@ import { Request, Response } from "express";
import { Logger } from "../utils/logger";
import { isUserVIP } from "../utils/isUserVIP";
import { isUserTempVIP } from "../utils/isUserTempVIP";
import { getMaxResThumbnail, YouTubeAPI } from "../utils/youtubeApi";
import { getMaxResThumbnail } from "../utils/youtubeApi";
import { db, privateDB } from "../databases/databases";
import { dispatchEvent, getVoteAuthor, getVoteAuthorRaw } from "../utils/webhookUtils";
import { getFormattedTime } from "../utils/getFormattedTime";
import { getIP } from "../utils/getIP";
import { getHashCache } from "../utils/getHashCache";
import { config } from "../config";
import { UserID } from "../types/user.model";
import { HashedUserID, UserID } from "../types/user.model";
import { DBSegment, Category, HashedIP, IPAddress, SegmentUUID, Service, VideoID, VideoIDHash, VideoDuration, ActionType, VoteType } from "../types/segments.model";
import { QueryCacher } from "../utils/queryCacher";
import axios from "axios";
import { getVideoDetails, videoDetails } from "../utils/getVideoDetails";
import { deleteLockCategories } from "./deleteLockCategories";
import { acquireLock } from "../utils/redisLock";
import { checkBanStatus } from "../utils/checkBan";
const voteTypes = {
normal: 0,
@@ -126,88 +128,85 @@ async function sendWebhooks(voteData: VoteData) {
webhookURL = config.discordCompletelyIncorrectReportWebhookURL;
}
if (config.newLeafURLs !== null) {
const videoID = submissionInfoRow.videoID;
const data = await getVideoDetails(videoID);
const isUpvote = voteData.incrementAmount > 0;
// Send custom webhooks
dispatchEvent(isUpvote ? "vote.up" : "vote.down", {
"user": {
"status": getVoteAuthorRaw(userSubmissionCountRow.submissionCount, voteData.isTempVIP, voteData.isVIP, voteData.isOwnSubmission),
},
"video": {
"id": submissionInfoRow.videoID,
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${videoID}`,
"thumbnail": getMaxResThumbnail(videoID),
},
"submission": {
"UUID": voteData.UUID,
"views": voteData.row.views,
"category": voteData.category,
"startTime": submissionInfoRow.startTime,
"endTime": submissionInfoRow.endTime,
"user": {
"UUID": submissionInfoRow.userID,
"username": submissionInfoRow.userName,
"submissions": {
"total": submissionInfoRow.count,
"ignored": submissionInfoRow.disregarded,
},
},
},
"votes": {
"before": voteData.row.votes,
"after": (voteData.row.votes + voteData.incrementAmount - voteData.oldIncrementAmount),
},
});
// Send discord message
if (webhookURL !== null && !isUpvote) {
axios.post(webhookURL, {
"embeds": [{
"title": data?.title,
"url": `https://www.youtube.com/watch?v=${submissionInfoRow.videoID}&t=${(submissionInfoRow.startTime.toFixed(0) - 2)}s#requiredSegment=${voteData.UUID}`,
"description": `**${voteData.row.votes} Votes Prior | \
${(voteData.row.votes + voteData.incrementAmount - voteData.oldIncrementAmount)} Votes Now | ${voteData.row.views} \
Views**\n\n**Locked**: ${voteData.row.locked}\n\n**Submission ID:** ${voteData.UUID}\
\n**Category:** ${submissionInfoRow.category}\
\n\n**Submitted by:** ${submissionInfoRow.userName}\n${submissionInfoRow.userID}\
\n\n**Total User Submissions:** ${submissionInfoRow.count}\
\n**Ignored User Submissions:** ${submissionInfoRow.disregarded}\
\n\n**Timestamp:** \
${getFormattedTime(submissionInfoRow.startTime)} to ${getFormattedTime(submissionInfoRow.endTime)}`,
"color": 10813440,
"author": {
"name": voteData.finalResponse?.webhookMessage ??
voteData.finalResponse?.finalMessage ??
`${getVoteAuthor(userSubmissionCountRow.submissionCount, voteData.isTempVIP, voteData.isVIP, voteData.isOwnSubmission)}${voteData.row.locked ? " (Locked)" : ""}`,
},
"thumbnail": {
"url": getMaxResThumbnail(videoID),
},
}],
})
.then(res => {
if (res.status >= 400) {
Logger.error("Error sending reported submission Discord hook");
Logger.error(JSON.stringify((res.data)));
Logger.error("\n");
}
})
.catch(err => {
Logger.error("Failed to send reported submission Discord hook.");
Logger.error(JSON.stringify(err));
Logger.error("\n");
});
}
}
}
async function categoryVote(UUID: SegmentUUID, userID: HashedUserID, isVIP: boolean, isTempVIP: boolean, isOwnSubmission: boolean, category: Category
, hashedIP: HashedIP, finalResponse: FinalResponse): Promise<{ status: number, message?: string }> {
// Check if they've already made a vote
const usersLastVoteInfo = await privateDB.prepare("get", `select count(*) as votes, category from "categoryVotes" where "UUID" = ? and "userID" = ? group by category`, [UUID, userID], { useReplica: true });
@@ -243,8 +242,7 @@ async function categoryVote(UUID: SegmentUUID, userID: UserID, isVIP: boolean, i
const timeSubmitted = Date.now();
const voteAmount = (isVIP || isTempVIP) ? 500 : 1;
const ableToVote = finalResponse.finalStatus === 200; // ban status checks handled by vote() (caller function)
if (ableToVote) {
// Add the vote
@@ -305,9 +303,10 @@ export async function voteOnSponsorTime(req: Request, res: Response): Promise<Re
const paramUserID = getUserID(req);
const type = req.query.type !== undefined ? parseInt(req.query.type as string) : undefined;
const category = req.query.category as Category;
const videoID = req.query.videoID as VideoID;
const ip = getIP(req);
const result = await vote(ip, UUID, paramUserID, type, videoID, category);
const response = res.status(result.status);
if (result.message) {
@@ -319,7 +318,7 @@ export async function voteOnSponsorTime(req: Request, res: Response): Promise<Re
}
}
export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID, type: number, videoID?: VideoID, category?: Category): Promise<{ status: number, message?: string, json?: unknown }> {
// missing key parameters
if (!UUID || !paramUserID || !(type !== undefined || category)) {
return { status: 400 };
@@ -329,12 +328,28 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
return { status: 200 };
}
if (videoID && UUID.length < 60) {
// Get the full UUID
const segmentInfo: DBSegment = await db.prepare("get", `SELECT "UUID" from "sponsorTimes" WHERE "UUID" LIKE ? AND "videoID" = ?`, [`${UUID}%`, videoID]);
if (segmentInfo) {
UUID = segmentInfo.UUID;
}
}
const originalType = type;
//hash the userID
const nonAnonUserID = await getHashCache(paramUserID);
const userID = await getHashCache(paramUserID + UUID);
//hash the ip 5000 times so no one can get it from the database
const hashedIP: HashedIP = await getHashCache((ip + config.globalSalt) as IPAddress);
const lock = await acquireLock(`voteOnSponsorTime:${UUID}.${paramUserID}`);
if (!lock.status) {
return { status: 429, message: "Vote already in progress" };
}
// To force a non 200, change this early
const finalResponse: FinalResponse = {
blockVote: false,
@@ -344,42 +359,51 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
webhookMessage: null
};
const segmentInfo: DBSegment = await db.prepare("get", `SELECT * from "sponsorTimes" WHERE "UUID" = ?`, [UUID]);
// segment doesn't exist
if (!segmentInfo) {
lock.unlock();
return { status: 404 };
}
const isTempVIP = await isUserTempVIP(nonAnonUserID, segmentInfo.videoID);
const isVIP = await isUserVIP(nonAnonUserID);
const isBanned = await checkBanStatus(nonAnonUserID, hashedIP); // propagates IP bans
//check if user voting on own submission
const isOwnSubmission = nonAnonUserID === segmentInfo.userID;
// disallow vote types 10/11
if (type === 10 || type === 11) {
lock.unlock();
return { status: 400 };
}
const MILLISECONDS_IN_HOUR = 3600000;
const now = Date.now();
const warning = (await db.prepare("get", `SELECT "reason" FROM warnings WHERE "userID" = ? AND enabled = 1 AND type = 0`,
[nonAnonUserID],
));
if (warning != null) {
const warningReason = warning.reason;
lock.unlock();
return { status: 403, message: "Vote rejected due to a tip from a moderator. This means that we noticed you were making some common mistakes that are not malicious, and we just want to clarify the rules. " +
"Could you please send a message in Discord or Matrix so we can further help you?" +
`${(warningReason.length > 0 ? ` Tip message: '${warningReason}'` : "")}` };
}
// we can return out of the function early if the user is banned after warning checks
// returning before warning checks would make them not appear on vote if the user is also banned
if (isBanned) {
lock.unlock();
return { status: 200 };
}
// no type but has category, categoryVote
if (!type && category) {
const result = categoryVote(UUID, nonAnonUserID, isVIP, isTempVIP, isOwnSubmission, category, hashedIP, finalResponse);
lock.unlock();
return result;
}
// If not upvote, or an upvote on a dead segment (for ActionType.Full)
@@ -399,8 +423,11 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
if (!isNaN(type) && segmentInfo.votes <= -2 && segmentInfo.actionType !== ActionType.Full &&
!(isVIP || isTempVIP || isOwnSubmission)) {
if (type == 1) {
lock.unlock();
return { status: 403, message: "Not allowed to upvote segment with too many downvotes unless you are VIP." };
} else if (type == 0) {
lock.unlock();
// Already downvoted enough, ignore
return { status: 200 };
}
@@ -433,6 +460,8 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
//undo/cancel vote
incrementAmount = 0;
} else {
lock.unlock();
// unrecognised type of vote
return { status: 400 };
}
@@ -470,16 +499,15 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
}
// Only change the database if they have made a submission before and haven't voted recently
// ban status check was handled earlier (w/ early return)
const ableToVote = isVIP || isTempVIP || (
(!(isOwnSubmission && incrementAmount > 0 && oldIncrementAmount >= 0)
&& !(originalType === VoteType.Malicious && segmentInfo.actionType !== ActionType.Chapter)
&& !finalResponse.blockVote
&& finalResponse.finalStatus === 200
&& (await db.prepare("get", `SELECT "userID" FROM "sponsorTimes" WHERE "userID" = ? AND "category" = ? AND "votes" > -2 AND "hidden" = 0 AND "shadowHidden" = 0 LIMIT 1`, [nonAnonUserID, segmentInfo.category], { useReplica: true }) !== undefined)
&& (await privateDB.prepare("get", `SELECT "UUID" FROM "votes" WHERE "UUID" = ? AND "hashedIP" = ? AND "userID" != ?`, [UUID, hashedIP, userID], { useReplica: true })) === undefined)
);
if (ableToVote) {
//update the votes table
@@ -526,8 +554,13 @@ export async function vote(ip: IPAddress, UUID: SegmentUUID, paramUserID: UserID
finalResponse
}).catch((e) => Logger.error(`Sending vote webhook: ${e}`));
}
lock.unlock();
return { status: finalResponse.finalStatus, message: finalResponse.finalMessage ?? undefined };
} catch (err) {
lock.unlock();
Logger.error(err as string);
return { status: 500, message: finalResponse.finalMessage ?? undefined, json: { error: "Internal error creating segment vote" } };
}
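The diff above threads `lock.unlock()` through every exit path of `vote()` after acquiring the per-vote lock. An alternative shape for the same guarantee is a `try`/`finally` with a single release point; a minimal sketch, using a hypothetical in-process lock rather than the server's real `redisLock`:

```typescript
// Sketch of guarding a critical section with an acquired lock and one release
// point, instead of calling unlock() before every return. The lock here is a
// hypothetical in-process stand-in, not the redis-backed lock the server uses.
class SimpleLock {
    private held = new Set<string>();
    acquire(key: string): { status: boolean; unlock: () => void } {
        if (this.held.has(key)) return { status: false, unlock: () => undefined };
        this.held.add(key);
        return { status: true, unlock: () => { this.held.delete(key); } };
    }
}

const locks = new SimpleLock();

function vote(key: string): string {
    const lock = locks.acquire(key);
    if (!lock.status) return "429: vote already in progress";
    try {
        return "200: ok";
    } finally {
        lock.unlock(); // runs on every exit path, including early returns and throws
    }
}

console.log(vote("uuid.user")); // → 200: ok
```

The explicit-unlock style in the diff avoids restructuring the large existing function body, at the cost of having to remember the call on each new early return.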

View File

@@ -1,12 +1,17 @@
import { Category, Service, VideoID, VideoIDHash } from "./segments.model";
import { UserID } from "./user.model";
export type BrandingUUID = string & { readonly __brandingUUID: unique symbol };
export type CasualCategory = ("funny" | "creative" | "clever" | "descriptive" | "other" | "downvote") & { __casualCategoryBrand: unknown };
export interface BrandingDBSubmissionData {
videoID: VideoID,
}
export interface BrandingDBSubmission extends BrandingDBSubmissionData {
shadowHidden: number,
UUID: BrandingUUID,
hashedVideoID: VideoIDHash
}
@@ -14,7 +19,10 @@ export interface TitleDBResult extends BrandingDBSubmission {
title: string,
original: number,
votes: number,
downvotes: number,
locked: number,
verification: number,
userID: UserID
}
export interface TitleResult {
@@ -22,14 +30,17 @@ export interface TitleResult {
original: boolean,
votes: number,
locked: boolean,
UUID: BrandingUUID,
userID?: UserID
}
export interface ThumbnailDBResult extends BrandingDBSubmission {
timestamp?: number,
original: number,
votes: number,
downvotes: number,
locked: number,
userID: UserID
}
export interface ThumbnailResult {
@@ -37,20 +48,33 @@ export interface ThumbnailResult {
original: boolean,
votes: number,
locked: boolean,
UUID: BrandingUUID,
userID?: UserID
}
export interface CasualVote {
id: string,
count: number,
title: string | null
}
export interface BrandingResult {
titles: TitleResult[],
thumbnails: ThumbnailResult[],
casualVotes: CasualVote[],
randomTime: number,
videoDuration: number | null
}
export interface BrandingHashDBResult {
titles: TitleDBResult[];
thumbnails: ThumbnailDBResult[];
segments: BrandingSegmentDBResult[];
casualVotes: CasualVoteDBResult[];
}
export interface OriginalThumbnailSubmission {
timestamp?: undefined | null;
original: true;
}
@@ -72,4 +96,45 @@ export interface BrandingSubmission {
videoID: VideoID;
userID: UserID;
service: Service;
autoLock: boolean | undefined;
downvote: boolean | undefined;
videoDuration: number | undefined;
wasWarned: boolean | undefined;
casualMode: boolean | undefined;
}
export interface CasualVoteSubmission {
videoID: VideoID;
userID: UserID;
service: Service;
downvote: boolean | undefined;
categories: CasualCategory[];
title?: string;
}
export interface BrandingSegmentDBResult {
startTime: number;
endTime: number;
category: Category;
videoDuration: number;
}
export interface CasualVoteDBResult {
category: CasualCategory;
upvotes: number;
downvotes: number;
title?: string;
}
export interface BrandingSegmentHashDBResult extends BrandingDBSubmissionData {
startTime: number;
endTime: number;
category: Category;
videoDuration: number;
}
export interface CasualVoteHashDBResult extends BrandingDBSubmissionData {
category: CasualCategory;
upvotes: number;
downvotes: number;
}
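The models above lean on branded types (`string & { readonly __brandingUUID: unique symbol }`) so that arbitrary strings cannot silently flow into UUID-typed parameters. A minimal standalone illustration of the branding pattern:

```typescript
// Branded string: structurally still a string at runtime, but requiring an
// explicit cast to create, so plain strings fail type-checking where a
// BrandingUUID is expected.
type BrandingUUID = string & { readonly __brandingUUID: unique symbol };

function asBrandingUUID(raw: string): BrandingUUID {
    // The brand exists only at compile time; at runtime this is a plain string.
    return raw as BrandingUUID;
}

function describe(uuid: BrandingUUID): string {
    return `submission ${uuid}`;
}

console.log(describe(asBrandingUUID("abc123"))); // → submission abc123
```

Because the brand member is a `unique symbol`, two differently-branded ID types (e.g. `BrandingUUID` vs `SegmentUUID`) stay mutually incompatible even though both are strings underneath.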


@@ -1,5 +1,6 @@
import { PoolConfig } from "pg";
import * as redis from "redis";
import { DeArrowType } from "./segments.model";
interface RedisConfig extends redis.RedisClientOptions {
enabled: boolean;
@@ -9,7 +10,11 @@ interface RedisConfig extends redis.RedisClientOptions {
maxWriteConnections: number;
stopWritingAfterResponseTime: number;
responseTimePause: number;
maxReadResponseTime: number;
disableHashCache: boolean;
clientCacheSize: number;
useCompression: boolean;
dragonflyMode: boolean;
}
interface RedisReadOnlyConfig extends redis.RedisClientOptions {
@@ -26,6 +31,7 @@ export interface CustomWritePostgresConfig extends CustomPostgresConfig {
maxActiveRequests: number;
timeout: number;
highLoadThreshold: number;
redisTimeoutThreshold: number;
}
export interface CustomPostgresReadOnlyConfig extends CustomPostgresConfig {
@@ -35,6 +41,35 @@ export interface CustomPostgresReadOnlyConfig extends CustomPostgresConfig {
stopRetryThreshold: number;
}
export type ValidatorPattern = string | [string, string];
export interface RequestValidatorRule {
ruleName?: string;
// mostly universal
userAgent?: ValidatorPattern;
userAgentHeader?: ValidatorPattern;
videoDuration?: ValidatorPattern;
videoID?: ValidatorPattern;
userID?: ValidatorPattern;
service?: ValidatorPattern;
endpoint?: ValidatorPattern;
// sb postSkipSegments
startTime?: ValidatorPattern;
endTime?: ValidatorPattern;
category?: ValidatorPattern;
actionType?: ValidatorPattern;
description?: ValidatorPattern;
// dearrow postBranding
title?: ValidatorPattern;
titleOriginal?: boolean;
thumbnailTimestamp?: ValidatorPattern;
thumbnailOriginal?: boolean;
dearrowDownvote?: boolean;
// postCasual
casualCategory?: ValidatorPattern;
// setUsername
newUsername?: ValidatorPattern;
}
export interface SBSConfig {
[index: string]: any
port: number;
@@ -47,8 +82,12 @@ export interface SBSConfig {
discordFirstTimeSubmissionsWebhookURL?: string;
discordCompletelyIncorrectReportWebhookURL?: string;
discordMaliciousReportWebhookURL?: string;
discordDeArrowLockedWebhookURL?: string,
discordDeArrowWarnedWebhookURL?: string,
discordNewUserWebhookURL?: string;
neuralBlockURL?: string;
discordNeuralBlockRejectWebhookURL?: string;
discordRejectedNewUserWebhookURL?: string;
minReputationToSubmitChapter: number;
minReputationToSubmitFiller: number;
userCounterURL?: string;
@@ -65,16 +104,16 @@ export interface SBSConfig {
readOnly: boolean;
webhooks: WebhookConfig[];
categoryList: string[];
casualCategoryList: string[];
deArrowTypes: DeArrowType[];
categorySupport: Record<string, string[]>;
maxTitleLength: number;
getTopUsersCacheTimeMinutes: number;
maxNumberOfActiveWarnings: number;
hoursAfterWarningExpires: number;
rateLimit: {
vote: RateLimitConfig;
view: RateLimitConfig;
};
mysql?: any;
privateMysql?: any;
requestValidatorRules: RequestValidatorRule[];
minimumPrefix?: string;
maximumPrefix?: string;
redis?: RedisConfig;
@@ -83,6 +122,7 @@ export interface SBSConfig {
maxRewardTimePerSegmentInSeconds?: number;
postgres?: CustomWritePostgresConfig;
postgresReadOnly?: CustomPostgresReadOnlyConfig;
postgresPrivateMax?: number;
dumpDatabase?: DumpDatabase;
diskCacheURL: string;
crons: CronJobOptions;
@@ -95,7 +135,20 @@ export interface SBSConfig {
gumroad: {
productPermalinks: string[],
},
tokenSeed: string,
minUserIDLength: number,
deArrowPaywall: boolean,
useCacheForSegmentGroups: boolean
maxConnections: number;
maxResponseTime: number;
maxResponseTimeWhileLoadingCache: number;
etagExpiry: number;
youTubeKeys: {
visitorData: string | null;
poToken: string | null;
floatieUrl: string | null;
floatieAuth: string | null;
}
}
export interface WebhookConfig {
@@ -145,4 +198,4 @@ export interface CronJobOptions {
export interface DownvoteSegmentArchiveCron {
voteThreshold: number;
timeThresholdInDays: number;
}


@@ -6,6 +6,7 @@ export type SegmentUUID = string & { __segmentUUIDBrand: unknown };
export type VideoID = string & { __videoIDBrand: unknown };
export type VideoDuration = number & { __videoDurationBrand: unknown };
export type Category = ("sponsor" | "selfpromo" | "interaction" | "intro" | "outro" | "preview" | "music_offtopic" | "poi_highlight" | "chapter" | "filler" | "exclusive_access") & { __categoryBrand: unknown };
export type DeArrowType = "title" | "thumbnail";
export type VideoIDHash = VideoID & HashedValue;
export type IPAddress = string & { __ipAddressBrand: unknown };
export type HashedIP = IPAddress & HashedValue;
@@ -21,7 +22,8 @@ export enum ActionType {
// Uncomment as needed
export enum Service {
YouTube = "YouTube",
Spotify = "Spotify",
PeerTube = "PeerTube"
// Twitch = 'Twitch',
// Nebula = 'Nebula',
// RSS = 'RSS',
@@ -102,7 +104,7 @@ export interface VideoData {
}
export interface SegmentCache {
shadowHiddenSegmentIPs: SBRecord<VideoID, SBRecord<string, Promise<{hashedIP: HashedIP}[] | null>>>,
userHashedIP?: HashedIP
userHashedIPPromise?: Promise<HashedIP>;
}


@@ -5,5 +5,6 @@ export type HashedUserID = UserID & HashedValue;
export enum Feature {
ChapterSubmitter = 0,
FillerSubmitter = 1,
DeArrowTitleSubmitter = 2,
}


@@ -0,0 +1,4 @@
export enum WarningType {
SponsorBlock = 0,
DeArrow = 1
}

src/utils/checkBan.ts

@@ -0,0 +1,26 @@
import { HashedUserID } from "../types/user.model";
import { db } from "../databases/databases";
import { Category, HashedIP } from "../types/segments.model";
import { banUser } from "../routes/shadowBanUser";
import { config } from "../config";
import { Logger } from "./logger";
export async function isUserBanned(userID: HashedUserID): Promise<boolean> {
return (await db.prepare("get", `SELECT 1 FROM "shadowBannedUsers" WHERE "userID" = ? LIMIT 1`, [userID], { useReplica: true })) !== undefined;
}
export async function isIPBanned(ip: HashedIP): Promise<boolean> {
return (await db.prepare("get", `SELECT 1 FROM "shadowBannedIPs" WHERE "hashedIP" = ? LIMIT 1`, [ip], { useReplica: true })) !== undefined;
}
// NOTE: this function will propagate IP bans
export async function checkBanStatus(userID: HashedUserID, ip: HashedIP): Promise<boolean> {
const [userBanStatus, ipBanStatus] = await Promise.all([isUserBanned(userID), isIPBanned(ip)]);
if (!userBanStatus && ipBanStatus) {
// Make sure the whole user is banned
banUser(userID, true, true, 1, config.categoryList as Category[], config.deArrowTypes)
.catch((e) => Logger.error(`Error banning user after submitting from a banned IP: ${e}`));
}
return userBanStatus || ipBanStatus;
}
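`checkBanStatus` above runs both lookups concurrently and treats either hit as a ban, additionally propagating IP bans onto the user ID. A minimal sketch of the concurrent-check pattern, with hypothetical in-memory stand-ins for the real database queries:

```typescript
// In-memory stand-ins for the database-backed lookups (assumption, for illustration).
const bannedUsers = new Set(["user-a"]);
const bannedIPs = new Set(["ip-1"]);

const isUserBanned = async (userID: string): Promise<boolean> => bannedUsers.has(userID);
const isIPBanned = async (ip: string): Promise<boolean> => bannedIPs.has(ip);

// Mirrors the shape of checkBanStatus: both checks run concurrently via
// Promise.all, and either one being true means the request counts as banned.
async function checkBanStatus(userID: string, ip: string): Promise<boolean> {
    const [userBanned, ipBanned] = await Promise.all([isUserBanned(userID), isIPBanned(ip)]);
    return userBanned || ipBanned;
}

checkBanStatus("user-b", "ip-1").then((banned) => console.log(banned)); // → true
```

The real helper additionally queues a `banUser` call when only the IP matches, so the ban sticks to the account on future requests.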


@@ -23,9 +23,7 @@ export function createMemoryCache(memoryFn: (...args: any[]) => void, cacheTimeM
}
}
// create new promise
const promise = Promise.resolve(memoryFn(...args));
// store promise reference until fulfilled
promiseMemory.set(cacheKey, promise);
return promise.then(result => {
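The hunk above swaps a manual `new Promise` wrapper for `Promise.resolve`. The surrounding memoization pattern — cache the promise itself so concurrent callers share one in-flight computation — can be sketched like this (simplified to string keys with no expiry; both are assumptions, not the real `createMemoryCache` API):

```typescript
// Cache the promise, not the value: concurrent calls with the same key
// reuse the single in-flight computation.
function memoizePromise<T>(fn: (key: string) => T): (key: string) => Promise<T> {
    const promiseMemory = new Map<string, Promise<T>>();
    return (key: string) => {
        const cached = promiseMemory.get(key);
        if (cached) return cached;
        // Promise.resolve(fn(key)) is the tidier equivalent of
        // new Promise((resolve) => resolve(fn(key))) from the old code.
        const promise = Promise.resolve(fn(key));
        promiseMemory.set(key, promise);
        return promise;
    };
}

let calls = 0;
const square = memoizePromise((k: string) => { calls += 1; return Number(k) ** 2; });
square("4");
square("4").then((v) => console.log(v, calls)); // → 16 1
```

One caveat of the refactor: if `fn` throws synchronously, `Promise.resolve(fn(key))` throws at the call site, whereas the old `new Promise` executor converted the throw into a rejection.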


@@ -2,6 +2,7 @@ import axios from "axios";
import { Logger } from "../utils/logger";
export const getCWSUsers = (extID: string): Promise<number | undefined> =>
axios.post(`https://chrome.google.com/webstore/ajax/detail?pv=20210820&id=${extID}`)
.then(res => res.data.split("\n")[2])
.then(data => JSON.parse(data))
@@ -10,4 +11,22 @@ export const getCWSUsers = (extID: string): Promise<number | undefined> =>
.catch((err) => {
Logger.error(`Error getting chrome users - ${err}`);
return 0;
});
/* istanbul ignore next */
export function getChromeUsers(chromeExtensionUrl: string): Promise<number> {
return axios.get(chromeExtensionUrl)
.then(res => {
const body = res.data;
// 2024-02-09
// >20,000 users<
const match = body.match(/>([\d,]+) users</)?.[1];
if (match) {
return parseInt(match.replace(/,/g, ""));
}
})
.catch(/* istanbul ignore next */ () => {
Logger.debug(`Failed to connect to ${chromeExtensionUrl}`);
return 0;
});
}
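The scrape in `getChromeUsers` pulls the install count out of markup like `>20,000 users<` and strips the thousands separators before parsing. The extraction step can be checked in isolation:

```typescript
// Mirrors the regex used in getChromeUsers: capture ">12,345 users<" and
// drop the commas before parsing the integer.
function parseUserCount(body: string): number | undefined {
    const match = body.match(/>([\d,]+) users</)?.[1];
    return match ? parseInt(match.replace(/,/g, "")) : undefined;
}

console.log(parseUserCount("<span>20,000 users</span>")); // → 20000
console.log(parseUserCount("no match here")); // → undefined
```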


@@ -1,14 +1,14 @@
import { getHash } from "./getHash";
import { HashedValue } from "../types/hash.model";
import { ActionType, VideoID, Service, Category } from "../types/segments.model";
import { HashedUserID } from "../types/user.model";
export function getSubmissionUUID(
videoID: VideoID,
category: Category,
actionType: ActionType,
description: string,
userID: HashedUserID,
startTime: number,
endTime: number,
service: Service


@@ -46,7 +46,10 @@ async function newLeafWrapper(videoId: string, ignoreCache: boolean) {
export function getVideoDetails(videoId: string, ignoreCache = false): Promise<videoDetails> {
if (!config.newLeafURLs) {
return getPlayerData(videoId, ignoreCache)
.then(data => convertFromInnerTube(data))
.catch(() => {
return null;
});
}
return Promise.any([
newLeafWrapper(videoId, ignoreCache)


@@ -1,11 +1,12 @@
import axios, { AxiosError } from "axios";
import { Logger } from "./logger";
import { innerTubeVideoDetails } from "../types/innerTubeApi.model";
import DiskCache from "./diskCache";
import { config } from "../config";
const privateResponse = (videoId: string, reason: string): innerTubeVideoDetails => ({
videoId,
title: reason,
channelId: "",
// exclude video duration
isOwnerViewing: false,
@@ -27,24 +28,58 @@ const privateResponse = (videoId: string): innerTubeVideoDetails => ({
publishDate: ""
});
export async function getFromITube (videoID: string): Promise<innerTubeVideoDetails> {
if (config.youTubeKeys.floatieUrl) {
try {
const result = await axios.get(config.youTubeKeys.floatieUrl, {
params: {
videoID,
auth: config.youTubeKeys.floatieAuth
}
});
if (result.status === 200) {
return result.data?.videoDetails ?? privateResponse(videoID, result.data?.playabilityStatus?.reason ?? "Bad response");
} else {
return Promise.reject(`Floatie returned non-200 response: ${result.status}`);
}
} catch (e) {
if (e instanceof AxiosError) {
const result = e.response;
if (result && result.status === 500) {
return privateResponse(videoID, result.data ?? "Bad response");
} else {
return Promise.reject(`Floatie returned non-200 response: ${result?.status}`);
}
}
}
}
// start subrequest
const url = "https://www.youtube.com/youtubei/v1/player";
const data = {
context: {
client: {
clientName: "WEB",
clientVersion: "2.20221215.04.01",
visitorData: config.youTubeKeys.visitorData
}
},
videoId: videoID,
serviceIntegrityDimensions: {
poToken: config.youTubeKeys.poToken
}
};
const result = await axios.post(url, data, {
timeout: 3500,
headers: {
"X-Goog-Visitor-Id": config.youTubeKeys.visitorData
}
});
/* istanbul ignore else */
if (result.status === 200) {
return result.data?.videoDetails ?? privateResponse(videoID, result.data?.playabilityStatus?.reason ?? "Bad response");
} else {
return Promise.reject(`Innertube returned non-200 response: ${result.status}`);
}
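Both code paths above fall back to `privateResponse` when `videoDetails` is missing, carrying the playability reason into the placeholder title. The nullish-coalescing fallback can be sketched with hypothetical, trimmed-down response shapes:

```typescript
// Hypothetical, trimmed-down shapes for illustration only.
interface PlayerResponse {
    videoDetails?: { videoId: string; title: string };
    playabilityStatus?: { reason?: string };
}

// Mirrors the fallback in getFromITube: use videoDetails when present,
// otherwise synthesize a placeholder whose title records why playback failed.
function pickDetails(videoID: string, res: PlayerResponse): { videoId: string; title: string } {
    return res.videoDetails
        ?? { videoId: videoID, title: res.playabilityStatus?.reason ?? "Bad response" };
}

console.log(pickDetails("abc", { playabilityStatus: { reason: "Private video" } }).title); // → Private video
console.log(pickDetails("abc", {}).title); // → Bad response
```

Encoding the failure reason in the title keeps the return type uniform, so callers that only read `title` degrade gracefully on private or removed videos.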

Some files were not shown because too many files have changed in this diff Show More