Compare commits


1384 Commits

SHA1 Message Date
a797494aa3 fix: missed update user's password 2023-08-07 18:51:54 +08:00
353dd7f796 ci: mark non-prerelease when upload assets 2023-08-07 16:23:36 +08:00
1c00d64952 feat: rehash password with a unique salt for each user 2023-08-07 15:46:19 +08:00
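For context on the per-user salt rehash commit above (1c00d64952) and the sha256 password commit further down (75acbcc115), here is a minimal Go sketch of hashing a password with SHA-256 and a random per-user salt. The function names, salt length, and concatenation format are illustrative assumptions, not alist's actual implementation.

```go
package main

import (
	"crypto/rand"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// newSalt returns a random hex salt; the 16-byte length is an assumption.
func newSalt() (string, error) {
	b := make([]byte, 16)
	if _, err := rand.Read(b); err != nil {
		return "", err
	}
	return hex.EncodeToString(b), nil
}

// hashPassword derives a hash from the plain password and a per-user salt.
func hashPassword(password, salt string) string {
	sum := sha256.Sum256([]byte(password + "-" + salt))
	return hex.EncodeToString(sum[:])
}

func main() {
	salt, err := newSalt()
	if err != nil {
		panic(err)
	}
	stored := hashPassword("secret", salt)
	fmt.Println("salt:", salt)
	fmt.Println("hash:", stored)
	// On login, recompute with the stored salt and compare.
	fmt.Println("match:", hashPassword("secret", salt) == stored)
}
```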
ff5cf3f4fa feat: allow use token to access WebDAV 2023-08-07 14:38:50 +08:00
5b6b2f427a feat(cmd): add show token command 2023-08-07 13:49:23 +08:00
7877184bee feat(baidu_netdisk): add retry to most operations (close #4863 in #4939) 2023-08-07 13:44:28 +08:00
e9cb37122e chore(cmd): change some output for admin command 2023-08-06 23:02:22 +08:00
a425392a2b feat(cmd): set or random new password for admin 2023-08-06 22:34:02 +08:00
75acbcc115 perf: sha256 for user's password (close #3552) 2023-08-06 22:09:17 +08:00
30415cefbe perf: delete user cache after cancel 2FA 2023-08-06 20:47:58 +08:00
1d06a0019f feat(search): paging and scope (close #4381 in #4930)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-08-06 15:13:23 +08:00
3686075a7f ci: change auto commit user [skip ci] 2023-08-05 16:32:06 +08:00
6c1c7e5cc0 fix(wopan): missing familyID on mkdir (close #4927) 2023-08-04 22:26:56 +08:00
c4f901b201 fix: undeclared identifier kIOMainPortDefault on darwin/arm64 2023-08-04 21:23:58 +08:00
4b7acb1389 feat(ci): add multiple ARM targets prebuilt (close #4243) 2023-08-04 20:57:56 +08:00
15b7169df4 perf: multi-thread downloader, Content-Disposition (#4921)
general: enhance multi-thread downloader with cancelable context, immediately stop all stream processes when canceled;
feat(crypt): improve stream closing;
general: fix the bug where downloading files became previewing a stream on modern browsers;

Co-authored-by: Sean He <866155+seanhe26@users.noreply.github.com>
Co-authored-by: Andy Hsu <i@nn.ci>
2023-08-04 15:29:54 +08:00
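The commit above (15b7169df4) describes a multi-thread downloader that stops all streams as soon as its context is canceled. Below is a hedged Go sketch of that pattern, ranged GETs fanned out over an errgroup sharing one cancelable context; the chunking scheme and names are assumptions, and it presumes the server honors Range requests.

```go
package main

import (
	"context"
	"fmt"
	"io"
	"net/http"

	"golang.org/x/sync/errgroup"
)

// downloadChunk fetches bytes [start, end] of url; it aborts as soon as ctx is canceled.
func downloadChunk(ctx context.Context, url string, start, end int64) ([]byte, error) {
	req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Range", fmt.Sprintf("bytes=%d-%d", start, end))
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	return io.ReadAll(resp.Body)
}

// downloadParallel splits [0, size) into chunks and downloads them concurrently.
// The first error cancels the shared context, which stops the other streams.
func downloadParallel(ctx context.Context, url string, size, chunkSize int64) ([][]byte, error) {
	g, ctx := errgroup.WithContext(ctx)
	n := (size + chunkSize - 1) / chunkSize
	parts := make([][]byte, n)
	for i := int64(0); i < n; i++ {
		i := i
		start := i * chunkSize
		end := start + chunkSize - 1
		if end >= size {
			end = size - 1
		}
		g.Go(func() error {
			data, err := downloadChunk(ctx, url, start, end)
			parts[i] = data
			return err
		})
	}
	if err := g.Wait(); err != nil {
		return nil, err
	}
	return parts, nil
}

func main() {
	parts, err := downloadParallel(context.Background(), "https://example.com/file.bin", 1<<20, 256<<10)
	if err != nil {
		fmt.Println("download failed:", err)
		return
	}
	fmt.Println("downloaded", len(parts), "chunks")
}
```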
861948bcf3 revert: "ci: auto gofmt for pull request" [skip ci]
This reverts commit 8b353da0d2.
2023-08-04 13:25:23 +08:00
e5ffd39cf2 feat: add 123Pan Share driver (close #4853 in #4898)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-08-03 15:01:43 +08:00
8b353da0d2 ci: auto gofmt for pull request [skip ci] 2023-08-03 14:49:22 +08:00
49bde82426 perf(189pc): empty file upload and cache optimization (#4913)
- login captcha error
- cache optimization
- upload empty file
2023-08-03 14:08:40 +08:00
3e285aaec4 feat: add weiyun support (close #4802 in #4883)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-08-02 21:39:59 +08:00
355fc576b1 issue: add config to bug report template [skip ci] 2023-08-02 21:05:50 +08:00
a69d72aa20 feat(aliyundrive_open): support resource drive (close #4889) 2023-08-02 15:50:01 +08:00
e5d123c5d3 fix(deps): update module golang.org/x/image to v0.10.0 [skip ci] (#4902)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-08-02 15:38:10 +08:00
220eb33f88 fix(deps): update module golang.org/x/net to v0.13.0 [skip ci] (#4903)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-08-02 15:16:39 +08:00
5238850036 docs: sync README [skip ci] 2023-08-02 15:15:48 +08:00
81ac963567 fix(deps): update module github.com/ipfs/go-ipfs-api to v0.6.1 [skip ci] (#4882)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-08-02 15:01:25 +08:00
3c21a9a520 feat: Crypt driver, improve http/webdav handling (#4884)
This PR has several enhancements, fixes, and features:
- [x] Crypt: a transparent encryption driver. Anyone can easily and safely store encrypted data on a remote storage provider. Think of your data as locked in a safe: the storage provider can only see the safe, not your data.
  - [x] Optional: compatible with [Rclone Crypt](https://rclone.org/crypt/). More ways to manipulate the encrypted data.
  - [x] directory and filename encryption
  - [x] server-side encryption mode (server encrypts & decrypts all data, all data flows thru the server)
- [x] obfuscate sensitive information internally
- [x] introduced a server memory-cached multi-thread downloader.
  - [x] Driver: **Quark** enables this feature, giving faster loads in single-thread scenarios; e.g. a media player playing directly from the link is now faster.
- [x] general improvements to HTTP/WebDAV stream processing, header handling, and response handling
  - [x] Driver: **Mega** now supports ranged HTTP headers
  - [x] Driver: **Quark** fixes a bug where the HTTP request to the Quark server was not closed after the client closed its connection to alist

## Crypt, a transparent Encrypt/Decrypt Driver. (Rclone Crypt compatible)

e.g.  
Crypt mount path  -> /vault
Crypt remote path -> /ali/encrypted
Aliyun mount path -> /ali

When the user uploads a.jpg to /vault, the data is encrypted and saved to /ali/encrypted/xxxxx. When the user later accesses a.jpg, it is automatically decrypted, and the user can do anything with it.
Since it's Rclone Crypt compatible, users can download /ali/encrypted/xxxxx and decrypt it with the rclone crypt tool, or mount the folder with rclone and then work with the decrypted folder on Linux...

NB. Some breaking changes are made to follow global standards, e.g. processing the HTTP headers properly.

close #4679 
close #4827 

Co-authored-by: Sean He <866155+seanhe26@users.noreply.github.com>
Co-authored-by: Andy Hsu <i@nn.ci>
2023-08-02 14:40:36 +08:00
1dc1dd1f07 feat(aliyundrive_open): support livp format file download (close #4890) 2023-08-01 21:50:25 +08:00
c9ea9bce81 feat(lanzou): support login with account (close #4880 in #4885) 2023-08-01 19:44:57 +08:00
9f08353d31 feat(baidu_photo): optional delete album origin file (close #4872 in #4875) 2023-07-31 18:29:45 +08:00
ce0c3626c2 ci: remove working label on issue closed 2023-07-31 16:54:00 +08:00
06f46206db fix(baidu_photo): album download (close #4603 in #4871)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-07-31 16:27:16 +08:00
579f0c06af ci: delete file after decompression
fix: no space left on device
2023-07-30 18:25:52 +08:00
b12d92acc9 perf(baidu_netdisk): optimize memory allocate 2023-07-29 17:12:43 +08:00
e700ce15e5 fix: missed progress in upload task 2023-07-29 17:09:26 +08:00
7dbef7d559 chore(deps): update actions-cool/issues-helper action to v3.5.1 (#4855)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-28 16:16:42 +08:00
7e9cdd8b07 fix(aliyundrive_open): fail limit on concurrently call (#4851) 2023-07-28 15:55:39 +08:00
cee6bc6b5d fix(terabox): slice out of range (close #4858 in #4860) 2023-07-28 15:52:20 +08:00
cfd23c05b4 fix(139): upload empty file (close #4711) 2023-07-27 19:26:22 +08:00
0c1acd72ca fix: link cache not deleted after overwriting file (close #4852) 2023-07-27 19:07:53 +08:00
e2ca06dcca docs: update go version 2023-07-27 18:32:33 +08:00
0828fd787d chore: update placeholder of version in bug_report issue template 2023-07-27 18:31:16 +08:00
2e23ea68d4 fix(aliyundrive_open): increase limit interval (close #4851) 2023-07-27 18:26:11 +08:00
4afa822bec fix(123): Use APP-side API (close #4834 in #4856) 2023-07-27 15:51:59 +08:00
f2ca9b40db fix(qbittorrent): incorrect field type (close #4843) 2023-07-25 13:31:41 +08:00
4c2535cb22 fix(115): user-agent lost on upload (close #4831) 2023-07-23 15:18:33 +08:00
d4ea8787c9 fix(123): upload file size that less than 16 MB (close #4816) 2023-07-21 14:35:18 +08:00
a4de04528a fix(123): auth-key verification (close #4811 in #4814)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-07-21 14:33:45 +08:00
f60aae7499 chore(deps): update actions-cool/issues-helper action to v3.5.0 (#4801)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-21 13:55:16 +08:00
de8f9e9eee feat: SSO auto register (close #4692 in #4795)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-07-20 16:30:30 +08:00
cace9db12f docs: add Japanese README [skip ci] (#4798) 2023-07-19 14:05:41 +08:00
ec2fb82836 chore: update special sponsors [skip ci] 2023-07-18 15:26:03 +08:00
afcfbf02ea chore: go mod tidy 2023-07-16 15:12:38 +08:00
cad04e07dd fix(deps): update module github.com/blevesearch/bleve/v2 to v2.3.9 [skip ci] (#4750)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-16 15:06:49 +08:00
30f732138c fix(deps): update module github.com/sirupsen/logrus to v1.9.3 [skip ci] (#4668)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-16 15:06:16 +08:00
04034bd03b fix(deps): update module github.com/jlaffaye/ftp to v0.2.0 [skip ci] (#4455)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-16 15:05:46 +08:00
6ec9a8d4c7 fix(aliyundrive_open): the temp file is not deleted (close #4777) 2023-07-16 15:01:22 +08:00
3f7882b467 feat(aliyundrive_open): rapid upload (close #4766) 2023-07-15 19:33:46 +08:00
a4511c1963 refactor: change hash function 2023-07-15 16:29:44 +08:00
9d1f122717 fix(local): thumbnail rotated if exist orientation tag (close #4749) 2023-07-15 14:31:03 +08:00
5dd73d80d8 fix(123): remove stream upload method (close #4772) 2023-07-14 19:12:18 +08:00
fce872bc1b feat(123): thumbnail support (#3953) 2023-07-14 14:43:40 +08:00
df6c4c80c2 fix(123): update app-version (close #4758) 2023-07-14 14:17:29 +08:00
d2ff040cf8 feat(s3): add SessionToken field (close #4761) 2023-07-13 15:58:19 +08:00
a31af209cc fix(pikpak): hash calculation and fast upload judgment (#4745 fix #1081) 2023-07-11 22:19:21 +08:00
3f8b3da52b feat(server): add HEAD method support (close #4740) 2023-07-11 13:47:49 +08:00
6887f14ec6 feat(pikpak): allow disable media link (close #4735) 2023-07-11 13:40:58 +08:00
3e0de5eaac fix(deps): adapt module github.com/caarlos0/env/v9 (#4728) 2023-07-10 22:06:50 +08:00
61101a60f4 fix(s3): unable to copy empty folder (close #4620) 2023-07-10 14:55:19 +08:00
3529023bf9 fix(mopan): size field type(close #4734 in #4736) 2023-07-10 14:25:27 +08:00
d1d1a089a4 fix(deps): update module github.com/caarlos0/env/v7 to v9 (#4728)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-09 18:15:04 +08:00
fa66358b1e fix(sftp): read target obj of symlink file (close #4713) 2023-07-09 14:42:57 +08:00
2b533e4b91 feat: allow customize perm of unix file (close #4709) 2023-07-08 20:17:05 +08:00
d3530a8d80 fix(deps): update module golang.org/x/image to v0.9.0 (#4725)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-08 19:21:15 +08:00
6052eb3512 fix(deps): update module golang.org/x/oauth2 to v0.10.0 (#4522)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-07-08 15:44:42 +08:00
d17f7f7cad fix(123): judge status on get redirect_url (close #4718) 2023-07-07 19:55:37 +08:00
8bdc67ec3d fix(webdav): return 404 if error happened on handlePropfind 2023-07-05 13:52:21 +08:00
4fabc27366 fix(aliyundrive_open): panic if driver not init 2023-07-05 13:51:46 +08:00
e4c7b0f17c fix: https port is not effective 2023-07-05 13:02:52 +08:00
5e8bfb017e fix(123): add Referer to request (close #4631) 2023-07-04 18:36:46 +08:00
7d20a01dba feat!: support listen to the unix (close #4671)
Starting from this commit, all HTTP server related config moves under the scheme section
2023-07-04 17:56:02 +08:00
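Commit 7d20a01dba above adds listening on a unix socket. As a general illustration (not alist's actual config keys or startup code), a Go HTTP server can serve over a unix socket like this; the socket path is hypothetical.

```go
package main

import (
	"fmt"
	"net"
	"net/http"
	"os"
)

func main() {
	socket := "/tmp/alist-demo.sock" // hypothetical path, not a real alist default
	_ = os.Remove(socket)            // remove a stale socket from a previous run

	ln, err := net.Listen("unix", socket)
	if err != nil {
		panic(err)
	}
	defer ln.Close()

	mux := http.NewServeMux()
	mux.HandleFunc("/ping", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "pong")
	})

	// e.g. curl --unix-socket /tmp/alist-demo.sock http://localhost/ping
	if err := http.Serve(ln, mux); err != nil {
		panic(err)
	}
}
```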
59dbf4496f feat(offline_download): try to init client if not ready (close #4674) 2023-07-03 22:57:42 +08:00
12f40608e6 fix(oidc): use TOTP as state verification to replace the static 'state' parameter (#4665) 2023-07-03 22:41:08 +08:00
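Commit 12f40608e6 above replaces the static OIDC 'state' parameter with a TOTP-based value. A hedged sketch using github.com/pquerna/otp (which appears in this repo's dependency updates); the secret handling here is illustrative only.

```go
package main

import (
	"fmt"
	"time"

	"github.com/pquerna/otp/totp"
)

func main() {
	// Hypothetical shared secret (base32); in practice this would be a server-side secret.
	secret := "JBSWY3DPEHPK3PXP"

	// Generate a state value when redirecting to the identity provider.
	state, err := totp.GenerateCode(secret, time.Now())
	if err != nil {
		panic(err)
	}
	fmt.Println("state:", state)

	// Verify the state on the callback instead of comparing a static string.
	fmt.Println("valid:", totp.Validate(state, secret))
}
```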
89832c296f fix: judge can proxy with ext (close #4688) 2023-07-03 20:41:37 +08:00
f09bb88846 fix(thunder): upload issues (close #4663 in #4667) 2023-06-29 13:21:30 +08:00
c518f59528 feat: add MoPan driver (close #4325 in #4659)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-06-28 14:53:43 +08:00
e9c74f9959 fix: regexp rename error (close #4644 in #4653)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-06-26 15:15:57 +08:00
21b8e7f6e5 fix(aliyundrive_share): add limit rate and lift rate limit restrictions (#4587) 2023-06-26 14:49:21 +08:00
2ae9cd8634 fix(dropbox): failed get link in #4639
close cfee536b96 (commitcomment-119404554)
2023-06-25 17:07:31 +08:00
cfee536b96 feat: add Dropbox driver (#4639 close #4590)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-06-23 17:36:40 +08:00
1c8fe3b24c fix(aliyundrive_open): adaptive part size adjustment (#4609)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-06-23 14:25:30 +08:00
84e23c397d fix(baidu_netdisk): rollback #3652 (close #4628) 2023-06-21 18:37:25 +08:00
f7baec2e65 feat: add WoPan driver (close #4541) 2023-06-17 20:20:00 +08:00
378bab32f1 chore(aliyundrive_share): increase the limit of the list api (#4588) 2023-06-17 20:10:34 +08:00
6cd8151cad fix(aliyundrive_open): change default oauth_token_url 2023-06-16 15:03:27 +08:00
541449e10f docs: add special sponsor [skip ci] 2023-06-14 05:42:21 +08:00
ca5a53fc24 fix(aliyundrive_open): openFile/list rate limit 2023-06-11 18:18:09 +08:00
f646d2a699 feat!: listen to both http & https (#4536)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-06-11 18:17:37 +08:00
363e036bf0 chore: fix typo [skip ci] 2023-06-10 22:25:35 +08:00
e23f00f349 fix(139): avoid panic due to Authorization for emptiness 2023-06-10 00:12:04 +08:00
9600267bda ci: add linux-musl-amd64/arm64 to dev build 2023-06-09 23:43:52 +08:00
a66b0e0151 feat(139): auto extract account from Authorization 2023-06-09 23:41:41 +08:00
3bfa00d5d2 fix(189pc): add REQID header 2023-06-09 23:33:12 +08:00
6cbd2532cc fix(139): modify the authentication mode 2023-06-09 23:02:02 +08:00
47976af0d3 feat: set ProxyFromEnvironment for default http client (#4546) 2023-06-09 22:08:54 +08:00
4dca52be85 fix(s3): optional add filename to disposition (close #4538) 2023-06-06 22:47:27 +08:00
62bb09300d chore: fix typo [skip ci] 2023-06-06 19:34:10 +08:00
f9e067abec feat: support delayed start (#4532) 2023-06-05 16:00:31 +08:00
1e62666406 feat(baidu_netdisk): allow custom crack ua 2023-06-04 15:57:41 +08:00
0e0cdf15ef chore: change daysUntilClose [skip ci] 2023-06-03 21:15:52 +08:00
b124fdc092 perf(baidu): avoid refreshing the token on every startup 2023-06-02 18:31:42 +08:00
5141b3c165 fix(deps): update module github.com/gin-gonic/gin to v1.9.1 [security] [skip ci] (#4521)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-06-02 18:31:14 +08:00
881d6e271e feat: add OIDC single sign-on (#4496)
close #3914
close #4315
2023-06-02 18:22:07 +08:00
bd2418c438 feat(deps): update alpine to 3.18 2023-05-28 19:30:42 +08:00
8421c72c5c fix(seafile): driver panic while downloading or uploading file (#4491)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-28 16:45:46 +08:00
a80e21997c feat(cloudreve): auto remove trailing slash in address (#4492)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-28 16:18:09 +08:00
4369cbbac3 fix(alist_v3): missed Content-Length on upload (close #4457) 2023-05-27 20:23:36 +08:00
89f76d7899 feat: add UC driver (close #1127 in #4459)
Co-authored-by: lj98568 <lj98568@alibaba-inc.com>
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-27 19:36:14 +08:00
ef68f84787 fix(baidu_photo): legal album title check (close #4479 in #4487) 2023-05-27 17:07:57 +08:00
2c1f70fbe9 fix(189pc): large file upload error (close #4417 in #4438) 2023-05-27 14:28:58 +08:00
b2f5757f8d fix(copy): copy from driver that return writer (close #4291) 2023-05-26 21:57:43 +08:00
6b97b4eb20 feat(s3): set content type from stream when uploading (#4460)
Co-authored-by: guopeilun <guopl@flatincbr.com>
2023-05-24 18:02:49 +08:00
645c10c11f fix(deps): update module github.com/sirupsen/logrus to v1.9.2 (#4402)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-05-20 22:15:32 +08:00
571bcf07b0 fix(alias): add api prefix for proxy url (close #4392) 2023-05-19 00:12:57 +08:00
63de65be45 fix: increase timeout for http_client (close #4409) 2023-05-18 23:32:05 +08:00
a3446720a2 fix: make TlsInsecureSkipVerify enable for all request (#4386) 2023-05-14 17:05:47 +08:00
3c4c2ad4e0 feat(teambition): support s3 upload method (close #4365) 2023-05-13 23:06:25 +08:00
077a525961 fix(189): adapt new login method (close #4378) 2023-05-13 17:28:40 +08:00
5be79eb26e feat: add robots.txt setting (close #4303) 2023-05-12 16:53:15 +08:00
ddc19ab699 fix(deps): update module github.com/blevesearch/bleve/v2 to v2.3.8 [skip ci] (#4322)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-05-12 16:34:25 +08:00
ddfca5a29b fix(deps): update module github.com/aws/aws-sdk-go to v1.44.262 (#3285)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-05-12 16:25:30 +08:00
c19166be1c feat(google_drive): support sa (close #3132 in #4360)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-12 14:47:50 +08:00
daad61443c feat(local): support thumbnail cache (close #4216) 2023-05-11 19:57:24 +08:00
4b0c01158d fix: panic on nil pointer 2023-05-11 19:44:44 +08:00
f97f1d532e fix(webdav): don't retry for put if body isn't seeker (close #4149 close #4238) 2023-05-11 18:57:35 +08:00
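Commit f97f1d532e above only retries a WebDAV PUT when the request body can be rewound. A minimal illustration of that check, assuming a generic retrying uploader rather than the actual alist/webdav code.

```go
package main

import (
	"errors"
	"fmt"
	"io"
	"strings"
)

// uploadWithRetry retries only when body can be rewound; otherwise a retry
// would resend a partially consumed stream.
func uploadWithRetry(body io.Reader, attempts int, put func(io.Reader) error) error {
	var err error
	for i := 0; i < attempts; i++ {
		if i > 0 {
			seeker, ok := body.(io.Seeker)
			if !ok {
				return fmt.Errorf("not retrying, body is not seekable: %w", err)
			}
			if _, serr := seeker.Seek(0, io.SeekStart); serr != nil {
				return serr
			}
		}
		if err = put(body); err == nil {
			return nil
		}
	}
	return err
}

func main() {
	calls := 0
	put := func(r io.Reader) error {
		calls++
		io.Copy(io.Discard, r)
		if calls < 2 {
			return errors.New("temporary failure")
		}
		return nil
	}
	// strings.Reader implements io.Seeker, so the retry can rewind it.
	err := uploadWithRetry(strings.NewReader("payload"), 3, put)
	fmt.Println("seekable body:", err, "calls:", calls)
}
```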
e15755fef0 fix(189): enable TlsInsecureSkipVerify (close #4355) 2023-05-11 18:48:31 +08:00
ea88998325 docs: add help message for mount path (#4364)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-11 18:40:56 +08:00
74d971aa8a docs: fix git address [skip ci] (#4366) 2023-05-11 15:05:33 +08:00
d41d868a8d fix(baidu_photo): change folder name length limit (close #4351 in #4353)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-09 20:44:57 +08:00
555cc26cbf fix(deps): update module golang.org/x/crypto to v0.9.0 (#4350)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-05-09 20:28:52 +08:00
ab4215080b fix(deps): update module golang.org/x/net to v0.10.0 (#4347)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-05-09 16:31:17 +08:00
9502f5acd7 fix(cloudreve): skip init login when using cookie (#4341) 2023-05-08 19:25:36 +08:00
b03879403f feat(cloudreve): support use cookie to login (close #4324 in #4339)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-08 15:19:51 +08:00
ee4ac81677 fix(webdav): can't rename on infini-cloud (close #4333) 2023-05-08 14:21:12 +08:00
b69fc8c306 ci: increase daysUntilClose to avoid use stale-bot [skip ci] 2023-05-07 21:07:31 +08:00
ee6c31332d feat(drivers): ipfs api (#4265)
Co-authored-by: Andy Hsu <i@nn.ci>
2023-05-05 17:42:22 +08:00
9fa16bd5fc ci: use github helper to close stale issue 2023-05-05 16:29:59 +08:00
c77ed5fcb0 feat(aliyundrive_open): limit rate for List and Link (close #4290) 2023-05-02 22:06:03 +08:00
822be17fb9 feat(aliyundrive_open): add expiration for link (close #4061) 2023-05-02 16:12:40 +08:00
7e3b13ea2d fix: fs/list interface conversion from copy alias (close #4279) 2023-05-01 15:45:45 +08:00
f8fb48fb32 fix: cannot connect to Casdoor SSO (close #4266 in #4274) 2023-05-01 15:32:34 +08:00
4bf46268da feat(alias): support thumbnail (close #4256) 2023-04-28 00:17:15 +08:00
b7ea73b3c2 fix(aliyundrive_open): can't refresh token if access_token is empty (#4255) 2023-04-28 00:01:47 +08:00
9fbc54314d chore(aliyundrive_open): change base url 2023-04-27 16:38:40 +08:00
cf8ab29a17 feat: optional allow be mounted (close #4218) 2023-04-27 16:33:01 +08:00
51cadd2d49 fix: ignore handle in json (close #4251 close #4252) 2023-04-27 15:39:32 +08:00
2bae8e129e feat: add Casdoor single sign-on (#4222) 2023-04-26 16:01:40 +08:00
9d55ad3af6 fix(123): get download url (close #4244) 2023-04-26 15:06:24 +08:00
36cd504783 fix(alist_v3): missed meta_password update
fix: adb0739dfe (commitcomment-110328033)
2023-04-24 20:56:46 +08:00
49f13b9b90 fix(baidu_photo): upload file has web prefix (close #4233 in #4235) 2023-04-24 19:13:33 +08:00
adb0739dfe feat!(alist_v3): support username & password login (close #4226)
Breaking changes:
- rename access_token to token
- rename old password to meta_password
2023-04-23 17:48:26 +08:00
340cb940e3 fix(qbittorrent): set autoTMM (#4217) 2023-04-22 13:33:54 +08:00
8711f2a1c5 feat(quark): shard request file (close #4175) 2023-04-17 15:33:38 +08:00
7f35aab071 revert(quark): remove preset range header 2023-04-17 14:39:21 +08:00
ecd167d2f9 feat(quark): add preset range header (close #4166) 2023-04-16 19:26:03 +08:00
220fd30830 fix: the recursive subdirectory moving bug (#4171) 2023-04-16 16:08:12 +08:00
5cba10446e fix(123): adapt new upload method (close #4141) 2023-04-14 15:48:39 +08:00
a9bdb15205 ci: fix golang version in auto_lang [skip ci] 2023-04-14 13:49:13 +08:00
c5f6a90f54 fix(quark): download file size limit (close #4140) 2023-04-14 13:47:05 +08:00
46f9aefb04 feat: empty folder clear API [skip ci] (#4132)
* Add an API to clean up empty folders

* Fix a bug when deleting nested folders

 Author:    varg247 <varg247@gmail.com>

---------

Co-authored-by: varg247 <varg247@qq.com>
2023-04-13 15:39:21 +08:00
fdcad9c154 fix(123): incorrect endpoint (close #4046) 2023-04-12 23:04:12 +08:00
027025361a ci: fixed version of alpine 2023-04-12 16:01:49 +08:00
f1245153b9 chore(deps): upgrade to go@1.20 2023-04-12 15:42:27 +08:00
570b8be022 fix(onedrive): error check in upBig 2023-04-11 22:52:42 +08:00
86a773674a feat(task): print stack trace if panic 2023-04-11 15:16:57 +08:00
75fd0ee185 feat(s3): optional remove bucket name from path (close #4069) 2023-04-09 19:25:52 +08:00
cc43238bd1 fix(alias): disable log completely (#4054) 2023-04-09 15:46:26 +08:00
c0a6beecea fix(alias): panic on nil pointer (close #4093) 2023-04-09 14:06:04 +08:00
c77eebb035 fix(deps): update module golang.org/x/image to v0.7.0 (#4065)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-04-08 21:51:51 +08:00
b1efb86b28 fix(deps): update module golang.org/x/net to v0.9.0 [skip ci] (#4066)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-04-08 21:20:19 +08:00
0707449c8f fix(deps): update module golang.org/x/crypto to v0.8.0 [skip ci] (#4076)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-04-08 21:18:39 +08:00
0f8a84f67e perf(alias): disabled log on fs call (close #4054) 2023-04-07 00:02:07 +08:00
a475783b00 fix(deps): update module github.com/spf13/cobra to v1.7.0 [skip ci] (#4041)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-04-06 21:41:41 +08:00
67413015e8 ci: use non-upx prebuilt for windows by default 2023-04-06 21:38:57 +08:00
3a311a47af fix(deps): update module github.com/upyun/go-sdk/v3 to v3.0.4 (#4039)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-04-04 17:10:31 +08:00
9ccd802126 fix(123): api prefix changed (close #4038) 2023-04-04 16:39:56 +08:00
0acba7cd22 perf(123): reduce login count 2023-04-03 11:24:29 +08:00
3cdb8e7a81 fix(trainbit): incorrect filename display (#4027) 2023-04-02 21:13:20 +08:00
d3efee2ea1 fix(s3): increase PartSize if filesize > 50000MB (close #4017) 2023-04-02 16:09:27 +08:00
4ec274e748 fix(aliyundrive_open): refresh upload url if expired (#3999 close #3823)
* fix(aliyundrive_open): refresh upload url for large files

* fix(aliyundrive_open): retry upload on url expiry

* fix(aliyundrive_open): ignore 409 error

* feat(aliyundrive): cleanup upload retry logic

* feat(util): add multireadable io utility

* feat(aliyundrive_open): make upload fully stream

* feat(aliyundrive_open): refresh upload url every 20 puts

* fix(aliyundrive_open): part info panic

* chore: change refresh upload url strategy

---------

Co-authored-by: Andy Hsu <i@nn.ci>
2023-04-01 14:54:29 +08:00
3b07c72f88 fix(proxy): ignore Referer if got redirect (close #3996) 2023-03-31 20:29:55 +08:00
0c5820a98f docs(aliyundrive_open): revised the sentence that may cause ambiguity (#3989) 2023-03-29 20:26:21 +08:00
86beadc0ed fix: missed sign with enable sign_all (close #3957) 2023-03-26 16:19:01 +08:00
be62d64dba chore: cancel 2fa succeed tips 2023-03-25 18:36:13 +08:00
112363031a feat: add fine-grained control for link signing (#3924)
* Determine whether the URL requires Sign

* Add File and Mem based KV

NOT TESTED: TokenKV Function

* Change Token KV func to common func.

Add File based KV func

* Remove KV, Remove Token

I found that the original Sign function is enough to complete the link signature; it only needs a few simple configuration items added to meet the requirements.

* Add IsStorageSigned func to judge if Signing is enabled in the storage settings.

It should be working now.

* Add a SIGN button to the management panel.

* Add enable_sign to the basic storage struct.

Can enable sign for every driver now.

Bug: when sign is enabled, the Copy link on the download page doesn't contain a sign.

(Not done yet)

* Fix a bug from commit 8f6c25f.

Response of fsread function does not contain sign.

* Optimize code and follow advices.

- Add back public/dist/README.md

- Enable sign when DownProxyUrl is enabled

- Merge needSign() to isEncrypt() in fsread.go

* simplify code

---------

Co-authored-by: Andy Hsu <i@nn.ci>
2023-03-24 22:44:33 +08:00
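The signing PR above (112363031a) adds per-storage control over link signing. Below is a hedged Go sketch of HMAC-based URL signing with an expiry; the sign format, token handling, and names are assumptions for illustration, not alist's actual scheme.

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/base64"
	"fmt"
	"strconv"
	"strings"
	"time"
)

// sign produces "<mac>:<expires>" for a path, keyed by a server-side token.
func sign(token, path string, expires int64) string {
	mac := hmac.New(sha256.New, []byte(token))
	mac.Write([]byte(path + ":" + strconv.FormatInt(expires, 10)))
	return base64.URLEncoding.EncodeToString(mac.Sum(nil)) + ":" + strconv.FormatInt(expires, 10)
}

// verify recomputes the MAC for the path and checks that the link has not expired.
func verify(token, path, s string) bool {
	i := strings.LastIndex(s, ":")
	if i < 0 {
		return false
	}
	expires, err := strconv.ParseInt(s[i+1:], 10, 64)
	if err != nil || time.Now().Unix() > expires {
		return false
	}
	return hmac.Equal([]byte(s), []byte(sign(token, path, expires)))
}

func main() {
	token := "server-secret" // assumption: some per-instance secret
	link := "/d/storage/file.txt"
	s := sign(token, link, time.Now().Add(time.Hour).Unix())
	fmt.Printf("%s?sign=%s\n", link, s)
	fmt.Println("valid:", verify(token, link, s))
}
```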
48dc3552a6 fix(url_tree): incorrect tree structure 2023-03-24 20:34:03 +08:00
663814c9ef fix(url_tree): fix test url [skip ci] (#3940) 2023-03-24 20:26:00 +08:00
bd892e6a63 feat(drivers): new driver UrlTree (close #3268 in #3933)
* feat(drivers): new driver `urls` (close #3268)

* chore: rename

* support customize basic info or get from url

* dfs tree to calculate folder size

* go mod tidy

* add help message
2023-03-24 15:13:54 +08:00
4fd2c09845 fix(115): download issue due to ua (close #3931 in #3932) 2023-03-23 22:57:44 +08:00
0eab31bdf5 fix(local): filename with whitespace issue (#3928)
* fix(local): filename whitespace problem

* fix(deps): remove deprecated package io/ioutil

---------

Co-authored-by: XZB <i@1248.ink>
2023-03-23 15:18:37 +08:00
c6af22b97e feat: add thumbnail to fs/get api (#3927) 2023-03-23 13:59:39 +08:00
b2a5110672 feat(onedrive): support application authorization method (#3906) 2023-03-23 13:26:03 +08:00
c628992ea6 ci: add log required on question label [skip ci] 2023-03-22 14:03:04 +08:00
c65d868e09 fix(baidu_share): large file download (#3887 close #3876)
* fix(baidushare): large file download

* refactor: optimize client
2023-03-20 17:46:15 +08:00
aeb48b2ecc perf(aliyundrive_open): don't refresh token on init if token valid 2023-03-20 15:00:02 +08:00
cefec1a663 style: sort imports 2023-03-20 14:59:01 +08:00
e7ad830aa8 fix(cloudreve): captcha code ocr (#3889 close #3662) 2023-03-19 20:30:39 +08:00
b27eed265a fix(deps): update module github.com/blevesearch/bleve/v2 to v2.3.7 [skip ci] (#3874)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-19 20:26:48 +08:00
3abe26473c fix(trainbit): decode html code (#3883) 2023-03-19 15:25:06 +08:00
023107226c fix(trainbit): remove unnecessary operation (#3881) 2023-03-18 13:52:36 +08:00
8b109cfe40 fix(smb): byte alignment (close #3868) 2023-03-17 16:32:34 +08:00
b48e97d406 chore: fix release name [skip ci] 2023-03-16 22:47:01 +08:00
6c91cfeb90 chore(deps): update actions/setup-go action to v4 (#3858)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-16 18:28:51 +08:00
bfd1f25972 fix(deps): update module github.com/deckarep/golang-set/v2 to v2.3.0 [skip ci] (#3852)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-16 15:58:02 +08:00
8c0defce09 feat(task): add clear succeeded and retry (#3856 close #3776) 2023-03-16 15:56:27 +08:00
a1e88cfa05 fix(teambition): empty token for upload (close #3854) 2023-03-15 14:56:41 +08:00
443f5ffbcc feat(alias): auto flatten if only one root 2023-03-14 20:25:52 +08:00
b8bc94306d fix(alias): check obj exist for every storage (fix d9795ff) 2023-03-14 20:11:25 +08:00
d9795ff22f feat(alias): support proxy and direct together 2023-03-14 13:46:27 +08:00
c4108007cd fix: spaces in filename will be replaced with plus sign (#3841)
Co-authored-by: XZB <i@1248.ink>
2023-03-14 12:27:42 +08:00
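Commit c4108007cd above addresses spaces in filenames turning into plus signs. That is the classic difference between query escaping and path escaping in Go's net/url, shown below as a general illustration rather than the exact alist fix.

```go
package main

import (
	"fmt"
	"net/url"
)

func main() {
	name := "my file.txt"

	// QueryEscape is meant for query strings and encodes spaces as '+'.
	fmt.Println(url.QueryEscape(name)) // my+file.txt

	// PathEscape is meant for path segments and encodes spaces as %20,
	// which is what a filename in a URL path needs.
	fmt.Println(url.PathEscape(name)) // my%20file.txt

	// Decoding a path with PathUnescape keeps '+' literal instead of
	// turning it into a space.
	decoded, _ := url.PathUnescape("my%20file.txt")
	fmt.Println(decoded) // my file.txt
}
```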
f3db23a41e feat(qbittorrent): add offline download seed time (#3842 close #3588) 2023-03-14 12:13:23 +08:00
4741a75c92 feat(115): update upload api to v4.0 add pagesize option (#3840 close #3753) 2023-03-13 20:02:52 +08:00
301756ba03 feat(drivers): alias a new storage with multi path (close #3248) 2023-03-13 15:35:37 +08:00
3b2703a5e5 feat(drivers): add the support for Trainbit (#3813)
* feat: add the support for Trainbit
read only

* feat: add the support for Trainbit
modify the structure of code
allow to create folder, move, rename and remove

* feat: add the support for Trainbit
allow to upload file

* feat: add the support for Trainbit
get token from page

* feat: add the support for Trainbit
display progress of updating

* feat: add the support for Trainbit
fix bug of time zone

* feat: add the support for Trainbit
fix the bug of filename
2023-03-12 22:18:55 +08:00
2a601f06cb feat(drivers): add BaiduYun share link support (#3801)
Add support for mounting Baidu Netdisk share links
2023-03-12 14:00:11 +08:00
adc3a56552 feat(aliyundrive): make checksum cancellable (#3814) 2023-03-12 13:59:40 +08:00
4d9a29bddd feat(ftp): support seek/range request (#3811) 2023-03-11 21:02:47 +08:00
666e02f0c3 fix(storage): explicitly set storages' status to disabled (#3810) 2023-03-11 20:45:35 +08:00
6aaec19c1c feat: allow override startup command for Docker image (#3800)
This is to enable the use case where the stock Docker image is used with
different flags. E.g. `docker run xhofe/alist:latest ./alist server --data=mydata`

This was the behavior until PR#2818 changed it. This would make the image more usable.
2023-03-11 15:33:59 +08:00
1091e1b740 feat: file aggregation and regular rename api (#3788)
* Add a file aggregation API that moves all files under a given folder to a target folder.

* Add a regex-based file rename API.

---------

Co-authored-by: varg247 <varg247@qq.com>
2023-03-10 19:01:49 +08:00
d06c605421 fix: smb drive lastConnTime data race (#3787 close #3782) 2023-03-10 15:59:53 +08:00
43de823058 fix: path IsApply check (close #3784) 2023-03-09 21:03:56 +08:00
02d0aef611 feat(aliyundrive_open): add internal upload (aliyun ECS for Beijing area only) (#3775) 2023-03-09 20:48:30 +08:00
5596661ce8 feat(aliyundrive_open): optional delete file directly (close #3769) 2023-03-08 19:19:13 +08:00
2379cb8d67 style: go mod tidy 2023-03-08 19:08:11 +08:00
8c0ebe0841 revert: "fix(deps): update module gorm.io/gorm to v1.24.6 (#3684)" (close #3746)
This reverts commit c595fd7f94.
2023-03-08 19:07:04 +08:00
fd868bac84 fix(deps): update module github.com/caarlos0/env/v7 to v7.1.0 (#3763)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-08 16:21:43 +08:00
ebcbb29a0f feat: ping api (close #3752) 2023-03-07 19:05:52 +08:00
00ff0a43a7 feat(cmd): disable a storage with specific mountPath (close #3564) 2023-03-07 19:01:40 +08:00
3d3f23ec9e fix: upload check if disable sub folder (close #3741) 2023-03-07 14:13:39 +08:00
d484219c48 fix(security): compare auth token in constant time (#3740 close #3739) 2023-03-06 23:41:06 +08:00
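Commit d484219c48 above compares the auth token in constant time. The standard-library way to do that in Go is crypto/subtle; a small illustration follows (the helper name is an assumption).

```go
package main

import (
	"crypto/subtle"
	"fmt"
)

// equalToken compares two tokens without leaking where they differ via timing.
func equalToken(a, b string) bool {
	// ConstantTimeCompare returns 1 only for equal-length, equal-content inputs;
	// anything else fails the check.
	return subtle.ConstantTimeCompare([]byte(a), []byte(b)) == 1
}

func main() {
	fmt.Println(equalToken("alist-xxxx", "alist-xxxx")) // true
	fmt.Println(equalToken("alist-xxxx", "alist-yyyy")) // false
}
```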
dd4c97393e feat: show sso settings at a more reasonable sort (#3735) 2023-03-06 20:59:45 +08:00
07b8ff25a7 ci: auto release desktop 2023-03-06 18:05:57 +08:00
0d5c3c5080 fix(deps): update module github.com/deckarep/golang-set/v2 to v2.2.0 [skip ci] (#3727)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-06 17:54:17 +08:00
75b4429f73 feat(quark): enable NoOverwriteUpload (#3720) 2023-03-05 18:00:00 +08:00
34ef6bd18d feat(115): enable NoOverwriteUpload [skip ci] (close #3669) 2023-03-05 17:59:19 +08:00
c915313ec9 feat: rename then delete if storage doesn't support overwrite upload (close #3643) 2023-03-05 15:36:12 +08:00
12a095a1d6 fix: slice bounds out of range on CanAccess check 2023-03-05 15:29:53 +08:00
dc000f640a feat: optional log to std 2023-03-05 15:07:06 +08:00
aa1c5b2be3 fix(deps): update module golang.org/x/crypto to v0.7.0 [skip ci] (#3717)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-05 14:32:41 +08:00
1d4ec3c50d fix(deps): update module golang.org/x/net to v0.8.0 [skip ci] (#3715)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-05 14:32:26 +08:00
ebfeef52f4 fix(deps): update module golang.org/x/image to v0.6.0 [skip ci] (#3714)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-05 13:52:53 +08:00
c595fd7f94 fix(deps): update module gorm.io/gorm to v1.24.6 (#3684)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-02 19:15:50 +08:00
421052f88a fix(deps): update github.com/t3rm1n4l/go-mega digest to a01a2cd (#3665)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-03-02 19:03:38 +08:00
603681fbe6 feat: rebuild Single sign-on system (#3649 close #3571)
* rebuild single sign on system

* perf: use cache

* fix: codefactor check

---------

Co-authored-by: Andy Hsu <i@nn.ci>
2023-03-02 17:55:33 +08:00
f442185aa5 perf(123): optimize login error 2023-02-28 21:17:15 +08:00
ca9e739465 fix: hide apply to sub path without enable (close #3661) 2023-02-28 18:43:52 +08:00
53a1c4283b fix(baidu_netdisk): maybe optimize crack api (#3652)
Change the User-Agent to netdisk and remove origin=dlna (it is banned)
2023-02-28 18:27:07 +08:00
93dd768234 fix(webdav): disabled is not working in webdav (#3659)
A disabled user with webdav permission can use webdav normally, which is not allowed.
2023-02-28 18:26:13 +08:00
c9c4d6bc7e fix!(local): perm on mkdir (close #3626) 2023-02-26 21:25:32 +08:00
81e10f8939 ci: set prerelease before the build completes 2023-02-25 18:06:35 +08:00
4dd753de52 fix(aliyundrive_open): missed expire_sec while get link (close #3610) 2023-02-25 17:54:36 +08:00
79df63d319 chore(aliyundrive): change alert info 2023-02-25 14:28:27 +08:00
ec54831162 fix: only refresh token while do request (close #3591) 2023-02-24 20:31:12 +08:00
c8f3e8ab4d feat!: skip tls insecure verify by default 2023-02-23 22:33:54 +08:00
4be8524d80 feat: add alert for driver 2023-02-23 22:03:11 +08:00
0d3146b51d fix(webdav): disable put with empty path (close #3569) 2023-02-23 21:19:50 +08:00
f95d843969 feat(aliyundrive): add url_expire_sec for video preview (close #3522) 2023-02-23 20:50:31 +08:00
28aee8c493 feat: add aliyundrive open driver (#3437)
close #3533 
close #3521 
close #3459 
close #3375 

* feat: add aliyundrive open driver

* feat: adapt alist api

* fix: trailing spaces

* feat(aliyundrive_open): video preview api
2023-02-23 20:45:57 +08:00
de3ea82eb9 ci: add closeComment for stale 2023-02-22 22:17:33 +08:00
268ba3d069 fix(deps): update module github.com/gin-gonic/gin to v1.9.0 [skip ci] (#3551)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-22 21:24:35 +08:00
309d6558fb feat(local): add thumbnail for video with ffmpeg (#3556)
* feat(local): add ffmpeg

* fix: missed `+`

---------

Co-authored-by: Andy Hsu <i@nn.ci>
2023-02-22 21:19:42 +08:00
c08fdfc868 fix: missed assignment [skip ci] 2023-02-22 20:20:28 +08:00
1b28e6af3e ci: replace issues-helper with stale for inactive check 2023-02-22 20:07:18 +08:00
8655e33e60 fix: incorrect api if not set site_url (6c2f348) 2023-02-21 19:57:50 +08:00
50579fef84 fix: cancel api replace to avoid missing host 2023-02-21 19:45:09 +08:00
e39299bfe2 fix(local): missed type of MkdirPerm (923937b) 2023-02-21 17:45:15 +08:00
d1ab2443f1 feat(qbittorrent): delete tags when deleting qbittorrent tasks (#3546)
* feat & refactor(qbittorrent/client): support `deleteFiles` arg for `Client.Delete()` method

* feat(qbittorrent/client): also delete tags in `Client.Delete()`
2023-02-21 16:45:41 +08:00
658cf368bb fix(deps): update github.com/t3rm1n4l/go-mega digest to b87ebf5 (#3539)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-21 16:43:37 +08:00
fd36ce59f6 fix(onedrive): either id or path in parentReference must be specified (close #3028) 2023-02-21 16:19:46 +08:00
95b3b87672 feat(sftp): support range header 2023-02-20 16:57:52 +08:00
0d07d81802 feat(smb): support range header (close #3192) 2023-02-20 16:46:38 +08:00
923937b530 feat(local): custom mkdir perm (close #3196) 2023-02-20 16:20:36 +08:00
09492193c4 fix(alist_v3): api error pass (close #3326) 2023-02-20 16:15:52 +08:00
40b26a81a0 fix!: change default epub viewer (close #3519) 2023-02-20 16:08:10 +08:00
4293a0ba8c fix(deps): update module github.com/golang-jwt/jwt/v4 to v4.5.0 [skip ci] (#3525)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-20 16:06:35 +08:00
6c2f3486fc fix!: reverse proxy to sub-directory (#3483)
From this commit, if you want to reverse proxy to a sub-directory like `alist` with `nginx`, you need a config like:

```nginx
location /alist/ {
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header Host $http_host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header Range $http_range;
    proxy_set_header If-Range $http_if_range;
    proxy_redirect off;
    proxy_pass http://127.0.0.1:5244/alist/;
    # the max size of file to upload
    client_max_body_size 20000m;
}
```
2023-02-18 19:03:07 +08:00
3c7512f64a fix(qbittorrent): fix two file transferring related bugs [skip ci] (#3501)
* fix(qbittorrent): delete qbittorrent task before transferring

* fix(qbittorrent): parse the path correctly when the torrent contains folders
2023-02-18 18:54:51 +08:00
84219d3d70 fix(deps): update module gorm.io/driver/mysql to v1.4.7 [skip ci] (#3495)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-18 18:13:41 +08:00
05d3727335 fix(deps): update module golang.org/x/image to v0.5.0 [security skip ci] (#3489)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-18 18:13:22 +08:00
ee77c3b113 fix: friendly tip for initial logging in [skip ci] (#3406)
* refactor: friendly tip for initial logging in

* fix CodeFactor issue

more info pls refer to: https://segmentfault.com/a/1190000043031147
2023-02-18 17:53:11 +08:00
fcaf485e0b fix(deps): update module gorm.io/driver/postgres to v1.4.8 [skip ci] (#3496)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-18 17:52:03 +08:00
bd83469bb1 fix(deps): update module golang.org/x/net to v0.7.0 [security skip ci] (#3502)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-18 17:51:33 +08:00
90f111b24f docs: translate title [skip ci] (#3498)
* Update README_cn.md

* Update README_cn.md

---------

Co-authored-by: Andy Hsu <i@nn.ci>
2023-02-18 17:50:42 +08:00
7d1034c569 fix(aliyundrive): error occurred when running multiple instances at the same time (#3448)
* fix(aliyundrive):an error occurred when running multiple instances at the same time

* Update util.go

fix(aliyunpan):clear retry count
2023-02-16 22:12:19 +08:00
236c17176c fix(123): adapt new file list api (close #3464) 2023-02-16 22:09:45 +08:00
6ee4c10e8f chore(onedrive)!: change default redirect_uri [skip ci] 2023-02-16 21:37:20 +08:00
3798634028 fix(pikpak_share): change media url to content url (close #3273) (#3441) 2023-02-16 15:42:11 +08:00
567ba5ccd4 feat(aliyundrive_share): aliyun office preview (close #3408) 2023-02-15 16:52:24 +08:00
ae2ee1821a chore: change qBittorrent setting [skip ci] 2023-02-15 16:51:29 +08:00
805b1e4fa3 fix: different url encoding (close #3423) 2023-02-15 16:20:30 +08:00
d92c10da56 fix(qbittorrent): fix multiple bugs for qbittorrent download (close #3413 in #3427)
* fix(qbittorrent): wait for qbittorrent to parse torrent and create task

#3413

* fix(qbittorrent): check task state correctly

* fix(qbittorrent): fix path sent to `op.Put()`
2023-02-15 15:58:31 +08:00
6659f6d367 fix: windows arm64 build [skip ci] 2023-02-14 20:28:05 +08:00
fe416ba15c feat!: close sign_all by default 2023-02-14 19:20:15 +08:00
de66708b24 fix(aliyundrive): device session signature error (#3398)
* fix signature

* fix: indent-error-flow [skip ci]
2023-02-14 19:17:21 +08:00
2ca3e0b8bc fix(123): incorrect download url (close #3385) 2023-02-14 15:47:41 +08:00
ae04a0a760 chore: go mod tidy 2023-02-14 15:30:33 +08:00
c28168c970 feat: support qbittorrent (close #3087 in #3333)
* feat(qbittorrent): authorization and logging in support

* feat(qbittorrent/client): support `AddFromLink`

* refactor(qbittorrent/client): check authorization when getting a new client

* feat(qbittorrent/client): support `GetInfo`

* test(qbittorrent/client): update test cases

* feat(qbittorrent): init qbittorrent client on bootstrap

* feat(qbittorrent): support setting webui url via gin

* feat(qbittorrent/client): support deleting

* feat(qbittorrent/client): parse `TorrentStatus` enum when unmarshalling json in `GetInfo()`

* feat(qbittorrent/client): support getting files by id

* feat(qbittorrent): support adding qbittorrent tasks via gin

* refactor(qbittorrent/client): return a `Client` interface in `New()` instead of `*client`

* refactor: task handle

* chore: fix typo

* chore: change path

---------

Co-authored-by: Andy Hsu <i@nn.ci>
2023-02-14 15:20:45 +08:00
46b2ed2507 fix(aliyundriver):x-device-id error code (#3390)
* fix(aliyundriver):x-drvice-id error code

* fix(aliyunpan):session signature error

* fix typo

---------

Co-authored-by: Andy Hsu <i@nn.ci>
2023-02-14 14:11:07 +08:00
22843ffc70 fix(fs): copy file if symlink failed (#3368) 2023-02-13 14:41:35 +08:00
e1b6368343 feat(aliyundrive): zero copy for local file uploads (#3359) 2023-02-12 16:13:57 +08:00
62dae50d70 feat(fs): create symbolic link instead of copy local files (close #2186 in #3354) 2023-02-12 16:03:11 +08:00
43a8ed472b fix: can't login by github after disable guest (close #3314) 2023-02-09 20:12:04 +08:00
d87878c232 ci: cancel win/arm64 on dev build [skip ci] 2023-02-09 20:05:00 +08:00
ab7dee49b0 feat: add windows/arm64 target (close #3308) 2023-02-09 19:52:40 +08:00
dca115506d fix(deps): update module golang.org/x/crypto to v0.6.0 [skip ci] (#3315)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-09 16:17:10 +08:00
be17fba0c6 fix(deps): update module golang.org/x/net to v0.6.0 [skip ci] (#3316)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-09 16:16:43 +08:00
cd58aa5efe fix(deps): update module gorm.io/driver/mysql to v1.4.6 (#3311) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-09 16:00:08 +08:00
946833d2cc fix(deps): update module golang.org/x/image to v0.4.0 (#3323) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-09 15:59:31 +08:00
eb42d09849 chore(deps): update docker/build-push-action action to v4 (#3200)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-08 22:22:33 +08:00
9d00492750 fix(deps): update module gorm.io/driver/postgres to v1.4.7 (#3312) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-08 22:20:04 +08:00
b6711d6ab9 chore(deps): update actions-cool/issues-helper action to v3.4.0 (#3279) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-08 22:12:02 +08:00
7bc46de8aa feat: settings for tls insecure skip verify (close #3306 in #3307) 2023-02-08 22:01:26 +08:00
a4f4fb2d73 chore(deps): upgrade github.com/caarlos0/env 2023-02-07 19:55:55 +08:00
a181b56ea7 feat: optional forward direct link params (close #3123) 2023-02-07 16:39:14 +08:00
d0b743d955 fix(onedrive): downloadUrl missed on personal account (close #3276) 2023-02-07 16:16:29 +08:00
a985b748e9 fix: allow_indexed check (close #3291) 2023-02-07 15:14:39 +08:00
44cb8aaafe feat: only log to std on debug/dev mode 2023-02-05 09:17:37 +08:00
51f5d1b3c4 fix(local): set perm 0777 for folder (close #2996) 2023-02-04 12:11:13 +08:00
36e0d6f787 perf(onedrive): optimize request parameter (close #3178) 2023-02-04 11:53:13 +08:00
3d0065bdcf feat!: allow disable user (close #3241)
From this commit, the guest user will be disabled by default
2023-02-04 11:44:17 +08:00
7bf8071095 fix(deps): update module github.com/aws/aws-sdk-go to v1.44.194 (#2940)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-04 11:24:47 +08:00
30d39f8e10 fix(deps): update module gorm.io/gorm to v1.24.5 (#3231)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-02-04 11:22:39 +08:00
20d3ef7de6 fix(139): check http code & increase chunk size (#3224)
* fixed: large file uploads caused connection resets

Signed-off-by: aimuz <mr.imuz@gmail.com>

* revert Dockerfile

---------

Signed-off-by: aimuz <mr.imuz@gmail.com>
Co-authored-by: Andy Hsu <i@nn.ci>
2023-02-04 11:20:13 +08:00
86e5dae4d1 fix(aliyundrive_share): no permission after share_id change (#3246) 2023-02-04 11:10:28 +08:00
d89b1d4871 fix(baidu_baidu_netdisk): override for create (close #3242) 2023-02-03 18:10:39 +08:00
080e6fb22a fix(google_drive): allow download abuse file (#3217)
Add the parameter acknowledgeAbuse=true so that files flagged as potentially risky are downloaded directly
2023-02-01 19:43:36 +08:00
e1cd71616d feat(aliyundrive): internal upload (aliyun ECS for Beijing area only) (#3188)
Co-authored-by: wangwuxuan2011 <git@wangwuxuan.cn>
2023-01-30 11:18:08 +08:00
c92e11dad5 ci: auto build docker with aria2 2023-01-27 15:16:00 +08:00
b52e8747fa fix(alist_v3): incorrect dir on remove (close #3154) 2023-01-27 14:51:56 +08:00
14305748f0 fix(lanzou): files cannot be uploaded to the specified directory (#3157)
* Update driver.go

* fix(Lanzou):files cannot be uploaded to the specified directory

Solve the problem that files cannot be uploaded to the specified directory
2023-01-27 14:46:54 +08:00
44f8112e53 fix(s3): ignore current folder in contents (close #3137) 2023-01-25 19:58:00 +08:00
6a90b1d40a fix(deps): update module github.com/caarlos0/env/v6 to v7 (#3117)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-23 20:02:09 +08:00
b42ec3e810 fix: relative path judgment (close #3130) 2023-01-23 15:52:46 +08:00
28875ce304 fix(alist_v3): incorrect src_dir on move and copy (close #3121 pr #3124)
* fix(alist_v3):add dir check(close #3121)

* Update driver.go

Co-authored-by: Noah Hsu <i@nn.ci>
2023-01-22 18:52:54 +08:00
9b99e8ab70 fix(search): allow indexed check (close #3103) 2023-01-19 17:00:49 +08:00
98872a8fdb fix: cancel EXCLUSIVE mode on sqlite3
because it will result in failure to get admin's info
2023-01-19 16:49:43 +08:00
ce4a295008 fix!: check https with X-Forwarded-Proto
the old settings `api_url` and `base_path` are no longer read from this commit
2023-01-19 12:16:42 +08:00
bc1babb5b5 fix(lanzou): shortened filename when uploading files (#3099) 2023-01-19 12:05:14 +08:00
d61242d85d feat: add wma to default audio types (close #3088) 2023-01-18 10:50:28 +08:00
99d7105357 fix: move virtual files to end (close #3052) 2023-01-18 10:23:54 +08:00
be8a9c5f07 fix: mark progress as done after clear (#3086) 2023-01-18 09:39:32 +08:00
530e74c70b fix: avoid regular expression match current directory (#3078)
* fix: avoid regular expression match current directory

* fix: optimize and regexp exclude slash

Co-authored-by: wuxuan <refused@wuxuan.eu.org>
2023-01-17 21:54:25 +08:00
0a337756ba fix(quark): upload file integer divide by zero panic. (close #3076 pr #3077) 2023-01-17 18:02:06 +08:00
26fe0a7684 feat: customize index max depth
Because issues in some drivers may cause an infinite loop
2023-01-17 17:33:18 +08:00
9c7e451c03 perf: optimize sqlite3 (#3074)
- set journal mode to WAL
- set locking mode to EXCLUSIVE
- set auto vacuum

ref:
 - https://www.sqlite.org/pragma.html#pragma_journal_mode
 - https://www.sqlite.org/pragma.html#pragma_locking_mode
 - https://www.sqlite.org/pragma.html#pragma_auto_vacuum
2023-01-17 17:06:11 +08:00
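Commit 9c7e451c03 above tunes sqlite3 with the pragmas listed in its message. A hedged sketch of applying them through gorm's sqlite driver; the DSN and pragma values follow the commit message rather than alist's exact code, and note that a later commit in this log (98872a8fdb) backs off the EXCLUSIVE locking mode.

```go
package main

import (
	"gorm.io/driver/sqlite"
	"gorm.io/gorm"
)

func main() {
	db, err := gorm.Open(sqlite.Open("data.db"), &gorm.Config{})
	if err != nil {
		panic(err)
	}

	// Pragmas named in the commit message; exact values are assumptions.
	db.Exec("PRAGMA journal_mode = WAL")
	db.Exec("PRAGMA locking_mode = EXCLUSIVE")
	// auto_vacuum only takes effect on a new database or after VACUUM.
	db.Exec("PRAGMA auto_vacuum = INCREMENTAL")
	db.Exec("VACUUM")
}
```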
8df1455f25 workflow: add tips for Reproduction 2023-01-17 16:34:56 +08:00
9d9377f65d fix(local): incorrect path of thumbnail (for 6453ae0) 2023-01-16 20:02:30 +08:00
8b523fab8b revert: add Getter interface back 2023-01-16 19:55:43 +08:00
6453ae0968 fix(search): empty parent where update (close #2810) 2023-01-16 17:33:24 +08:00
1cfd47a258 feat: install tzdata in the docker image (#3056)
* disable caching of repository metadata and installation of tzdata

* add TZ variable example
2023-01-16 13:43:15 +08:00
8e2069c554 fix: db non full-text import error (#3055) 2023-01-15 23:49:23 +08:00
6b8778a63c fix: don't save if refresh token is empty (close #2957) 2023-01-14 20:33:07 +08:00
aaa8c440fe fix(seafile): token refresh (#3010)
* docs: add Seafile support

* fix: Seafile token refresh
2023-01-13 21:20:21 +08:00
2dc5dec83c feat: add Cloudreve driver (close #2658 in #2997)
* feat: add cloudreve support

add cloudreve support

(#2658)

* docs(README): add suppuort cloudreve

* fix(cloudreve): add cookie refresh

Co-authored-by: panici <zhangjun@zjdeMacBook-Pro.local>
2023-01-12 19:57:43 +08:00
1eca2b83ed perf(terabox): optimize prompt message (#3002)
* perf(terabox):prompt login status when init the driver

* docs:add Terabox

* perf(terabox):prompt area is not available

* style(terabox): del else
2023-01-12 19:40:38 +08:00
48e6f3bb23 feat: add Seafile driver (#2964)
* feat: add Seafile driver

* docs: add Seafile support

* refactor: optimization

* fix: close redirect on `move` and `rename`

Co-authored-by: Noah Hsu <i@nn.ci>
2023-01-10 20:51:42 +08:00
0ad9e17196 feat: lazy index creation on searcher init (#2962) 2023-01-09 14:09:21 +08:00
9398cdaac1 fix(s3): allow http/https headers to be attached from CustomHost (#2959)
* add(s3):Allow http/https headers to be attached to CustomHost

* optimize

Co-authored-by: wangwuxuan <wangwuxuan@163.com>
Co-authored-by: Noah Hsu <i@nn.ci>
2023-01-08 21:47:45 +08:00
2f19d4a834 perf(lanzou): optimize the use of list cache (#2956)
* fix:local sort not cache

* perf(lanzou): Optimize the use of list cache
2023-01-08 21:31:35 +08:00
99a186d01b fix(139): upload failed (#2950)
fix: files exceeding the size limit could not be uploaded
fix: signature failed when the file name contained special characters
improve: optimize memory usage
Signed-off-by: aimuz <mr.imuz@gmail.com>

Signed-off-by: aimuz <mr.imuz@gmail.com>
2023-01-08 16:31:00 +08:00
40ef233d24 fix(USS): resolve driver problem (#2942)
* remove:"Endpoint" and "CustomHost" are the same thing, remove "CustomHost"

* fix: file download url error

* fix: too many file get list error

Co-authored-by: wangwuxuan <wangwuxuan@163.com>
2023-01-08 16:30:05 +08:00
7c3ea193ff fix(lanzou):webdav unable to download and upload (close #2700)
* fix(lanzou):Unable to get folder

* fix(lanzou):webdav unable to download and upload. (close 2700)
2023-01-08 15:37:39 +08:00
7902b646ff feat: add database non full text index (close #2916) 2023-01-07 01:40:49 +08:00
1c453ae147 feat: add a switch to enable auto update index (close #2930) 2023-01-07 00:59:30 +08:00
cf5714ba73 fix(smb): use correct path (#2933)
There is no need to add a `.` prefix as there is no leading `/` in paths
2023-01-07 00:47:08 +08:00
d655340634 fix(lanzou): cookie type failed to get file (#2926) 2023-01-06 18:08:40 +08:00
8d4ac031c3 chore(deps): update module github.com/aws/aws-sdk-go to v1.44.174 [skip ci] (#2920)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-06 15:36:33 +08:00
a1ded3a339 refactor(baidu_photo): optimize code (close #2911 pr #2924) 2023-01-06 15:36:05 +08:00
4a0e47dbac fmt: go mod tidy 2023-01-05 19:34:18 +08:00
510d266da8 chore(deps): update module github.com/aws/aws-sdk-go to v1.44.173 [skip ci] (#2832)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 19:32:58 +08:00
35dfb36884 chore(deps): update module gorm.io/driver/mysql to v1.4.5 [skip ci] (#2881)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 19:31:47 +08:00
b88f4d2ba6 chore(deps): update module gorm.io/driver/sqlite to v1.4.4 [skip ci] (#2869)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 19:31:28 +08:00
50318da879 chore(deps): update module gorm.io/driver/postgres to v1.4.6 [skip ci] (#2867)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 19:18:42 +08:00
575487a0e2 chore(deps): update module gorm.io/gorm to v1.24.3 [skip ci] (#2870)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 19:18:15 +08:00
69d3ccaed2 chore(deps): update module golang.org/x/net to v0.5.0 [skip ci] (#2908)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 19:17:41 +08:00
170859a112 chore(deps): update module golang.org/x/crypto to v0.5.0 [skip ci] (#2905)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 19:16:56 +08:00
7fdcb106a5 chore(deps): update module golang.org/x/image to v0.3.0 [skip ci] (#2906)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-01-05 17:49:45 +08:00
14d4ddb752 fix(mysql): change mysql against mode (close #2903 close #2844 pr #2904) 2023-01-05 17:11:58 +08:00
428e59a844 fix(uss): close of closed channel (close #2847 #2896)
* fix(uss): close of closed channel

* fix(uss): close of closed channel

Co-authored-by: zxdstyle <xiangdong.zhu@maitang001.com>
2023-01-04 21:43:47 +08:00
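Commit 428e59a844 above fixes a "close of closed channel" panic. A common Go pattern to guard against double close is sync.Once, shown below as a general illustration rather than the actual uss driver fix.

```go
package main

import (
	"fmt"
	"sync"
)

// safeChan wraps a channel so Close can be called from multiple goroutines
// without panicking on a second close.
type safeChan struct {
	ch   chan struct{}
	once sync.Once
}

func newSafeChan() *safeChan {
	return &safeChan{ch: make(chan struct{})}
}

func (s *safeChan) Close() {
	s.once.Do(func() { close(s.ch) })
}

func main() {
	s := newSafeChan()
	var wg sync.WaitGroup
	for i := 0; i < 3; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			s.Close() // safe to call repeatedly
		}()
	}
	wg.Wait()
	_, open := <-s.ch
	fmt.Println("channel open:", open) // false: closed exactly once
}
```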
1c8d895fc0 feat(terabox): add terabox driver (close #2825 close #2678 #2849) 2022-12-31 16:44:20 +08:00
fbf3fb825b fix(baidu_netdisk): file copy and file upload [skip ci] (#2848) 2022-12-31 16:43:22 +08:00
16e07ae016 fix(s3): set default root path (close #2834) 2022-12-30 14:53:01 +08:00
d1b9db38c7 feat(docker): add docker-compose file (close #2067) 2022-12-30 14:25:22 +08:00
395f0fc5f3 fix(docker): use root user as default 2022-12-30 14:21:39 +08:00
143e4cd077 fix: mysql FULLTEXT search (#2840) 2022-12-30 14:20:04 +08:00
f777a2fab4 fix: version doesn't update 2022-12-30 01:24:37 +08:00
dad3012ec3 fix(deps): update module github.com/aws/aws-sdk-go to v1.44.169 (#2816)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-29 21:22:50 +08:00
d45209edb2 fix: /entrypoint.sh permission denied 2022-12-29 17:16:30 +08:00
e89489453d fix: cache nil value for meta 2022-12-28 17:44:34 +08:00
ed6c8194a7 feat: add PUID, PGID, Umask settings to docker image (close #2525 pr #2818)
Co-authored-by: DDSRem <1448139087@qq.com>
2022-12-28 17:18:27 +08:00
83fe17c6ec feat: support github login (#2639)
* Support Github Login

* improve according to codefactor

* fix due to last updates

* optimization

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-27 22:11:22 +08:00
c00dcc8f39 fix(deps): update module github.com/gin-gonic/gin to v1.8.2 (#2785)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-25 18:20:24 +08:00
e118f4a3b9 feat: update index by req.Paths 2022-12-24 20:23:04 +08:00
5e28d0f96a fix(deps): update module github.com/aws/aws-sdk-go to v1.44.167 (#2781)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-24 16:14:20 +08:00
3af23f6792 feat: batch reload all storages (close #2762 pr #2775) 2022-12-21 19:21:18 +08:00
3a41b929c9 fix: pgsql search [skip ci] (close #2761 pr #2774) 2022-12-21 19:19:37 +08:00
105f22969c feat: support cancel for some drivers (close #2717) 2022-12-21 15:03:09 +08:00
e4a88a7c13 fix(deps): update module github.com/aws/aws-sdk-go to v1.44.164 (#2773)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-21 12:04:32 +08:00
b0255040c6 chore: fix typo 2022-12-20 20:07:19 +08:00
f1e842e12a feat: customize settings layout (close #2765) 2022-12-20 20:04:37 +08:00
d756cf3e9f fix(local): disable copying or moving to subfolders (close #2760) 2022-12-20 16:27:04 +08:00
146619134d feat: customize proxy ignore headers (close #2763 pr #2766)
* clean referer when use proxy

* feat: customize proxy ignore headers

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-20 16:08:32 +08:00
372030071e fix(deps): update module github.com/aws/aws-sdk-go to v1.44.163 [skip ci] (#2738)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-20 15:13:14 +08:00
62a06fa0f9 feat: optimize file operation interface (#2757)
* feat: optimize file operation interface

* chore: fix typo

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-20 15:02:40 +08:00
e2bcca2fbd feat: static files for embed viewers (#2739) 2022-12-19 13:34:06 +08:00
4568af9542 feat: better static file Cache-Control (#2751) 2022-12-19 13:32:00 +08:00
b50d486a63 fix: sub path check if subPath = / 2022-12-18 21:28:38 +08:00
0ae3fc608b feat: export all cmd (#2746) 2022-12-18 19:53:39 +08:00
6024e8d832 refactor: split the db package hook and cache to the op package (#2747)
* refactor: separate the setting method from the db package into the op package and add caching

* refactor: separate the meta method from the db package into the op package

* fix: settings not loading database data

* refactor: separate the user method from the db package into the op package

* refactor:remove user JoinPath error

* fix: op package user cache

* refactor: fs package list method

* fix: tile virtual paths (close #2743)

* Revert "refactor:remove user JoinPath error"

This reverts commit 4e20daaf9e700da047000d4fd4900abbe05c3848.

* cleaning the path directly may lead to unknown behavior

* fix: the path of the meta passed in must be a prefix of reqPath

* chore: rename all virtualPath to mountPath

* fix: `getStoragesByPath` and `GetStorageVirtualFilesByPath`

is_sub_path:

/a/b isn't a subpath of /a/bc

* fix: don't save setting if hook error

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-18 19:51:20 +08:00
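The `is_sub_path` note in #2747 above describes the fix for `getStoragesByPath` and `GetStorageVirtualFilesByPath`: `/a/b` must not be treated as the parent of `/a/bc`. A minimal Go sketch of such a check; the function name `IsSubPath` and the `path.Clean` call are illustrative assumptions, not the project's actual code:

```go
package main

import (
	"fmt"
	"path"
	"strings"
)

// IsSubPath reports whether sub lies under parent. Comparing against
// parent+"/" prevents "/a/bc" from being treated as a child of "/a/b".
func IsSubPath(parent, sub string) bool {
	parent, sub = path.Clean(parent), path.Clean(sub)
	if parent == "/" {
		return true
	}
	return sub == parent || strings.HasPrefix(sub, parent+"/")
}

func main() {
	fmt.Println(IsSubPath("/a/b", "/a/b/c")) // true
	fmt.Println(IsSubPath("/a/b", "/a/bc"))  // false
}
```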
f38f4f401b fix(139): modify chunk size to avoid large file upload failure (close #2744 close #2682 pr #2745) 2022-12-18 17:48:09 +08:00
3b2ae85009 chore: only ignore root dirs (#2741) 2022-12-18 16:48:32 +08:00
faf4150d1e docs: fix badges on README.md and README_cn.md [skip ci] (#2749) 2022-12-18 16:48:03 +08:00
fb64f00640 refactor: obj name mapping and internal path processing (#2733)
* refactor: prepare to remove the get interface

* feat: add obj Unwrap interface

* refactor: obj name mapping and program internal path processing

* chore: fix typo

* feat: unwrap get

* fix: don't use op.Get to get parent id

* fix: set the path uniformly

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-17 19:49:05 +08:00
3d336b328a feat: add pikpak share driver (close #2728 pr #2731) 2022-12-16 19:10:19 +08:00
f9cf29e0b6 fix(deps): update module golang.org/x/crypto to v0.4.0 (#2638) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-16 19:08:52 +08:00
cbd038f30f fix(deps): update module golang.org/x/net to v0.4.0 (#2608) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-16 19:05:20 +08:00
2aeb75a779 fix(deps): update module github.com/blevesearch/bleve/v2 to v2.3.6 (#2727) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-16 19:05:06 +08:00
2f8eaf6bea fix(deps): update module github.com/pquerna/otp to v1.4.0 (#2708) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-16 18:15:59 +08:00
fb7a5dec1b fix(deps): update module golang.org/x/image to v0.2.0 (#2601) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-16 18:15:19 +08:00
e61bac039a fix(deps): update module github.com/aws/aws-sdk-go to v1.44.161 (#2595) [skip ci]
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-16 18:14:56 +08:00
b3be9ef428 feat(search): use FULLTEXT index (close #2716 pr #2726) 2022-12-16 16:51:36 +08:00
5a6b600ace feat: show gorm log on debug/dev mode (#2720) 2022-12-15 17:48:52 +08:00
e58ca686e3 feat: cache static files (#2715) 2022-12-15 17:48:29 +08:00
6f4b1ba4b3 feat: log to stdout & file (#2709) 2022-12-14 13:19:08 +08:00
cdc45630ae fix: whereInParent when parent = "/" (#2706) 2022-12-14 10:37:09 +08:00
7947ff1ae4 feat: limit max connection count (#2701) 2022-12-14 10:33:58 +08:00
33bae52fa1 refactor: optimize driver initialization so values no longer need to be manually deserialized and assigned, and remove redundant driver registration parameters (#2691)
* refactor: optimize driver initialization so values no longer need to be manually deserialized and assigned, and remove redundant driver registration parameters

* fix typo

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-13 18:03:30 +08:00
3ee45c69a7 fix(baidu_netdisk): encode path for create (close #2690) 2022-12-13 17:57:41 +08:00
179d285564 feat: optimize database search (#2687)
* feat: remove index on `SearchNode.Name`

As we do not use `s%`-style prefix matching on the name column, an index there does not help

* fix: init index after init data

Or on the first run, it will log 'init index error: readObjectStart: expect { or n, but found , error found in #0 byte of ...||..., bigger context ...||...'

* fix: match parent more precisely

Originally it would match `/a/bc` when searching in `/a/b`.
The fix adds a suffix `/` to all data in the parent field,
which is not backward compatible.
2022-12-12 20:20:01 +08:00
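For the parent-matching fix in #2687 above, a minimal sketch of how a precise directory filter might be built for the database search, anchoring the `LIKE` prefix pattern with a trailing `/`; the column name `parent` and the helper are illustrative assumptions, not the project's actual query:

```go
package main

import "fmt"

// buildParentFilter returns a SQL condition and arguments matching
// results whose parent directory lies inside dir. Anchoring the LIKE
// pattern with a trailing "/" keeps "/a/bc" out of a search rooted at
// "/a/b".
func buildParentFilter(dir string) (string, []interface{}) {
	if dir == "/" {
		return "1 = 1", nil
	}
	return "parent = ? OR parent LIKE ?", []interface{}{dir, dir + "/%"}
}

func main() {
	cond, args := buildParentFilter("/a/b")
	fmt.Println(cond, args) // parent = ? OR parent LIKE ? [/a/b /a/b/%]
}
```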
a2e8e96c71 feat: respond static file on loading storages (#2686) 2022-12-12 20:17:58 +08:00
5043815d48 fix(search): don't delete virtual folder while update indexes (close #2677) 2022-12-11 14:59:58 +08:00
1640f06e13 feat(search): multiple keywords split by space (#2669) 2022-12-10 19:28:34 +08:00
62ea93837c feat: alist v3 index permission (#2653)
* feat: alist v3 index permission

* fix allowIndexed check

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-10 19:03:09 +08:00
446f82888c fix(local): add sign to thumbnail (close #2536 close #2650) 2022-12-09 10:08:31 +08:00
6f1aeb47fd feat: index enhancement (close #2632 pr #2636)
* feat: index paths as setting

* feat: clear index (#2632)

* feat: check indexMQ more frequently
2022-12-09 10:02:13 +08:00
1f7c1b4f43 fix(cors): allow all methods (close #2640) 2022-12-08 11:35:21 +08:00
3fa0217c4b feat(alist-v3): support write (close #2626 pr #2635) 2022-12-07 19:02:28 +08:00
2dd30f2b77 feat(search): support with password 2022-12-07 10:45:02 +08:00
6e23c8b4c0 feat: partial update index (close #2593 close #2621 pr #2624) 2022-12-07 10:41:52 +08:00
72aa63adce fix: skip virtual driver on building index (close #2604 pr #2617) 2022-12-06 20:43:32 +08:00
e65e8be59e fix(search): missed base_path of user for parent (close #2611) 2022-12-06 17:28:39 +08:00
7aa4dfb240 feat: use natural sort in SortFiles (#2612) 2022-12-06 17:28:18 +08:00
bd324233a0 fix: can't paste image while report bug (#2597) [skip ci] 2022-12-06 09:19:49 +08:00
f1a9b68022 fix(index): update indexes in database 2022-12-05 20:23:37 +08:00
dda1da4576 fix(index): nil pointer call 2022-12-05 20:22:35 +08:00
5b7aa9c1cf feat: allow all cors headers (close #2571) 2022-12-05 20:05:20 +08:00
a28aaceaad chore(ci): only build on main branch 2022-12-05 19:52:02 +08:00
2bb200af87 fix(deps): update modules by renovate[bot]
fix(deps): update module github.com/sheltonzhu/115driver to v1.0.13 (#2413) [skip ci]

fix(deps): update module github.com/golang-jwt/jwt/v4 to v4.4.3 (#2526) [skip ci]

fix(deps): update module golang.org/x/image to v0.1.0 (#2587) [skip ci]

chore: go mod tidy
Co-Authored-By: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-12-05 19:50:49 +08:00
97f1efbb72 feat!: disable --force-bin-dir if --data is abs
related issues: #2580 #2542

after this commit, `--force-bin-dir` takes no effect if `--data` is an absolute path
2022-12-05 18:32:48 +08:00
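A minimal sketch of the resolution rule described above: when `--data` is an absolute path, `--force-bin-dir` is ignored. The function and variable names are illustrative assumptions, not the actual CLI code:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// dataDir resolves the effective data directory: an absolute --data
// path wins outright, so --force-bin-dir is ignored in that case.
func dataDir(dataFlag string, forceBinDir bool) string {
	if filepath.IsAbs(dataFlag) {
		return dataFlag
	}
	if forceBinDir {
		if exe, err := os.Executable(); err == nil {
			return filepath.Join(filepath.Dir(exe), dataFlag)
		}
	}
	return dataFlag
}

func main() {
	fmt.Println(dataDir("/opt/alist/data", true)) // absolute path wins
	fmt.Println(dataDir("data", false))           // data
}
```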
bf8b6f4c2c feat: customize ignore paths of indexes 2022-12-05 16:45:11 +08:00
bd33c200dc feat: optimize index build 2022-12-05 16:07:36 +08:00
bc6baf1be0 fix(ci): sort lang json file 2022-12-05 14:40:46 +08:00
dc8d5106f9 feat: auto fix address in alist & smb storages (#2582) 2022-12-05 13:31:34 +08:00
8c0dfe2f3d feat: Search enhancement (#2562)
* feat: ignore AList storage on indexing

* fix: remove unused err in `walkFn`

* chore(ci): fix auto_lang trigger and run it

* feat: batch index

* feat: quit index & init index

* feat: set DocType for bleve data

* fix: build index cleanup check origin err
2022-12-05 13:28:39 +08:00
4e1be9bee6 fix: async init aria2 to optimize start duration 2022-12-04 00:00:40 +08:00
4c5285e094 chore(ci): format lang file (#2558) 2022-12-03 12:19:10 +08:00
0838feeb82 fix: introduce buffered response writer for webdav, fix status/error return failures (#2544)
* fix: introduce buffered response writer for webdav, fix webdav status/error return failures

* fix: bypass buffered writer for GET/HEAD/POST requests
2022-12-02 17:59:59 +08:00
ae791c8634 fix: hide check in canAccess (#2556)
Fix the case where meta.Password and meta.Hide are both empty, which caused access to be denied
2022-12-02 17:44:29 +08:00
09f480318c fix: unify settings string (#2555) 2022-12-02 17:42:42 +08:00
4c5be5f07f feat: only show CanAccess search results (#2548)
* feat: only show `CanAccess` search results

* have done in frontend

Co-authored-by: Noah Hsu <i@nn.ci>
2022-12-02 10:09:39 +08:00
9c1ffdbb82 fix(aliyundrive): return error if got wrong http code (#2543) 2022-12-01 21:48:19 +08:00
18a63e34dd fix(task): memory alignment for curID (close #2541) 2022-12-01 13:16:31 +08:00
ff0bcfef8a feat: optional sign all files 2022-11-30 22:10:07 +08:00
4980b71ba3 fix: add hide check to canAccess (close #2532) 2022-11-30 22:01:33 +08:00
b5bf5f4325 fix: check if the req path is relative path (close #2531) 2022-11-30 21:38:00 +08:00
f9788ea7cf feat(webdav): delete privacy header and optimize 302 (#2534)
* fix: delete set-cookie from sharepoint webdav response header

* fix: avoid two redirects when using webdav

* fix: return the correct Content-Type instead of just `application/octet-stream`

* feat: webdav backend localOnly -> proxyOnly
2022-11-30 20:52:33 +08:00
83644dab85 fix: mapping filename in GetName
some missed filename mapping
2022-11-30 20:46:54 +08:00
d94cf72da2 fix(local): webp image decode while generate thumbnail (close #2484 pr #2520)
* Fix webp thumbnails in local storage not loading

* Simplify code

Co-authored-by: Noah Hsu <i@nn.ci>
2022-11-29 09:47:40 +08:00
e98561ceb1 fix: filename char mapping while build index 2022-11-28 21:08:11 +08:00
76f37373e0 fix: settings map read and write concurrently 2022-11-28 16:54:03 +08:00
61a06992c3 fix(aria2): directory missing (close #1856 pr #2504) 2022-11-28 14:05:28 +08:00
ddcba93eea feat: multiple search indexes (#2514)
* refactor: abstract search interface

* wip: ~

* fix cycle import

* objs update hook

* wip: ~

* Delete search/none

* auto update index while cache changed

* db searcher

TODO: bleve init issue

cannot open index, metadata missing

* fix size type

why float64??

* fix typo

* fix nil pointer using

* api adapt ui

* bleve: fix clear & change struct
2022-11-28 13:45:25 +08:00
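The #2514 entry above abstracts a search interface so that the database-backed and the bleve-backed indexes can be swapped behind one API. A minimal sketch of that shape, with assumed method names and a toy in-memory backend for illustration only:

```go
package main

import (
	"context"
	"fmt"
	"strings"
)

// SearchNode is an illustrative index entry; the project's real model has more fields.
type SearchNode struct {
	Parent string
	Name   string
}

// Searcher sketches one interface that both a database searcher and a
// bleve searcher could satisfy. Method names here are assumptions.
type Searcher interface {
	Index(ctx context.Context, node SearchNode) error
	Search(ctx context.Context, keyword string) ([]SearchNode, error)
	Clear(ctx context.Context) error
}

// memSearcher is a toy in-memory implementation, used only to show that
// callers do not depend on which backend is plugged in.
type memSearcher struct{ nodes []SearchNode }

func (m *memSearcher) Index(_ context.Context, n SearchNode) error {
	m.nodes = append(m.nodes, n)
	return nil
}

func (m *memSearcher) Search(_ context.Context, keyword string) ([]SearchNode, error) {
	var out []SearchNode
	for _, n := range m.nodes {
		if strings.Contains(n.Name, keyword) {
			out = append(out, n)
		}
	}
	return out, nil
}

func (m *memSearcher) Clear(_ context.Context) error {
	m.nodes = nil
	return nil
}

func main() {
	var s Searcher = &memSearcher{}
	_ = s.Index(context.Background(), SearchNode{Parent: "/docs", Name: "report.pdf"})
	res, _ := s.Search(context.Background(), "report")
	fmt.Println(res)
}
```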
bb969d8dc6 fix(aliyundrive_share): get share link download url directly (close #2472) 2022-11-24 18:50:04 +08:00
2383e851e2 fix: reset index before build new one (#2471) 2022-11-24 14:47:49 +08:00
330a767fd7 feat: build index & search with bleve (close #1740 pr #2386)
* feat: build index & search with bleve (#1740)

* delete unused struct

Co-authored-by: Noah Hsu <i@nn.ci>
2022-11-24 11:46:47 +08:00
2b902de6fd fix(build): switch to crazymax/xgo 2022-11-22 21:08:27 +08:00
85e1350af8 fix: check password while upload (close #2444) 2022-11-22 16:14:01 +08:00
c09800790b feat: custom filename char mapping
fixes #2447 #2446 #2440 #2409 #2006 #1979 #1507 #324 #691 #518 #430
2022-11-22 15:54:18 +08:00
25fd343069 chore(deps): update module gorm and aws-sdk
fix(deps): update module gorm.io/gorm to v1.24.2 (#2436)

fix(deps): update module github.com/aws/aws-sdk-go to v1.44.142 (#2407)

Co-Authored-By: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-11-21 17:37:03 +08:00
518487e3df fix(123): optimize error messages (#2415) 2022-11-19 21:48:03 +08:00
a02d9c8463 fix: check error type on file not found (#2383) 2022-11-18 01:30:37 +08:00
8beeba7c0c fix(google_drive): check token before return link (close #2392) 2022-11-17 09:08:31 +08:00
50fb49f0c3 fix(deps): update dependencies by renovate[bot] (#2344)
chore(deps): add renovate.json (#2344)

fix(deps): update module github.com/aws/aws-sdk-go to v1.44.137 (#2345)

chore(deps): update actions-cool/issues-helper action to v2.5.0 (#2346)

fix(deps): update module github.com/caarlos0/env/v6 to v6.10.1 (#2348)

fix(deps): update module github.com/gin-contrib/cors to v1.4.0 (#2349)

fix(deps): update module github.com/sirupsen/logrus to v1.9.0 (#2354) [skip ci]

fix(deps): update module gorm.io/driver/postgres to v1.4.5 (#2361)  [skip ci]

fix(deps): update module golang.org/x/crypto to v0.2.0 (#2357) [skip ci]

fix(deps): update module github.com/aws/aws-sdk-go to v1.44.138 (#2358) [skip ci]

fix(deps): update module gorm.io/gorm to v1.24.1 (#2366) [skip ci]

fix(deps): update module gorm.io/driver/mysql to v1.4.4 (#2360) [skip ci]

fix(deps): update module github.com/spf13/cobra to v1.6.1 (#2356) [skip ci]

chore(deps): update actions-cool/issues-helper action to v3 (#2367) [skip ci]

fix(deps): update module gorm.io/driver/sqlite to v1.4.3 (#2365) [skip ci]

chore(deps): update actions/checkout action to v3 (#2368) [skip ci]

chore(deps): update actions/setup-go action to v3 (#2374) [skip ci]

chore(deps): update actions/upload-artifact action to v3 (#2375) [skip ci]

chore(deps): update docker/build-push-action action to v3 (#2377) [skip ci]

chore(deps): update docker/login-action action to v2 (#2378) [skip ci]

chore(deps): update docker/metadata-action action to v4 (#2381) [skip ci]

chore(deps): update docker/setup-buildx-action action to v2 (#2382) [skip ci]

chore(deps): update docker/setup-qemu-action action to v2 (#2387) [skip ci]

fix(deps): update module github.com/aws/aws-sdk-go to v1.44.139 (#2394) [skip ci]

fix(deps): update module golang.org/x/crypto to v0.3.0 (#2395) [skip ci]

Co-Authored-By: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2022-11-17 08:49:15 +08:00
4dcaa24758 fix: cache is modified while sorting (close #2340) 2022-11-15 14:38:23 +08:00
3fbdf6f022 fix: resolve import cycle in alist v3 driver (close #2337 pr #2338) 2022-11-15 10:51:32 +08:00
aa9ba289bb fix(123): overwrite upload if file has no change (close #2324) 2022-11-14 17:58:49 +08:00
3b6d8987db chore: add id to resp of create storage 2022-11-13 20:17:10 +08:00
6e3df9f847 fix(google_drive): type of chunk_size (close #2303) 2022-11-12 18:46:38 +08:00
efe0e6af22 feat: silent start, stop and restart 2022-11-11 18:42:06 +08:00
00de9bf16d fix!: sign with the raw path instead of filename (#2258) 2022-11-11 16:24:25 +08:00
1743110a70 fix(123): incorrect order_by (close #2285) 2022-11-10 21:47:13 +08:00
0352a8e028 feat: add alist v2 driver (#2281) 2022-11-10 17:42:12 +08:00
c601bb794b feat(123): support mail login (close #2218 pr #2276) 2022-11-10 09:34:48 +08:00
42865486f1 fix(local): deal with relative symlink dir (#2274) 2022-11-09 18:15:42 +08:00
44f5cf40ef fix(115): update 115 driver lib to fix some bugs (#2275)
* fix duplicate cookies in client.List func
* rm useless cookie when init
2022-11-09 18:15:06 +08:00
c3ab378ac5 feat(google_drive): support shortcut (close #2268) 2022-11-09 16:19:33 +08:00
cdcbfb24c4 fix(local): directory handle (#2262)
* fix(local): check symlink dir

* fix(local): set size of dir to 0 (close #2264)
2022-11-09 11:20:09 +08:00
e05e2fd663 chore: change default timeout (close #2252) 2022-11-08 20:37:42 +08:00
6639cab1ae feat(google_drive): chunk upload (close #2241) 2022-11-07 20:58:52 +08:00
8241f0999a feat(google_drive): override upload (close #1880) 2022-11-07 20:35:35 +08:00
f3a5e3702d fix(189): force use CN time zone (close #2240) 2022-11-07 16:37:47 +08:00
46701a176d feat(aria2): mark aria2 seeding as complete (#2223)
Currently, when using aria2 to download a torrent file, it does not
consider seeding + active as completed, so the torrent download task
only completes once aria2 stops seeding.

This commit uses the seeder property of TaskInfo and marks tasks with
active status and a true seeder as complete.
2022-11-06 16:20:09 +08:00
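A minimal sketch of the completion check described in #2223 above, assuming the aria2 tellStatus fields `status` and `seeder` are exposed as strings; the struct and function names here are illustrative, not the project's actual code:

```go
package main

import "fmt"

// taskInfo is a trimmed-down, hypothetical view of an aria2 tellStatus
// result; the real RPC struct has more fields.
type taskInfo struct {
	Status string // "active", "waiting", "paused", "error", "complete", "removed"
	Seeder string // "true" once a BitTorrent download has finished and is seeding
}

// isDownloadComplete treats a torrent that is still active but already
// seeding as finished, so the transfer step does not wait for aria2 to
// stop seeding.
func isDownloadComplete(t taskInfo) bool {
	return t.Status == "complete" || (t.Status == "active" && t.Seeder == "true")
}

func main() {
	fmt.Println(isDownloadComplete(taskInfo{Status: "active", Seeder: "true"}))  // true
	fmt.Println(isDownloadComplete(taskInfo{Status: "active", Seeder: "false"})) // false
}
```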
26a29f20c3 fix: missed encode path while use down proxy (close #2208) 2022-11-06 14:46:47 +08:00
18cd45d257 fix: disable cache for 302 redirect (close #2216) 2022-11-05 15:54:51 +08:00
f0a533a77a feat(115): put UA as a variable (#2217)
In special cases, developers can pass in a custom UA to work around the speed limit problem.
This is mainly for developers calling from outside.
2022-11-05 15:50:57 +08:00
619a9aeb6c feat(115): add qrcode login (#2206) 2022-11-04 21:16:52 +08:00
7bfa5876ed fix(189pc): fix typo 2022-11-01 19:32:40 +08:00
f95ab6ee57 docs: add 115 to readme [skip ci] 2022-11-01 19:28:24 +08:00
e75f19e9c0 feat: add Referrer-Policy while redirect (pr #2160) 2022-11-01 19:19:08 +08:00
1c212f6c30 feat!: force to use the bin dir as the data dir (close #2108)
- move default log path to `data/log/log.log`
- replace `--conf` with `--data`
2022-11-01 19:16:23 +08:00
141419056d feat(115): add cloud 115 driver (#2164)
close #2112
close #1598
close #894
2022-11-01 15:31:31 +08:00
aabfe49cb9 docs: change contributors show [skip ci] 2022-10-30 15:26:31 +08:00
a3b631f9e9 fix(smb): remount smb before each operation (close #2123 pr #2140) 2022-10-30 15:05:07 +08:00
18165eb50d fix(123): get real url (#2135)
123's update today added an extra redirect layer `https://web-pro.cjjd18.com/download/?params=base64encode(rawurl)`; if the IP doesn't match, the download may return 403, so the raw URL is now resolved on the server side
2022-10-27 17:02:35 +08:00
061c462f0b feat(mediatrack): get real url (#2132)
* feat: get real url for mediatrack

The redirect token is valid only once; clicking a second time throws a `400` error. This commit fetches the real link after the 302 redirect and returns it to the frontend directly.

* add cache

cache for 60 seconds
2022-10-27 15:26:08 +08:00
5f79d665d9 feat: add alist v3 driver (close #1833 pr #2129)
* feat: add alist v3 driver (close #1833)

* chore: use generics

Co-authored-by: Noah Hsu <i@nn.ci>
2022-10-27 10:54:49 +08:00
f0cc0a76a9 chore: fix typos (#2122)
* refactor: fix typos

* Update help.go

Co-authored-by: Noah Hsu <i@nn.ci>
2022-10-26 14:05:56 +08:00
dd4674e486 feat: add smb driver (close #1746) (#2114)
* feat: add smb driver (close #1746)

* Update driver.go
2022-10-25 23:00:23 +08:00
0019959eec fix: delete cache if files is empty 2022-10-25 16:42:06 +08:00
3e9c38697d fix: use utils.Log during static file router init (#2100)
formerly the log was not in stderr
2022-10-24 23:33:57 +08:00
e3b7c41199 docs: update demo url & sponsor content [skip ci] 2022-10-24 22:48:36 +08:00
a2c808c8ce fix: incorrect root path of initial storage for dev mode 2022-10-23 16:26:14 +08:00
da7e17aa38 feat(local): add show hidden config (#2087) 2022-10-23 14:53:07 +08:00
02df3759df docs: fix typo [skip ci] 2022-10-20 14:29:28 +08:00
4fef500795 feat(user): set default password of init user from env (#2058)
add init user default password

Signed-off-by: ysicing <i@ysicing.me>

Signed-off-by: ysicing <i@ysicing.me>
2022-10-19 20:06:06 +08:00
07ece452b3 docs: fix docker link [skip ci] 2022-10-19 17:08:01 +08:00
b8cf02ca68 fix(aria2): retry 5 times for get status (close #1857) 2022-10-18 15:27:19 +08:00
3db798a82a feat(google_photo): Add categories in root, add album support. (#2046)
* feat(google_photo): Add categories in root, add album support.

* fix(google_photo): Remove else block in `drive/google_photo/types.go:60`
2022-10-18 15:19:05 +08:00
45cc0cedbd fix(s3): mkdir and delete (close #2029) 2022-10-18 15:10:47 +08:00
2efade123e fix(189pc): slice bounds out of range (close #2045) 2022-10-17 22:39:51 +08:00
fc393f743f fix(thunder): no additional processing when the deviceID is correct 2022-10-17 22:37:17 +08:00
0e99e7e9b9 fix(thunder,189pc): some known problems 2022-10-17 00:54:39 +08:00
7a95850c1b fix(google_drive): incorrect ModifiedTime (close #2002) 2022-10-14 14:17:33 +08:00
549355bb29 build: change golang version 2022-10-12 17:35:44 +08:00
55aa8ee3b1 fix: version print of build script [skip ci] 2022-10-12 17:24:04 +08:00
1c22fc367e docs: change badges in readme 2022-10-12 17:08:40 +08:00
5ea8d62aa4 fix(onedrive): unable to operate if path contains % (close #1965) 2022-10-11 14:21:58 +08:00
baebc2fbe9 fix: can't delete disabled storage (close #1942) 2022-10-09 22:20:48 +08:00
8c69260972 fix(webdav): set mime by ext if it's empty 2022-10-09 19:29:55 +08:00
30f992c6a8 feat(onedrive): customize chunk size (close #1927) 2022-10-08 22:23:33 +08:00
dcaaae366b feat: add support for mega.nz (close #1553) 2022-10-08 22:16:41 +08:00
284035823f feat: add Google Photo support (#1853)
* feat: add Google Photo support

* fix: fetch all pages

* chore(google_photo): add meta info

Co-authored-by: Noah Hsu <i@nn.ci>
2022-10-07 20:36:56 +08:00
be8ff92414 docs: replace qq with discord [skip ci] 2022-10-05 14:17:00 +08:00
a4c846a424 chore(onedrive): set default value for region 2022-10-01 20:09:57 +08:00
451e418b18 perf: return cache before check obj to reduce recursion 2022-09-28 21:19:36 +08:00
4e13b1a83c perf: modify onedrive upload chunk size (#1831 close #1790)
improve onedrive upload speed
2022-09-27 20:29:54 +08:00
9d2e9887af docs: create FUNDING.yml [skip ci] 2022-09-27 14:41:43 +08:00
dc73c2e97d fix: custom token expires in doesn't work 2022-09-27 14:23:56 +08:00
a624121095 ci: manual trigger github actions 2022-09-27 14:12:36 +08:00
9d9c79179b feat: custom token expires in 2022-09-27 14:05:00 +08:00
b7479651e1 fix: incorrect base_path from site_url (close #1830) 2022-09-27 13:56:32 +08:00
2fc0ccbfe0 fix: don't init aria2 in new goroutine (close #1752) 2022-09-26 15:11:08 +08:00
f86ad1dce4 fix: create temp dir perm with 777 (close #1813) 2022-09-26 14:48:59 +08:00
f0181d92cd fix: keep type of setting item is correct 2022-09-25 21:20:32 +08:00
5ac6a30c56 fix: use get_share_link_download_url if can't get_download_url (close #1753) 2022-09-25 20:32:11 +08:00
96d8a382e8 fix(aliyundrive_share): reget share token if token expired (close #1798) 2022-09-25 20:14:33 +08:00
7c32af4649 refactor!: move api_url and base_path to config file 2022-09-25 17:57:54 +08:00
03dbb3a403 chore: fix typo of env name 2022-09-25 17:41:04 +08:00
a570e4c7a0 fix: some settings don't take effect at startup 2022-09-23 20:37:49 +08:00
539c47bd3b chore: change log if aria2 not ready 2022-09-23 20:04:47 +08:00
b6d9018ebd fix: sorting by modified doesn't work (close #1756) 2022-09-23 12:30:32 +08:00
c929888e39 fix(123): change remove api (close #1760) 2022-09-23 12:28:57 +08:00
af946ff13e fix(baidu_photo): cannot download when proxy is opened 2022-09-23 01:15:12 +08:00
0039dc18e1 fix: set cdn to basePath if cdn is empty 2022-09-22 17:11:45 +08:00
4d6ab53336 feat: add form upload api (close #1693 #1709) 2022-09-22 16:53:58 +08:00
c7f6684eed chore: add provider to fs list resp 2022-09-22 16:04:10 +08:00
b71ecc8e89 chore: add a default polyfill to head 2022-09-22 11:29:39 +08:00
3537153b91 feat: add aliyundrive share driver (close #1215) 2022-09-21 22:00:06 +08:00
9382f66f87 fix(aliyundrive): thumbnail missed 2022-09-21 21:59:07 +08:00
656f5f112c fix(ftp): nil pointer dereference (close #1722) 2022-09-20 22:23:22 +08:00
9181861f47 fix: illegal files are not displayed (close #1729) 2022-09-20 20:14:38 +08:00
1ab73e0742 feat: add lanzou driver 2022-09-20 15:29:40 +08:00
57686d9df1 fix(189): file size missed 2022-09-19 19:35:07 +08:00
ca177cc3b9 fix: set default mimetype to empty string (close #1710) 2022-09-19 18:58:40 +08:00
d8dc8d8623 fix: dir duplicate creation (close #1687) 2022-09-19 13:43:23 +08:00
5548ab62ac fix: write does not take effect on the current dir (close #1711) 2022-09-19 13:35:37 +08:00
d6d82c3138 fix: page crashes if ipa name contains chinese (close #1712) 2022-09-19 13:33:23 +08:00
2185839236 chore: safe base64 decode ipa name 2022-09-18 20:17:24 +08:00
24d58f278a fix: don't use cache if no objs 2022-09-18 18:38:47 +08:00
f80be96cf9 chore: replace sep _ with @ of ipa name 2022-09-18 16:53:39 +08:00
6c89c6c8ae fix: aria2 download magnet link (close #1665) 2022-09-18 16:07:32 +08:00
b74b55fa4a feat: support custom bundle-identifier by filename 2022-09-17 21:33:39 +08:00
09564102e7 fix(aliyundrive): rapid upload empty file (close #1699) 2022-09-17 19:39:19 +08:00
d436a6e676 fix: use base64 encode for ipa install 2022-09-17 17:06:08 +08:00
bec3a327a7 fix: hide objs if only virtual files 2022-09-17 15:31:30 +08:00
d329df70f3 fix: failed create record if use mysql (close #1690) 2022-09-16 22:21:43 +08:00
1af9f4061e fix(s3): remove folder recursively 2022-09-16 21:25:55 +08:00
0d012f85cb feat: Add thunderExpert priority video url switch 2022-09-15 22:50:27 +08:00
e3b213c398 feat: add ca-certificates for docker (fix: #1679) 2022-09-15 18:56:30 +08:00
d9f0603271 fix: copy folder between two storage (fix #1670) 2022-09-15 17:58:32 +08:00
86a625cb40 fix: set CHARSET to utf8mb4 if use mysql 2022-09-15 17:14:03 +08:00
f22232de5d chore: baidu_photo rename only duplicate folders 2022-09-15 09:25:20 +08:00
7ad3748a46 feat: update cache after remove instead of clear 2022-09-14 20:28:52 +08:00
66b2562d03 fix: allow force root while fetch dirs (close #1671) 2022-09-14 19:57:39 +08:00
b197322cd8 fix: type of file with name uppercase 2022-09-14 15:14:04 +08:00
9e5ef974a7 fix: send on closed channel 2022-09-14 15:13:02 +08:00
08a001fbd1 feat: add a start func for external calls (#1628) 2022-09-13 20:12:57 +08:00
54ae6dce0b fix(fs/get): rawURL if use proxy (close #1664) 2022-09-13 20:02:57 +08:00
a90ef201c7 fix(189pc,baidu_photo,thunder): single link limit multithreading 2022-09-13 18:44:07 +08:00
2de0da87fa fix: infinite loop if new multi-level folder (close #1661) 2022-09-13 18:34:04 +08:00
53e08e75fe fix(189pc,baidu_photo): source file not closed 2022-09-12 22:45:30 +08:00
6b5236f52e feat: add baidu_photo driver 2022-09-12 17:10:02 +08:00
78e34f0d9f fix: log error if err != nil (close #1651) 2022-09-12 17:01:06 +08:00
6aedd0f425 fix: trim slash suffix of sign 2022-09-11 19:39:24 +08:00
5ff0d850d7 feat(aliyundrive): add doc and video preview api 2022-09-11 19:12:54 +08:00
cd73e34ccc chore: optional other interface 2022-09-11 18:40:19 +08:00
107462e42e chore: change default pdf viewer address 2022-09-11 18:27:28 +08:00
e6c2d22700 workflow: update docs address [skip ci] 2022-09-11 17:17:47 +08:00
889ddcef7e feat(baidu): update upload progress 2022-09-11 17:09:48 +08:00
68a6a0c40e fix(aliyundrive): upload empty file 2022-09-11 17:04:05 +08:00
969018db37 fix: is the root folder required (close #1633) 2022-09-11 16:23:46 +08:00
fba1471ec4 docs: add thunder in storage list [skip ci] 2022-09-11 15:26:47 +08:00
8b72ac7f80 chore: rename xunlei to thunder 2022-09-11 14:30:17 +08:00
77a6aa487b chore: cancel sign if no password 2022-09-11 14:14:14 +08:00
fd99c2197b fix: remove relative path check 2022-09-11 14:05:13 +08:00
9c91f062b9 fix(189pc): some minor problems 2022-09-11 13:18:29 +08:00
537ca030b2 chore: fix xunlei some minor problems 2022-09-11 13:09:36 +08:00
b00dcdec0d docs: Create CODE_OF_CONDUCT.md [skip ci] 2022-09-10 22:23:05 +08:00
57bcd376b4 fix(webdav): incorrect href if base_path isn't root (close #1629) 2022-09-10 19:27:34 +08:00
8d4d8648c6 ci: fetch dev version of alist-web 2022-09-10 19:05:02 +08:00
35d177b67b feat: add xunlei driver 2022-09-10 17:40:30 +08:00
40882443c2 feat: add show admin's username 2022-09-10 16:39:08 +08:00
05f19cad78 ci: add since-days for similarity-analysis [skip ci] 2022-09-10 16:18:10 +08:00
7249f277b2 ci: close issue that inactive more than 60 days [skip ci] 2022-09-10 16:10:39 +08:00
849124f177 fix(quark): default root folder id 2022-09-10 14:38:47 +08:00
f5c7a11da5 chore: add client ip to key of link cache 2022-09-10 14:12:57 +08:00
043a79189d style: uniform use utils.CreateTempFile 2022-09-10 14:11:06 +08:00
5ed43fd17d fix(123): pass ip when getting download link 2022-09-10 13:54:10 +08:00
220cd4d6b8 fix: must update version if upgrade 2022-09-10 13:47:38 +08:00
f692e6c011 fix(s3): copy or move folder (close #1336) 2022-09-10 13:42:03 +08:00
f48365929e fix(pikpak): upload empty file (close #1452) 2022-09-10 13:25:52 +08:00
56219bf096 fix(google): folder judgment missed 2022-09-10 13:09:18 +08:00
5ad3849bb6 fix: if use down proxy url 2022-09-09 20:54:11 +08:00
4af9124162 fix: error if use abs temp path (close #1624) 2022-09-09 18:50:54 +08:00
92fba9a2bf ci: remove commit-hash in version 2022-09-09 16:48:12 +08:00
63569be41d fix: wrong columnName index 2022-09-09 16:44:54 +08:00
46325655e1 ci: fix compress filename [skip ci] 2022-09-09 16:31:43 +08:00
85d13c4c5a ci: static link while build musl 2022-09-09 15:51:20 +08:00
af87131cc0 chore: fix release docker name typo [skip ci] 2022-09-09 14:42:55 +08:00
2505cb40ac docs: update readme 2022-09-09 14:35:05 +08:00
4ec42a55d6 ci: fix release files path 2022-09-09 14:15:06 +08:00
7d3c3df207 ci: fix web release url 2022-09-09 13:34:22 +08:00
362d48aa98 chore: replace main color 2022-09-08 22:21:52 +08:00
dea87d098d build: fix Dockerfile CMD arguments 2022-09-08 21:40:37 +08:00
901a74e252 ci: auto release 2022-09-08 21:22:21 +08:00
8705e48e0a ci: auto build docker image 2022-09-08 20:27:13 +08:00
ed5adc21c2 ci: ignore git commit error 2022-09-08 20:04:19 +08:00
fbaebc020f fix(189pc): wrong time if location incorrect (close #1562) 2022-09-08 20:03:07 +08:00
918ca28d2b feat: add 189cloudPC driver 2022-09-08 15:00:57 +08:00
7a12f1bddd chore: add audio_cover setting 2022-09-07 19:18:19 +08:00
4ea19ae078 chore: replace $version of cdn with webVersion 2022-09-07 18:39:04 +08:00
71d30b6819 chore: rename index to order of storage 2022-09-07 15:55:15 +08:00
53fc2f32d8 ci: ignore cp error [skip ci] 2022-09-06 22:45:17 +08:00
e07654299b fix(quark): upload commit bind resp 2022-09-06 22:41:45 +08:00
f127c959a1 feat: add MediaTrack driver 2022-09-06 17:24:05 +08:00
a24dfddc2a feat: add 189cloud driver 2022-09-06 14:39:21 +08:00
534d8d30fc feat: skip generate lang if no changes 2022-09-05 16:40:51 +08:00
868a4fd49e fix(baidu): duplicate prefix of crack link request 2022-09-05 15:59:28 +08:00
900e71f78f feat: add 139yun driver 2022-09-05 13:35:01 +08:00
3416861cab style: use utils.SliceConvert uniformly 2022-09-05 00:26:04 +08:00
25ae1b8397 feat: add yandex disk driver 2022-09-05 00:24:16 +08:00
3dd4fbd76d feat: add webdav driver 2022-09-04 22:34:54 +08:00
778cee4cdf fix: download sign check 2022-09-04 18:29:41 +08:00
9d20c887df fix: webdav_policy options 2022-09-04 14:48:21 +08:00
a1c86b3350 chore!: change root folder 2022-09-04 13:22:42 +08:00
a4a8739748 feat: add upyun-uss driver 2022-09-04 13:03:10 +08:00
ffba5e0aec feat: add sftp driver (close #1466) 2022-09-04 12:43:52 +08:00
8fd56ef9dd feat: check status before storage call 2022-09-03 22:32:09 +08:00
849de88e68 feat: add ftp driver 2022-09-03 22:07:08 +08:00
c89a462d0c feat: add s3 driver 2022-09-03 21:38:43 +08:00
5d0668b00b feat: add google_drive driver 2022-09-03 20:34:06 +08:00
7da9e33c4d fix: hide access_token in error message of baidu_netdisk 2022-09-03 19:48:11 +08:00
dcc99802ec fix: panic while create empty file 2022-09-03 19:32:44 +08:00
552aba997c fix: default root folder of baidu_netdisk 2022-09-03 10:12:28 +08:00
611457c0e7 feat: add baidu_netdisk driver 2022-09-02 22:46:31 +08:00
decea4a739 feat: add quark driver 2022-09-02 21:36:47 +08:00
0f2425ce53 feat: add teambition driver 2022-09-02 18:24:14 +08:00
bc155af255 chore: remove slash of cdn 2022-09-02 16:02:06 +08:00
2d2a4f5776 docs: add go report card [skip ci] 2022-09-01 22:49:47 +08:00
284274b37e feat: add 123pan driver 2022-09-01 22:13:37 +08:00
7290f9b301 chore: remove global_readme setting 2022-09-01 14:17:58 +08:00
454f563bce fix: task id not update 2022-08-31 22:53:41 +08:00
755f4b83f6 feat: add progress for io copy 2022-08-31 22:41:27 +08:00
8e1ed4015b fix: store storage in map whether error or not 2022-08-31 22:27:04 +08:00
d31faabc24 chore: fix typo 2022-08-31 22:08:12 +08:00
b73dce33aa fix(onedrive,ali): upload progress 2022-08-31 22:04:04 +08:00
7ac1d14eeb style: shorten name operations to op 2022-08-31 21:01:15 +08:00
9ec6d5be7a chore: just use std errors in drivers 2022-08-31 20:58:57 +08:00
817d63597e feat: add aliyundrive driver 2022-08-31 20:46:19 +08:00
102384e170 feat: add pikpak driver 2022-08-31 17:32:57 +08:00
7d407de22e feat: add a driver template 2022-08-31 16:37:00 +08:00
41edac5826 fix: convert driver name while generate lang 2022-08-30 22:11:58 +08:00
f551dc76d0 feat: add onedrive driver 2022-08-30 21:52:06 +08:00
c95a7c2a04 chore: add home_container setting 2022-08-30 19:34:11 +08:00
a6b9dbfbe4 fix: use utils.Log in some places 2022-08-30 16:13:01 +08:00
615e5dd118 fix: put a placeholder file in dist [skip ci] 2022-08-30 15:53:40 +08:00
046bbb3a48 feat: use lumberjack for log rotate 2022-08-30 15:22:54 +08:00
59ec17a353 feat: add driver config in driver info 2022-08-30 14:39:10 +08:00
fec98e7f69 ci: auto build dev version 2022-08-29 22:49:20 +08:00
68a125491b chore: add refresh arg in list func 2022-08-29 19:15:52 +08:00
97d4114e38 fix: check err before check upload 2022-08-29 14:18:43 +08:00
d267c43556 feat: static file router 2022-08-28 23:13:03 +08:00
e5480b99be chore: decode filePath in header 2022-08-28 20:46:33 +08:00
e72a557b96 ci: minimize the event that triggers the workflow 2022-08-28 15:39:51 +08:00
a6f3094c9a chore: graceful restart or stop 2022-08-28 15:34:12 +08:00
5ab5cc327f feat: generate plist for ipa 2022-08-28 15:23:00 +08:00
74007a1d45 chore: add pagination settings 2022-08-27 23:07:48 +08:00
37eb3dd8f5 ci: push main branch directly 2022-08-27 18:51:10 +08:00
fbcf082ca7 feat: auto generate settings lang 2022-08-27 18:35:05 +08:00
cc9ccc4e9b ci: auto generate drivers lang file 2022-08-26 19:06:32 +08:00
7425e001db feat: auto generate drivers language json 2022-08-26 15:08:31 +08:00
d9ee174dd3 feat!: unity iframe preview 2022-08-23 16:50:54 +08:00
e9927806d4 fix(local): return ObjectNotFound if can't find file 2022-08-19 11:02:00 +08:00
38db3508a5 chore: add external_previews setting 2022-08-18 11:34:02 +08:00
d1b5c3e648 docs: fix preview dev change 2022-08-17 14:02:05 +08:00
02e2c809a8 chore: rename some request param 2022-08-14 23:52:14 +08:00
8cd05275f0 chore: change message type 2022-08-14 03:05:30 +08:00
fe0dee1196 docs: fix typo 2022-08-13 15:38:03 +08:00
05d8c27918 chore: rename icon_color to main_color 2022-08-13 15:11:46 +08:00
06e15fc149 feat: encode path of url (close #1351) 2022-08-12 14:51:23 +08:00
0f853c86da fix: do not operate storage in memory if disabled 2022-08-11 21:46:03 +08:00
0fdfd1f2c2 feat: load storages while starting 2022-08-11 21:32:33 +08:00
74f1154e5e feat: add disable option for storage (close #1476) 2022-08-11 21:08:50 +08:00
af884010d1 feat: local storage image thumbnail 2022-08-11 20:32:17 +08:00
fda4db71bf ci: new issue bot 2022-08-10 20:05:39 +08:00
669ccc40a1 chore: change related of fs get api 2022-08-10 10:48:14 +08:00
358212749b chore: add home_icon setting 2022-08-09 18:06:04 +08:00
d8b56042c3 chore: ignore opt_secret while marshal 2022-08-08 16:29:56 +08:00
6f48a0a82a chore: add custom office viewer 2022-08-08 13:03:34 +08:00
2b04cf4ac3 feat: custom hide error message by regexp (close #1468) 2022-08-08 12:53:53 +08:00
d6437a337f feat: add provider to obj get api 2022-08-08 00:58:32 +08:00
61fa6f38a8 feat: add type to fs read api 2022-08-08 00:51:05 +08:00
ccce6a30bb ci: temporarily use self-modified issue-helper 2022-08-07 21:03:37 +08:00
1fd4ebe53e feat: add related objs while get obj 2022-08-07 21:01:29 +08:00
2e8322e99b feat: set cache_expiration for each storage (close #1455) 2022-08-07 13:33:53 +08:00
5b40254e3b chore: fix drivers not import 2022-08-07 13:23:15 +08:00
0df3473337 feat: use cobra and add some command 2022-08-07 13:09:59 +08:00
2b5da3ef34 feat: cancel 2fa api 2022-08-07 11:59:33 +08:00
d01958a6bf chore: add otp to current user resp 2022-08-06 17:21:32 +08:00
a6ed4afdae feat: 2fa/otp support 2022-08-06 01:22:13 +08:00
b51e664543 chore: go fmt 2022-08-03 14:26:59 +08:00
721f18a7f4 feat: fs other api 2022-08-03 14:14:37 +08:00
2a68c3cc7b feat: add thumbnail to list resp 2022-08-03 13:03:45 +08:00
71a6ebaf43 chore: dev test 2022-08-02 22:16:58 +08:00
c7128133d6 chore: rename remove to delete 2022-07-31 21:42:01 +08:00
829ef271e3 chore(deps): upgrade cache pkg 2022-07-31 21:23:19 +08:00
cb06d3a19a feat: remove and clear task 2022-07-31 21:21:54 +08:00
be452aafde chore: fix err nil pointer 2022-07-30 22:04:21 +08:00
33b7d75d8a chore: if file exists and size = 0, delete it while uploading 2022-07-30 20:04:21 +08:00
8c27ca3e8b chore: import fmt 2022-07-29 18:22:42 +08:00
eface83716 chore: set initial guest permission 0 2022-07-27 21:53:21 +08:00
212dbb277e fix: empty storage virtual file 2022-07-27 20:57:12 +08:00
53fd09814a feat: user and meta get api 2022-07-27 17:41:25 +08:00
b399c924b7 chore: slice convert util 2022-07-27 17:08:29 +08:00
e707d6b26e chore: change select values case 2022-07-27 15:49:18 +08:00
4ba04fa7db chore: rename main items 2022-07-27 11:43:49 +08:00
5166d73b4d chore: unified function name 2022-07-23 21:49:09 +08:00
826e4807dc chore: add current user log 2022-07-23 21:33:53 +08:00
4691142f80 fix: webdav_policy default value 2022-07-23 21:19:27 +08:00
9d92834ee3 chore: password can be empty when update me 2022-07-23 20:49:16 +08:00
4f3129ec28 feat: change current user's profile 2022-07-23 20:42:12 +08:00
fb65e98fa3 chore: add fuse package 2022-07-20 00:39:20 +08:00
90a5c175ed feat: virtual driver 2022-07-19 19:55:54 +08:00
872e7cf87b fix: virtual obj is a folder 2022-07-19 18:10:02 +08:00
638db77ca1 chore: rename local struct 2022-07-19 17:11:53 +08:00
fe94016289 chore: set default root folder in driver config 2022-07-19 17:07:12 +08:00
184b9d1e6c feat: get storage by id api 2022-07-18 23:02:14 +08:00
e08810a12f chore: fix test typo 2022-07-18 14:52:34 +08:00
303d245e0f docs: add sponsor 2022-07-18 00:48:55 +08:00
a16da3b45e chore: fix typo 2022-07-12 18:41:16 +08:00
2bff656f00 chore: rename VirtualPath to MountPath 2022-07-12 14:11:37 +08:00
fbc858b43c chore: optimize get settings 2022-07-12 14:03:03 +08:00
4ac312fd07 chore: add version to aria handle 2022-07-12 14:02:29 +08:00
b1d563c874 chore: add uuid to token 2022-07-12 14:01:43 +08:00
6ebb36b2eb chore: deprecated settings test data 2022-07-11 22:36:30 +08:00
3691ee5861 chore: use variable 2022-07-11 22:22:30 +08:00
dc38f21294 chore: rename controllers to handles 2022-07-11 17:12:50 +08:00
8971a924f1 fix: clear password while get current user 2022-07-10 17:09:03 +08:00
18b218c6c9 fix: the variable has the same name as the package 2022-07-10 16:39:55 +08:00
a25d76ef6e chore: fix typo 2022-07-10 16:20:13 +08:00
69d1287254 chore: remove wrapper of user 2022-07-10 15:47:09 +08:00
f102b130db chore: public settings no auth required 2022-07-10 15:23:08 +08:00
fc1204c914 chore: rename account to storage 2022-07-10 14:45:39 +08:00
efa20cc7bd feat: dirs api 2022-07-10 14:09:31 +08:00
e28c1e436d fix: only file have raw_url 2022-07-08 15:56:29 +08:00
90283ef29c chore: incorrect username retry count 2022-07-07 21:31:43 +08:00
156da2b794 fix: login don't need auth 2022-07-07 14:19:24 +08:00
9ba7cf0835 chore: add base path setting 2022-07-02 16:43:07 +08:00
fb23758d12 fix: empty public settings 2022-07-02 16:12:30 +08:00
8125fee3f9 feat: put directly api 2022-07-01 17:11:22 +08:00
e3891246b9 feat: post messenger 2022-07-01 16:53:01 +08:00
a6e5edcf53 chore: fix typo 2022-07-01 16:08:08 +08:00
4340a48633 fix: put as task from web 2022-07-01 15:11:18 +08:00
4d0ae6b1ef fix: webdav move contains rename 2022-06-30 22:55:23 +08:00
53416172e7 feat: clear cache after change 2022-06-30 22:51:49 +08:00
2b1726614b feat: webdav handle 2022-06-30 22:41:55 +08:00
dd013ac0b2 chore: add webdav package 2022-06-30 18:27:26 +08:00
3934d9029e feat: hide objects 2022-06-30 16:09:06 +08:00
fba96d024f feat: add write field to list resp 2022-06-30 15:53:57 +08:00
35b04ffa9c feat: add readme field to list resp 2022-06-30 15:41:58 +08:00
e614faa99b chore: cancel task while wait for worker 2022-06-29 22:06:56 +08:00
fd55f2cbfa chore: reduce query aria2 status interval 2022-06-29 20:32:45 +08:00
f54418bdae fix: serialize task info 2022-06-29 20:28:02 +08:00
786e44d1d2 fix: init aria2 client 2022-06-29 20:07:33 +08:00
58d153e5ff fix: task list method 2022-06-29 18:56:31 +08:00
0bf724f447 feat: task manage api 2022-06-29 18:36:14 +08:00
c88680b495 chore: aria2 task wait for transfer 2022-06-29 18:12:31 +08:00
d24e51bc86 chore: user permissions 2022-06-29 18:03:12 +08:00
3c7a2f78cf chore: init db and aria2 2022-06-29 17:37:40 +08:00
8abee6504f feat: set aria2 client and add url to aria2 api 2022-06-29 17:31:37 +08:00
a09a1b814b chore: change permission check 2022-06-29 17:08:31 +08:00
bf950ee6e1 feat: set raw url in get resp 2022-06-29 16:23:31 +08:00
40548926e6 feat: fs link api 2022-06-29 16:08:55 +08:00
f275f83de0 feat: fs manage api 2022-06-29 15:01:22 +08:00
8a0915ffb1 chore: don't add slash prefix just for windows abs path 2022-06-28 22:22:02 +08:00
505b126888 chore: optional get func for driver 2022-06-28 22:13:47 +08:00
96380a50da feat: file proxy handle 2022-06-28 21:58:46 +08:00
d1efec4539 chore: common err resp log 2022-06-28 18:12:53 +08:00
67bc66fedf feat: file down handle 2022-06-28 18:00:11 +08:00
d89ec89d51 feat: sign of file 2022-06-28 15:12:40 +08:00
5dbf5db4ff feat: token and reset 2022-06-28 14:18:10 +08:00
7903ed1f52 chore: change fs get and list resp 2022-06-27 21:34:13 +08:00
c8f10703b7 feat: obj get api 2022-06-27 21:15:39 +08:00
db6b5f8950 chore: path standardize 2022-06-27 20:56:17 +08:00
74973bc5b5 fix: local relative path 2022-06-27 20:37:05 +08:00
7c0b86a9cd feat: obj list api 2022-06-27 19:51:23 +08:00
c6007aa9e6 feat: sort obj list 2022-06-27 19:10:02 +08:00
f01a81ee9c chore: settings util 2022-06-27 17:25:19 +08:00
005ded41c3 feat: settings manage api 2022-06-27 17:06:10 +08:00
1a148eee7c feat: initial setting items 2022-06-27 15:51:02 +08:00
e4c3ef0262 feat: setting model 2022-06-27 14:51:48 +08:00
6bb2b76e25 chore: move item types 2022-06-27 14:32:21 +08:00
e71aff9d94 chore: keep guest in memory 2022-06-27 14:29:36 +08:00
490df4f5fe fix: typo of environment variable (close #1280) 2022-06-27 14:01:15 +08:00
087fae1b15 chore: webdav policy of account 2022-06-27 13:58:21 +08:00
2aff218356 fix: gin.Context nil pointer 2022-06-26 20:31:04 +08:00
b98cd915a4 feat: driver manage api 2022-06-26 20:25:02 +08:00
3349982312 fix(driver): additional items 2022-06-26 20:18:12 +08:00
5783aa99f1 feat: account manage api 2022-06-26 20:00:36 +08:00
cab498e376 feat: user manage api 2022-06-26 19:36:27 +08:00
6b9bca893b chore: change whether print log 2022-06-26 19:20:19 +08:00
c67f128f15 chore: move server package to root 2022-06-26 19:10:14 +08:00
4cef3adc90 feat: meta manage api 2022-06-26 19:09:28 +08:00
acd4083399 chore: ignore password for get current user 2022-06-26 16:55:37 +08:00
7cbfe93a02 chore: set guest while token is empty 2022-06-26 16:39:02 +08:00
54ca68e4b3 chore: init users 2022-06-25 22:05:02 +08:00
b474eefd87 chore: rename store to db 2022-06-25 21:36:35 +08:00
c5295f4d72 feat: user jwt login 2022-06-25 21:34:44 +08:00
306b90399c chore: move conf package 2022-06-25 20:38:02 +08:00
7dadab95b2 fix: missed mimetype of stream in aria2 monitor 2022-06-25 15:15:54 +08:00
ee2bc99e4c feat: cancel copy for upload 2022-06-25 15:14:03 +08:00
935416de45 chore: clear parent folder cache after upload 2022-06-24 14:24:39 +08:00
3f49271db6 feat(fs): add put return after finished 2022-06-24 14:21:28 +08:00
956a5ae906 perf: extract fs func and add error log 2022-06-23 23:03:11 +08:00
40b7ecc845 chore(aria2): export task manager 2022-06-23 21:24:23 +08:00
92983aa185 chore: get or remove by states 2022-06-23 21:19:01 +08:00
6c61f1d261 chore: add state for task 2022-06-23 21:09:54 +08:00
aedcae840d test(aria2): download and transfer file 2022-06-23 17:06:17 +08:00
ffdb198247 feat(local): basic function of driver 2022-06-23 17:06:07 +08:00
3a1fcbef1c chore: close stream after put 2022-06-23 17:05:03 +08:00
ffa0bc294a chore: optimize standardize path 2022-06-23 17:04:37 +08:00
a65dcb48b4 chore: use abs temp dir 2022-06-23 16:49:37 +08:00
b971b13362 feat: dir and file check 2022-06-23 16:09:22 +08:00
d77dea733f chore: rename errors 2022-06-23 16:03:27 +08:00
fd5c3e831d chore: change size of file to int64 2022-06-23 15:57:36 +08:00
c3040fdfc3 chore: move errors 2022-06-23 15:57:10 +08:00
2612cd7f1c test(aria2): init aria2 client 2022-06-22 19:36:49 +08:00
3fe0a7bf6b refactor(task): remove Data field 2022-06-22 19:28:41 +08:00
a6df492fff refactor(aria2): extract monitor 2022-06-22 15:16:13 +08:00
72208e052a chore(fs): rename some variable and param 2022-06-22 15:03:27 +08:00
f6242d46b1 feat: add uri to aria2 2022-06-21 17:37:02 +08:00
55c4a925ba chore(fs): rename some param 2022-06-21 16:37:51 +08:00
9633af4e25 fix: typo and error handle 2022-06-21 16:25:45 +08:00
55d6434daa refactor(task): generic task manager 2022-06-21 16:14:37 +08:00
1b3387ca1a chore: aria2 notifier 2022-06-20 22:29:52 +08:00
6c552a9d62 chore: aria2 related function 2022-06-20 20:34:58 +08:00
4db25605e7 fix(fs): typo 2022-06-20 19:50:59 +08:00
a61bb6ab1f chore: add config for whether the driver supports upload 2022-06-20 17:14:08 +08:00
31ff31d3dd chore: add callback for task 2022-06-20 17:13:19 +08:00
d665cce739 feat: add task work limit 2022-06-18 20:38:14 +08:00
dd46e99e66 chore: set addition type as text 2022-06-18 20:10:35 +08:00
adf0178bb7 feat: add progress for task 2022-06-18 20:06:45 +08:00
6ad2cf2003 test: add task manager test 2022-06-17 22:09:34 +08:00
68ca2abd0c chore: change task.ID to uint64 2022-06-17 21:52:31 +08:00
d73a9e4734 fix: format % is missing verb at end of string 2022-06-17 21:42:56 +08:00
73c0c0bf44 chore: export copy and upload task manager 2022-06-17 21:38:37 +08:00
72a76599e4 feat: add upload file to task manager 2022-06-17 21:35:46 +08:00
b9f9e5853e fix: copy task name 2022-06-17 21:30:16 +08:00
fa6e918fc7 feat: add copy to task manager 2022-06-17 21:23:44 +08:00
53e969e894 feat: task manager 2022-06-17 16:31:41 +08:00
6d0e54d87e chore: add driver for issue template 2022-06-17 16:31:31 +08:00
626e878861 chore: update issue template 2022-06-17 16:31:25 +08:00
52575f6ad6 feat: add meta model and test 2022-06-17 16:31:19 +08:00
ca13678105 fix: add where for get user by name 2022-06-17 16:31:19 +08:00
355db3ab9b feat: standardize virtual path while creating and updating 2022-06-17 16:31:19 +08:00
04f43cb684 fix: comment typo 2022-06-17 16:31:19 +08:00
52ab1310be feat: set path as ID if it's empty 2022-06-17 16:31:19 +08:00
56c95eadea feat: add user model 2022-06-17 16:30:49 +08:00
1df5472855 docs: add version explanation 2022-06-15 21:58:20 +08:00
9aa7074600 test: add get balanced account test 2022-06-15 21:52:31 +08:00
69647f73f0 chore: rename some symbols 2022-06-15 20:41:17 +08:00
09ef7c7106 refactor: change driver interface 2022-06-15 20:31:23 +08:00
d9eb188b7a feat: check parent dir before upload 2022-06-15 19:20:36 +08:00
083395ee53 feat: recursive create folder 2022-06-15 19:10:11 +08:00
2d60dab13c feat: copy files between 2 accounts 2022-06-15 18:58:26 +08:00
4fa7846f00 feat(local): check root folder while init 2022-06-15 18:48:30 +08:00
9fcdbec5c9 feat: get file stream from link 2022-06-15 18:08:13 +08:00
979f8383d8 chore: move some types to model 2022-06-15 18:06:42 +08:00
2cddd3cf2b chore: add aria2 rpc package 2022-06-15 17:15:22 +08:00
c65a9b3001 fix: typo 2022-06-15 14:57:13 +08:00
066ddd3e09 chore: create temp file util 2022-06-15 14:56:43 +08:00
6cdd85283b chore: reduce cache shards 2022-06-14 22:37:41 +08:00
5780d9d834 test: add GetAccountVirtualFilesByPath test 2022-06-14 22:23:33 +08:00
097b516dc5 fix: wrong virtual file name 2022-06-14 22:23:10 +08:00
b73dbee7e6 chore: don't export func GetAccountsByPath 2022-06-14 19:49:17 +08:00
b8e4a2e7c0 test: add driver and account test 2022-06-14 19:44:25 +08:00
0d4542a3f1 fix: delete account driver after get 2022-06-14 19:16:27 +08:00
7c4d28d55a feat: replace with generic_sync.MapOf 2022-06-14 19:09:54 +08:00
1143331b4d chore: task and message package 2022-06-14 17:19:43 +08:00
e4b956b091 chore: set log structure first 2022-06-14 17:18:58 +08:00
e3d2e6dd64 fix(local): local storage shouldn't have cache 2022-06-14 17:18:11 +08:00
6accc2eff6 feat: add NoCache config for driver 2022-06-13 21:15:58 +08:00
c525406516 feat: add cache for list files 2022-06-13 21:14:01 +08:00
6056fdbddc feat: use singleflight to prevent cache breakdown 2022-06-13 20:24:13 +08:00
2f52b5d354 feat: link cache 2022-06-13 19:56:33 +08:00
e16ab876aa feat: add expiration field for Link 2022-06-13 15:39:47 +08:00
3e8f36e9f3 feat: get root folder file 2022-06-13 14:53:44 +08:00
3135775250 fix: composite literal uses unkeyed fields 2022-06-11 19:01:20 +08:00
77b0c69112 feat: extract get function 2022-06-11 14:43:03 +08:00
ec89bb70c7 feat: fs and operations 2022-06-10 21:00:51 +08:00
cd7e9974df feat: add root prefix before operate 2022-06-10 20:20:45 +08:00
354dee67dc feat(fs): get file object 2022-06-10 17:26:43 +08:00
122b7baa73 feat(fs): list files 2022-06-10 17:18:27 +08:00
c5e5666b64 feat: set account modified time 2022-06-10 16:51:20 +08:00
7b6f11fa52 feat: get account by path 2022-06-10 16:49:52 +08:00
2481676c46 feat: get account files by path 2022-06-09 23:05:52 +08:00
164dab49ac feat: get accounts by path 2022-06-09 23:05:27 +08:00
e1a2ed0436 feat: driver and account operate 2022-06-09 17:11:46 +08:00
5b73b68eb5 feat: add log enable config 2022-06-09 15:12:34 +08:00
cd21f14106 fix: additional field type 2022-06-08 17:01:36 +08:00
65fba7936c chore: replace string with const 2022-06-08 16:42:06 +08:00
ba648fa10c feat: get type from field's type 2022-06-08 16:32:20 +08:00
ae755db2d2 feat: driver additional items parse 2022-06-08 16:20:58 +08:00
677047c80b feat: improve driver 2022-06-07 22:02:41 +08:00
0d93a6aa41 feat: driver manage 2022-06-07 18:13:55 +08:00
84eb978731 feat: sort and proxy config 2022-06-07 16:38:31 +08:00
ac0f984136 feat: driver config 2022-06-07 16:31:28 +08:00
79965ab4b3 feat(driver): add args to init and update func 2022-06-06 22:54:03 +08:00
492476dfe4 feat: additional info of account 2022-06-06 22:31:56 +08:00
62ac168226 chore: delete placeholder README 2022-06-06 22:08:39 +08:00
09616dbe25 feat: set gin log writer 2022-06-06 22:06:33 +08:00
fced60c2b5 feat: basic structure 2022-06-06 21:48:53 +08:00
b76060570e refactor: init v3 2022-06-06 16:28:37 +08:00
eb15bce24b ci: auto generate changelog 2022-06-06 16:22:12 +08:00
52814266b8 chore: Merge pull request #1200 from alist-org/all-contributors/add-XZB-1248
docs: add XZB-1248 as a contributor for code
2022-06-06 16:16:11 +08:00
f845ec05e0 docs: update .all-contributorsrc [skip ci] 2022-06-06 08:15:50 +00:00
29fb02c886 docs: update CONTRIBUTORS.md [skip ci] 2022-06-06 08:15:49 +00:00
072e854a71 chore: Merge pull request #1199 from alist-org/dev
Merge dev branch
2022-06-06 16:10:35 +08:00
cae0a5f603 chore: Merge pull request #1191 from XZB-1248/dev
fix: filename is urlencoded when using Safari
2022-06-03 21:57:42 +08:00
7c6d8ca222 fix(proxy): filename is urlencoded when using Safari 2022-06-02 18:13:17 +08:00
f6be50f15a fix(189): login and get link (close #1182) 2022-05-31 16:03:54 +08:00
c35d54d092 chore: Merge pull request #1167 from Xhofe/dev 2022-05-28 21:01:47 +08:00
323dad2a1c fix(sftp): infinite loop while remove file (close #1094) 2022-05-28 21:01:04 +08:00
62aefc4f68 fix(189): new resty client 2022-05-28 20:43:13 +08:00
6a7eb8b3eb fix: don't save search files of balance account (close #1125) 2022-05-21 22:12:18 +08:00
eb549f2631 feat: add pdf viewer url to settings (close #1109) 2022-05-19 15:31:47 +08:00
9207eb69ee feat: add m4v to default video types (close #1114) 2022-05-19 15:31:40 +08:00
866df0540b chore: Merge pull request #1110 from foxxorcat/dev
Add streaming upload option for 123
2022-05-17 12:49:22 +08:00
04e04a1aa6 fix(189pc): delete user-agent for upload 2022-05-16 23:33:12 +08:00
6a66e39d5b feat(123): add io stream upload 2022-05-16 21:03:00 +08:00
f2b2728be7 fix(123,189pc,alidriver,xunlei): tempfile remove 2022-05-16 09:48:33 +08:00
39b8f28fc4 fix: disable pprof while not debug 2022-05-15 16:17:52 +08:00
e1ccc0b215 chore: Merge pull request #1093 from Xhofe/dev
2.5.2
2022-05-13 17:39:26 +08:00
87e339850d fix(ftp): remove dir (#1082) 2022-05-13 17:38:22 +08:00
79c9b6ac77 chore: Merge pull request #1090 from foxxorcat/dev (#1090) 2022-05-13 13:54:04 +08:00
aeb2297f1f perf(123):file thumbnail 2022-05-12 22:27:32 +08:00
3b59bb5c09 perf(123):upload 2022-05-12 21:39:55 +08:00
bc4bac921f chore: Merge pull request #1089 from foxxorcat/dev
Fix some known Xunlei issues
2022-05-12 20:42:49 +08:00
f917882a84 perf(xunlei): upload 2022-05-12 19:18:28 +08:00
6a67d1cf69 fix(xunlei): check captchaToken 2022-05-12 19:15:39 +08:00
041b3587bf fix(xunlei): turn page 2022-05-12 13:27:49 +08:00
0eef7a129c fix(xunlei): the verification code cannot be obtained from the mobile phone number or email 2022-05-12 13:26:12 +08:00
4b635f06e3 fix(189pc): delete user-agent for upload 2022-05-11 20:01:15 +08:00
279111a8e2 chore: Merge pull request #1079 from foxxorcat/dev 2022-05-10 22:35:59 +08:00
67674835da fix(alidriver): fast upload file is not closed 2022-05-10 21:54:44 +08:00
732e9eb1c3 feat: add pprof 2022-05-10 21:40:43 +08:00
b6af9aa587 fix(139,189,189pc,alidrive,onedrive,yandex): http response body is not closed (#1072) 2022-05-10 21:37:48 +08:00
a9027c0f06 fix(baidu.photo): update download api 2022-05-10 20:35:19 +08:00
d780fa18a5 fix(sftp): error when there are no files (close #1078) 2022-05-10 18:18:06 +08:00
d5626d6e2f fix: cancel QueryEscape Disposition (close #1074) 2022-05-10 18:16:32 +08:00
52dcbfe1a4 fix(xunlei):missing x-client-id error in some user requests 2022-05-09 18:14:52 +08:00
bf0ee3d315 refactor(baidu.photo): add a file api of download 2022-05-09 12:43:51 +08:00
0237e78c1e chore: Merge branch 'dev' into v2 2022-05-08 14:30:01 +08:00
44b8c6abf7 fix(189): typo 2022-05-08 14:28:26 +08:00
33e1acd344 chore: Merge pull request #1060 from Xhofe/dev
Dev 2.5.1
2022-05-08 14:26:27 +08:00
c54cb61f14 chore: add debug info 2022-05-08 14:25:37 +08:00
734b204709 chore: change ocr api 2022-05-08 14:22:07 +08:00
b7d9c5e4ff chore: Merge pull request #1059 from foxxorcat/dev (close #1053)
fix baidu.photo and xunlei
2022-05-08 13:59:58 +08:00
e698b457b9 fix(baidu.photo):windows path error 2022-05-08 13:37:56 +08:00
5258c21656 fix(xunlei):login error 2022-05-08 13:36:36 +08:00
1ca9a3d14e chore: Merge pull request #1052 from Xhofe/dev
docs: add baidu.photo
2022-05-07 16:47:22 +08:00
f23bec9a35 docs: add baidu.photo [skip ci] 2022-05-07 16:43:02 +08:00
62a1acd1f4 chore: Merge pull request #1051 from Xhofe/all-contributors/add-WntFlm
docs: add WntFlm as a contributor for code
2022-05-07 16:38:33 +08:00
fa6e3fe567 docs: update .all-contributorsrc [skip ci] 2022-05-07 08:38:10 +00:00
b71b62ee35 docs: update CONTRIBUTORS.md [skip ci] 2022-05-07 08:38:09 +00:00
410b4939a4 chore: Merge pull request #1050 from Xhofe/all-contributors/add-ericarena
docs: add ericarena as a contributor for code
2022-05-07 16:37:24 +08:00
62c0071f29 docs: update .all-contributorsrc [skip ci] 2022-05-07 08:36:58 +00:00
f043a41005 docs: update CONTRIBUTORS.md [skip ci] 2022-05-07 08:36:57 +00:00
2e9da57036 chore: Merge pull request #1048 from Xhofe/all-contributors/add-Windman1320
docs: add Windman1320 as a contributor for code
2022-05-07 16:36:17 +08:00
d83cd37984 docs: update .all-contributorsrc [skip ci] 2022-05-07 08:35:50 +00:00
bad8b0ebbb docs: update CONTRIBUTORS.md [skip ci] 2022-05-07 08:35:49 +00:00
4535e65948 chore: Merge pull request #1047 from Xhofe/dev
Dev 2.5.0
2022-05-07 16:29:10 +08:00
3b413c2ee2 chore: Merge pull request #1021 from WntFlm/mimefix
fix(webdav): empty mimeType
2022-05-01 19:13:29 +08:00
427ae56333 chore: Merge pull request #1020 from foxxorcat/dev
fix(xunlei):download link speed limit
2022-05-01 13:25:54 +08:00
658fd5ad6e fix(webdav): empty mimeType
Now mimeType will always be a non-empty string, by defaulting it to "application/octet-stream".
2022-05-01 09:42:25 +08:00
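A minimal sketch of the fallback described above (and in the related "set mime by ext if it's empty" fix): default the MIME type first from the file extension and finally to `application/octet-stream`, so the value is never empty. The helper name is illustrative, not the project's actual function:

```go
package main

import (
	"fmt"
	"mime"
	"path/filepath"
)

// contentTypeFor returns a non-empty MIME type: keep the detected one
// if present, fall back to the extension-derived type, and finally to
// "application/octet-stream".
func contentTypeFor(name, detected string) string {
	if detected != "" {
		return detected
	}
	if byExt := mime.TypeByExtension(filepath.Ext(name)); byExt != "" {
		return byExt
	}
	return "application/octet-stream"
}

func main() {
	fmt.Println(contentTypeFor("index.html", ""))  // text/html; charset=utf-8
	fmt.Println(contentTypeFor("payload.bin", "")) // application/octet-stream
}
```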
11830bb51c fix(xunlei):download link speed limit 2022-04-30 21:41:15 +08:00
75c98429bf fix(webdav): wrong MIMEType (close #1007) 2022-04-29 14:09:51 +08:00
f77ea1b3a5 chore: Merge pull request #1011 from foxxorcat/dev
Add support for baidu.photo (Yike Album), optimize Xunlei code
2022-04-29 14:08:04 +08:00
0a8bd96d33 feat: support baidu.photo 2022-04-28 23:44:22 +08:00
68f37fc11f refactor(xunlei): optimized code 2022-04-28 23:15:37 +08:00
d6775cda69 fix(123): can't delete folder (close #1009) 2022-04-28 21:17:11 +08:00
43c6e07bac feat: add aria2 download settings(#1000)
* feat: add aria2 support

Added an item to the context menu for downloading with aria2, so selected file links can be sent to aria2 directly, skipping the copy-and-paste step

* feat: set default value for `Aria2 RPC url`

Co-authored-by: Xhofe <i@nn.ci>
2022-04-28 18:05:07 +08:00
4901e9080c fix(quark): file size over i32 (close #997) 2022-04-26 15:22:39 +08:00
48049a5ea3 docs: upgrade golang version [skip ci] 2022-04-25 16:05:49 +08:00
bd7260f0ff chore: base for template 2022-04-24 21:22:24 +08:00
6c0d54394f chore: Merge pull request #992 from Xhofe/dev
Dev 2.4.3
2022-04-24 17:40:26 +08:00
ce5dacbf3f build: build musl first 2022-04-24 17:39:25 +08:00
08aaa5e2c0 build: rm .git before xgo 2022-04-24 16:53:47 +08:00
42c0e438d5 fix(webdav): sharepoint upload 2022-04-24 15:38:17 +08:00
e4df146043 fix(webdav): sharepoint repeat login 2022-04-24 15:37:59 +08:00
27b7dae113 feat(webdav): support range get 2022-04-23 22:43:02 +08:00
293d574ce7 build: specify xgo version 2022-04-23 16:53:26 +08:00
56b3b35556 chore: Merge pull request #984 from Xhofe/dev
2.4.2
2022-04-21 22:34:12 +08:00
a7a0e85a46 docs: update qq group 2022-04-21 22:31:17 +08:00
95c0106fdd feat(onedrive): default redirect_uri(close #967) 2022-04-20 15:21:48 +08:00
6612338fc1 fix(189pc): InvalidSessionKey (fix #920) 2022-04-20 15:16:30 +08:00
c276a1541f chore: delete useless comment 2022-04-18 18:32:29 +08:00
cc96a5bbdb chore: add windows bin to gitignore 2022-04-18 18:31:10 +08:00
0810561a8a fix(xunlei): check err prevent stack overflow 2022-04-18 18:29:21 +08:00
82a5c43b94 chore: Merge pull request #961 from Xhofe/dev
Dev v2.4.1
2022-04-17 23:12:02 +08:00
d38f36ef44 chore: delete useless test file 2022-04-17 22:50:54 +08:00
f9533440c7 build: cancel static link for glibc 2022-04-17 22:50:28 +08:00
41a186b051 fix(native): set size of folder to 0 2022-04-17 21:12:55 +08:00
4e6a44253c chore: Merge pull request #958 from Xhofe/dev
Dev 2.4.0
2022-04-17 17:29:02 +08:00
ebda77cd43 docs: add sharepoint to webdav 2022-04-17 17:16:39 +08:00
1a1e86521f fix(quark): lost files while number of files > 100 (fix #947) 2022-04-16 21:06:33 +08:00
1b4740dae3 fix: file deduplication (fix #941) 2022-04-16 17:28:16 +08:00
91fc8df84e build: cancel static link for darwin 2022-04-16 17:08:48 +08:00
e6ecf1fa30 feat(189): add tips get page 2022-04-16 17:08:21 +08:00
183a6f1b3a build: static link for compile 2022-04-16 16:55:55 +08:00
3c2d59e272 build: use crazymax/xgo 2022-04-16 16:43:21 +08:00
fd80e3eaf7 build: Use -buildvcs=false to disable VCS stamping 2022-04-16 15:23:21 +08:00
4928c331a8 build: upgrade go version 2022-04-16 15:04:28 +08:00
3ad75e54cb refactor(baidu): add a crack api of download
* Fix the problem that the Baidu Netdisk API fails for files larger than 20 MB

* refactor: keep the official api

Co-authored-by: Xhofe <i@nn.ci>
2022-04-16 14:52:36 +08:00
a2cf3ab42e workflow: add checkboxes for issue template 2022-04-14 22:16:15 +08:00
e24814ee2f chore: Merge pull request #938 from Xhofe/feature/search
Feature/search
2022-04-13 21:57:59 +08:00
37b42e6e17 fix(sftp): add port 2022-04-12 20:22:09 +08:00
30ebb0f4d4 feat: support other region sharepoint with webdav 2022-04-12 09:54:22 +08:00
8e059c64b5 feat: webdav for sharepoint online (#460) 2022-04-11 21:32:38 +08:00
395de069c2 fix: extract_folder causes sorting confusion (close #929) 2022-04-11 16:50:47 +08:00
4c22f37d54 fix(search): file type 2022-04-08 22:47:04 +08:00
a73a40133d feat: search api 2022-04-08 22:03:26 +08:00
6591af58ea feat: store search file index 2022-04-08 21:51:21 +08:00
58568d4ef6 fix(189cloud): remove empty Authorization 2022-04-07 16:56:50 +08:00
5295593bf8 fix(189cloudpc): wrong modified time (close #910) 2022-04-06 17:59:45 +08:00
24d031d578 feat: clear temp file while start 2022-04-06 16:24:31 +08:00
7141bf0358 build: static compilation for musl 2022-04-06 16:19:15 +08:00
c5d707cf0a fix: multilevel virtual path (close #904) 2022-04-06 15:23:10 +08:00
dfcf66b43e fix(native): set size of folder to 0 2022-04-06 15:11:03 +08:00
fa6ee62cf0 feat: global readme url 2022-04-05 20:17:27 +08:00
1428d90361 feat: meta readme 2022-04-05 20:17:16 +08:00
c413c22201 fix(quark): denied by Referer ACL 2022-04-04 20:55:48 +08:00
9b6adecd62 feat: sharepoint webdav (unfinished) 2022-04-04 20:55:22 +08:00
b3540cf539 docs: add SFTP in readme [skip ci] 2022-04-03 18:20:33 +08:00
f8650c9c0b fix(webdav): remove default Authorization header (close #893) 2022-04-03 18:19:15 +08:00
bf2e5768d6 feat: add rapid upload switch for 189pc and alidrive (#892)
* Add a rapid upload switch for 189PC

* Add a rapid upload switch for alidrive
2022-04-03 17:56:21 +08:00
18c82e79b5 feat: sftp support 2022-04-02 19:28:43 +08:00
d69d24a5b2 fix(alidrive): judge status of delete folder (close #886) 2022-04-02 14:34:34 +08:00
342729179d chore: Merge pull request #884 from Xhofe/dev
fix: some issues of webdav due to virtual path
2022-04-01 22:02:39 +08:00
0537449335 fix(webdav): virtual path no account 2022-04-01 21:57:55 +08:00
df90311453 fix(webdav): alist path not found 2022-04-01 20:40:57 +08:00
876579ea3b chore: Merge pull request #874 from Xhofe/dev
support mount to root path
2022-04-01 09:42:37 +08:00
e83081380e workflow: cancel build docker for pr 2022-04-01 09:40:53 +08:00
9daeaf7562 fix: virtual path, support mount to root path 2022-04-01 09:40:08 +08:00
e6be11c17f chore: Merge pull request #873 from Xhofe/dev
fix: load balance
2022-03-31 22:04:37 +08:00
b52e1e8be3 fix: load balance 2022-03-31 21:52:19 +08:00
948bbe9136 chore: Merge pull request #872 from Xhofe/dev
fix quark cookie, virtual path
2022-03-31 20:58:56 +08:00
ced61da33a feat: virtual path 2022-03-31 20:43:17 +08:00
a0f4383d41 feat(quark): set status 2022-03-30 14:06:50 +08:00
5a527dfa2c feat(xunlei): set timeout 2022-03-30 00:18:20 +08:00
49fc475f9f feat(s3): create placeholder file for mkdir 2022-03-30 00:18:00 +08:00
83c377270e fix(webdav): add sign for webdav proxy 2022-03-29 16:34:22 +08:00
7ffaef0de6 fix: audio and video types 2022-03-28 21:53:57 +08:00
dd151480a8 fix(alidrive): change response of move and copy 2022-03-28 21:51:24 +08:00
ad3121d367 fix(quark): __puus expired (close #830) 2022-03-28 21:38:05 +08:00
c1525ebc69 feat: cookie operate util 2022-03-28 21:10:20 +08:00
30277cd81f docs: change blog address [skip ci] 2022-03-27 20:23:18 +08:00
466ec27ffe chore: Merge pull request #825 from Xhofe/dev
docs: add disclaimer
2022-03-27 20:17:36 +08:00
85c757b035 docs: add disclaimer 2022-03-27 20:16:06 +08:00
712687370a chore: Merge pull request #823 from Xhofe/dev
fix some issues
2022-03-26 23:54:17 +08:00
b68ba22df3 workflow: issue invalid bot 2022-03-26 23:51:25 +08:00
d9652e2a0b fix(189cloud): link force https (close #821) 2022-03-26 21:59:18 +08:00
a5b757b251 feat: customize audio/video types (close #819) 2022-03-26 17:10:37 +08:00
0bc05a60b0 feat(189pc): override upload 2022-03-23 18:51:07 +08:00
db275f885a fix: DProxyTypes judge 2022-03-22 19:53:26 +08:00
9e483d902f feat: adapt postgres (close #740) 2022-03-22 16:41:38 +08:00
801f843f8a workflow: reproduction is required [skip ci] 2022-04-05 03:24:51 +08:00
77ffb93cbe feat: multiple down proxy urls (close #793) 2022-03-20 16:53:30 +08:00
bf73ea7f5d chore: Merge pull request #787 from Xhofe/dev
fix some issues of webdav
2022-03-19 14:57:38 +08:00
9b23d0ab29 docs: add sponsors [skip ci] 2022-03-18 17:08:43 +08:00
908cdd2c78 revert: undo delete upFileMap 2022-03-17 21:57:54 +08:00
f4f61a5787 fix(webdav): nil pointer error (close #749) 2022-03-17 21:23:10 +08:00
6db09a2736 fix: xunlei upload error (#749) 2022-03-17 21:13:13 +08:00
b21801d505 fix: clear cookie for 189 cloud login 2022-03-16 18:02:11 +08:00
2dbedc245c fix: 189 family cloud upload (#761) 2022-03-16 14:22:42 +08:00
58426613f6 feat: add tls config for mysql (fix #758) 2022-03-15 17:05:54 +08:00
ef19e851e3 fix: check local ip for 123pan 2022-03-15 14:48:39 +08:00
5a1b16a601 feat: set overwrite for aliyundrive upload 2022-03-14 22:43:27 +08:00
4eef9cd9bc fix: nil pointer while delete baidu account (close #751) 2022-03-14 20:40:42 +08:00
79b5c018ea workflow: add translation for duplicate issue [skip ci] 2022-03-14 18:09:38 +08:00
15651a4356 feat: only show files (close #735) 2022-03-13 19:37:58 +08:00
7be476cce0 fix: wrong dockerfile 2022-03-13 19:00:19 +08:00
bb017c5f6d feat: remove env prefix for docker 2022-03-13 17:01:45 +08:00
c51dc4594d chore: add tips for announcement 2022-03-13 16:46:06 +08:00
8e30b02efc fix: cache config env typo 2022-03-13 16:38:12 +08:00
0aa438dce4 feat: add announcement setting 2022-03-12 21:09:33 +08:00
9c2fc8e860 feat: read config from environment 2022-03-12 20:38:22 +08:00
b1d7a980d9 feat: echo password while start every time 2022-03-12 00:24:55 +08:00
19d0a88b55 fix: cookie lanzou file with password 2022-03-11 19:48:32 +08:00
40567dee0e fix: lanzou url password 2022-03-11 19:16:21 +08:00
4b540a2297 feat: skip creating an existing folder 2022-03-11 18:12:13 +08:00
8a62d55efe feat(google): add default client 2022-03-10 20:08:10 +08:00
10fce6c0fe fix(xunlei): some issues about page turning(#716) 2022-03-09 22:48:15 +08:00
d31d49a9bb fix(189pc): some minor issues 2022-03-09 21:09:21 +08:00
2e91f5ffa5 feat: support 189 family cloud (close #612) 2022-03-09 20:30:56 +08:00
8f19c45a81 feat: pikpak video use media link 2022-03-09 15:11:12 +08:00
c63e05983d fix: 189cloud big file download (close #683) 2022-03-07 15:04:20 +08:00
678a982535 feat: add sleep for lanzou request (close #690) 2022-03-07 14:36:25 +08:00
b2c02e6c5e feat: add driver template 2022-03-06 21:33:58 +08:00
7e05b0317f chore: Merge pull request #689 from Xhofe/dev
docs: add Contributing and move Contributors
2022-03-06 20:49:14 +08:00
19f06dfaed docs: fix typo [skip ci] 2022-03-06 20:44:37 +08:00
1680a18578 docs: move CONTRIBUTORS [skip ci] 2022-03-06 20:38:30 +08:00
e8f440ca5c docs: create CONTRIBUTING.md [skip ci] 2022-03-06 20:30:17 +08:00
7deff76f49 chore: Merge pull request #687 from Xhofe/all-contributors/add-Clansty
docs: add Clansty as a contributor for doc
2022-03-06 17:36:13 +08:00
cd0afb9536 docs: update .all-contributorsrc [skip ci] 2022-03-06 09:35:08 +00:00
668a953cd8 docs: update README_cn.md [skip ci] 2022-03-06 09:35:07 +00:00
8bfbaa74f6 docs: update README.md [skip ci] 2022-03-06 09:35:06 +00:00
3ccf5ee620 fix: ipa plist key 2022-03-05 15:33:04 +08:00
b44243c021 feat: ipa name decode 2022-03-05 15:13:19 +08:00
4ae81b5a79 feat: aliyundrive fast upload (#652) 2022-03-05 13:14:57 +08:00
189f4c19a5 release: release v2.1.1 2022-03-04 21:00:39 +08:00
92a3d74af5 chore: Merge pull request #675 from Xhofe/all-contributors/add-vg-land [skip ci]
docs: add vg-land as a contributor for code
2022-03-04 20:16:17 +08:00
bb73a10332 docs: update .all-contributorsrc [skip ci] 2022-03-04 12:14:55 +00:00
3baf1e8c7b docs: update README_cn.md [skip ci] 2022-03-04 12:14:54 +00:00
fdb49f5fb4 docs: update README.md [skip ci] 2022-03-04 12:14:53 +00:00
2eedcc1626 feat: opus preview (#638) 2022-03-04 10:05:15 +08:00
6faecbd5d8 feat: add cache for xunlei 2022-03-04 09:55:37 +08:00
34ed05c62f style: go mod tidy 2022-03-04 09:52:08 +08:00
ce83d6eb40 build: fix dev build 2022-03-04 09:50:47 +08:00
2271cb6c7c docs: replace preview image [skip ci] 2022-03-03 23:30:45 +08:00
a42b30c96e docs: update .all-contributorsrc [skip ci] 2022-03-03 23:28:37 +08:00
ce25d16222 docs: update README_cn.md [skip ci] 2022-03-03 23:28:37 +08:00
b392e093e3 docs: update README.md [skip ci] 2022-03-03 23:28:37 +08:00
0d5b7298db docs: update .all-contributorsrc [skip ci] 2022-03-03 23:27:19 +08:00
2063ebb74d docs: update README_cn.md [skip ci] 2022-03-03 23:27:19 +08:00
0408d7ab5d docs: update README.md [skip ci] 2022-03-03 23:27:19 +08:00
d52451f9d2 docs: update .all-contributorsrc [skip ci] 2022-03-03 23:24:46 +08:00
ca9f77006a docs: update README_cn.md [skip ci] 2022-03-03 23:24:46 +08:00
e8e8d925f3 docs: update README.md [skip ci] 2022-03-03 23:24:46 +08:00
623aab4c28 docs: move all contributors 2022-03-03 23:17:13 +08:00
3bc81d471e docs: create .all-contributorsrc [skip ci] 2022-03-03 23:00:34 +08:00
dfddb5cfa1 docs: update README.md [skip ci] 2022-03-03 23:00:34 +08:00
80f5bde0cb build: just upx linux/amd64 2022-03-03 19:44:13 +08:00
9de072161e docs: add xunlei cloud (#659) 2022-03-03 19:38:59 +08:00
d08a7440bc 🎇 import uss 2022-03-03 19:33:40 +08:00
7a4bb2496d add welcome bot 2022-03-03 19:16:30 +08:00
f68ab40d26 🔀 Merge pull request #659 from foxxorcat/dev
 xunleicloud support
2022-03-03 17:03:23 +08:00
796d490fb7 🐛 fix #658 onedrive file/folder judge 2022-03-03 16:01:24 +08:00
2964d5a6db xunleicloud support 2022-03-03 15:45:33 +08:00
90b57dacee 🎇 remove set Content-Type for native 2022-03-02 19:27:40 +08:00
6af17e2509 🔒 fix #645 xss vulnerability 2022-03-01 20:09:25 +08:00
5193b2aa7d 🎨 fix some warning 2022-02-27 20:28:42 +08:00
3f644f07db ✏️ fix readme logo 2022-02-27 20:26:05 +08:00
d988f98b81 💚 fix upx 2022-02-26 00:12:00 +08:00
10634c7b77 👷 add build for macos 2022-02-25 23:59:27 +08:00
135d505192 back to cgo sqlite3 2022-02-25 23:55:57 +08:00
3f2be8a6ca 🔧 change default assets path 2022-02-25 22:08:12 +08:00
79bef09ee7 🔀 Merge branch 'feature/upyun' into dev 2022-02-25 21:06:49 +08:00
3534f6afac 💚 fix cal md5 2022-02-24 23:07:35 +08:00
106c1d069c 💚 change build platform 2022-02-24 22:55:50 +08:00
8ed0afe80d 🎇 add unupx version 2022-02-24 22:42:15 +08:00
6a6e3944d5 🔀 Merge branch 'feature/purego' into dev 2022-02-24 16:26:20 +08:00
94d5b5e47e direct but proxy types 2022-02-24 16:25:17 +08:00
e61b0f8e34 🔥 remove cgo to pure go 2022-02-24 16:08:49 +08:00
f7fbe1de6c 👷 build for all branch 2022-02-23 20:17:50 +08:00
01de01630e 🔥 remove placeholder for uss 2022-02-23 20:16:57 +08:00
f9f92e2198 ✏️ fix typo 2022-02-23 20:13:52 +08:00
7d5f50b04a 👷 build for all branch 2022-02-23 20:12:21 +08:00
72b5d25e4c ✏️ fix label typo 2022-02-23 20:07:45 +08:00
cae7f36531 🎇 refresh one folder 2022-02-23 19:16:33 +08:00
aa79f49e25 🐛 fix #600 aliyundrive move file 2022-02-23 14:56:17 +08:00
b4ad301d53 🐛 fix #599 lanzou url without password 2022-02-23 11:18:51 +08:00
00ed54c4c9 upyun uss support 2022-02-23 11:07:19 +08:00
ffa52794db 🍺 change login http method 2022-02-22 15:57:39 +08:00
24058d0c36 💚 fix dev build 2022-02-21 21:47:40 +08:00
641ca67671 💚 fix web replace 2022-02-21 20:31:01 +08:00
52ee2e0a8b 🐛 close #581 teambition update time error 2022-02-21 17:20:05 +08:00
724fc7f37e 💚 fix build web 2022-02-20 16:46:47 +08:00
9d279b104b dynamic public path 2022-02-20 15:14:18 +08:00
eb61f70164 🐛 fix show balance account 2022-02-20 13:06:59 +08:00
b3a8201768 🐛 fix that only two accounts can be load balanced 2022-02-19 21:49:37 +08:00
185795954b 🔊 add reason of failed to auto migrate model 2022-02-19 17:25:50 +08:00
cc62cc99d2 🐛 fix plist ipa name 2022-02-18 19:01:30 +08:00
270349f37c 🐛 fix write status sequence 2022-02-18 18:58:02 +08:00
977888070a 🐛 close #558 fix local file can't download 2022-02-18 18:50:01 +08:00
192d0f2bf3 🐛 fix update can't start 2022-02-18 18:49:40 +08:00
f2ec7884ec 🐛 fix only proxy webdav_direct 2022-02-17 17:49:03 +08:00
815975a4d2 🐛 fix 139 delete dir 2022-02-17 17:37:45 +08:00
f96a0238fc 👷 change issues-month-statistics 2022-02-17 17:29:45 +08:00
f695bd0959 🔧 change bundle-version 2022-02-17 16:54:34 +08:00
efe8f46e17 ✏️ fix typo 2022-02-17 11:56:00 +08:00
515daa22a9 🐛 fix #551 add S3ForcePathStyle config 2022-02-17 11:30:34 +08:00
f11e22deaf 🚧 change base64 characters 2022-02-17 09:14:21 +08:00
5be976169f 🚧 fix bundle-identifier 2022-02-17 00:44:30 +08:00
a6e08f3bf4 📝 update readme 2022-02-17 00:08:37 +08:00
944e68a979 🚧 plist generate 2022-02-17 00:06:10 +08:00
b3a6e33ce1 🎇 quark support 2022-02-16 20:20:39 +08:00
cb53ddc8e8 🐛 fix nil pointer 2022-02-16 16:10:39 +08:00
693417be4f 🔧 add hide files setting item 2022-02-15 14:51:50 +08:00
5c3f91bb55 support empty password 2022-02-15 14:42:24 +08:00
8a219d0732 👷 add issue bot 2022-02-14 20:03:42 +08:00
146a544af3 ✏️ fix typo 2022-02-14 19:43:43 +08:00
48dccc6c0b 📝 update readme 2022-02-14 15:41:20 +08:00
ce1740cec4 ✏️ fix typo 2022-02-14 15:36:21 +08:00
d40dbeae3e 💬 add issue template 2022-02-14 15:33:48 +08:00
5094b673c4 👷 add some issue bot 2022-02-14 15:32:25 +08:00
228e6d10e7 🔧 close #519 customize temp dir 2022-02-14 15:06:57 +08:00
e90b979d15 close #535 request set timeout 2022-02-14 14:59:00 +08:00
fb05a6ca48 🐛 fix #533 only encode fileName 2022-02-14 14:31:09 +08:00
e055ed3afa 🎇 lanzou proxy add user-agent 2022-02-13 17:46:07 +08:00
4371c470b3 webdav direct proxy 2022-02-13 15:57:42 +08:00
7bb237d0ef 🐛 fix #527 189 upload file name contains % 2022-02-13 13:03:34 +08:00
5c42354b01 🐛 fix #527 189 upload name contains + 2022-02-12 20:27:38 +08:00
387e8af422 🐛 fix 189 upload while filename contains & 2022-02-12 13:08:46 +08:00
5dca777caf 🔒 fix baidu direct link 2022-02-12 12:04:10 +08:00
0814778a14 🐛 fix no account error while only one 2022-02-11 17:11:02 +08:00
6827af3997 🎇 close #522 hide account for guest webdav 2022-02-11 16:19:55 +08:00
435bdea8f7 🔧 #523 add some default setting 2022-02-11 16:17:07 +08:00
4f81735af6 🎇 close #512 favicon redirect 2022-02-08 18:07:13 +08:00
bef3d2f88d 🐛 Fix the temp folder is not created at the first startup 2022-02-08 16:22:53 +08:00
ba99c7dc03 🐛 fix multiple accounts with the same prefix cannot be load balanced 2022-02-08 16:02:47 +08:00
f5c5162a9b load balance 2022-02-08 15:51:58 +08:00
a22903533e 🐛 set random seed 2022-02-04 16:08:15 +08:00
86cda58b22 🚧 echo password 2022-02-04 14:58:48 +08:00
7804cf9d5c 🔒 random webdav admin password 2022-02-04 14:39:11 +08:00
2bb7036110 🔧 change logo 2022-02-03 21:28:13 +08:00
ba545555cf 📝 update readme 2022-02-03 21:00:08 +08:00
be55ca690c 🔧 change default logo 2022-02-03 20:43:33 +08:00
9013add749 🔒 random initial password 2022-02-03 12:27:50 +08:00
3201b6da76 🐛 fix #376 windows webdav upload 2022-02-02 18:48:34 +08:00
feb42f1f4b 🔥 delete proxy interface 2022-02-02 18:03:54 +08:00
6f14d0eb5c 🎇 baidu disk support 2022-02-02 17:32:11 +08:00
7530d8f5b2 🐛 fix lanzou for download change 2022-02-01 22:33:19 +08:00
e25fe05a53 yandex disk support 2022-02-01 17:15:11 +08:00
8e0ab8f780 🔥 remove onedrive refresh token cron 2022-02-01 15:20:17 +08:00
cb2a3c2b42 🎨 change proxy interface 2022-02-01 14:28:21 +08:00
1b6ec94f33 🐛 fix s3 custom host 2022-02-01 11:34:32 +08:00
cb23edc1fe 🐛 fix #462 check connect while get ftp client 2022-01-31 11:13:29 +08:00
6fd05d7d72 🐛 fix connMap not init 2022-01-30 00:55:12 +08:00
f26ac57569 🐛 fix ftp conn not store 2022-01-30 00:04:31 +08:00
2434ac54d0 🐛 fix webdav 2022-01-29 18:36:22 +08:00
f25b557327 📝 update readme 2022-01-29 15:31:06 +08:00
81a0706d01 189 chunk upload 2022-01-29 11:16:40 +08:00
5f6b576cbf 🔧 switch cdn to jsdelivr 2022-01-28 23:29:38 +08:00
549877f71e 🔥 Delete Text 2022-01-28 23:21:11 +08:00
c6a5ba9b91 🚧 189 chunk upload 2022-01-28 19:51:54 +08:00
1a69d80489 🎨 Pull away Path 2022-01-28 11:04:56 +08:00
b797f4302c 🎇 use tempFile to cal md5 2022-01-27 23:48:29 +08:00
bf9aa5c3d3 🔒 not allowed use relative path of native 2022-01-27 15:10:33 +08:00
7390e19a7a 🔒 not allowed down with relative path 2022-01-27 15:05:17 +08:00
b31a12a0cc 🔒 not allowed access using relative path for native 2022-01-27 14:54:20 +08:00
26ce001782 🎇 add ocr for 189 2022-01-27 12:34:49 +08:00
a2c7ff3262 ✏️ Invalid Token 2022-01-26 16:27:38 +08:00
8fc7c716c0 copy api 2022-01-26 14:07:51 +08:00
c70fc3fc4b finish teambition chunk upload 2022-01-26 14:05:35 +08:00
df513b7dc0 teambition upload (<= 20 MB) 2022-01-23 14:03:04 +08:00
2a9598f4c6 🐛 fix #407 189cloud use local sort 2022-01-21 19:01:33 +08:00
224c20779c 🔧 add webp to image types 2022-01-20 23:01:59 +08:00
5d722298cb 📝 update readme 2022-01-20 22:38:53 +08:00
4bcc6359e3 💚 fix docker build 2022-01-20 20:52:14 +08:00
4144afcc92 🐛 fix #397 139yun file size overflow int32 2022-01-20 20:35:01 +08:00
2ad27046fb 🎨 split build and docker 2022-01-20 20:30:58 +08:00
9516ac6718 🎇 finish move api 2022-01-20 19:58:25 +08:00
de638c7c36 🐛 fix 123pan move 2022-01-20 19:58:10 +08:00
c6b34a033b 🔧 add swf to image types 2022-01-20 14:40:59 +08:00
31de3399d2 💚 fix musl prebuilt 2022-01-20 14:23:31 +08:00
0dc2ca019f 💚 fix musl prebuilt 2022-01-20 13:47:21 +08:00
04724f7f0f 👷 add prebuilt for musl-libc 2022-01-20 12:53:49 +08:00
75a983a965 🐛 fix webdav can't get file with password 2022-01-19 18:46:32 +08:00
e12d8bb8ca ⬆️ upgrade gorm 2022-01-19 09:15:00 +08:00
68f1ccfed4 add sslmode for postgres 2022-01-19 09:14:31 +08:00
54272db59c 🚧 add folder api 2022-01-18 19:08:44 +08:00
6d34e88360 🎇 hide files while webdav visitor 2022-01-18 18:48:08 +08:00
0a901a2eb0 🐛 fix can't find zhimg for dev 2022-01-18 18:19:58 +08:00
e1671a0511 🚧 add move api 2022-01-18 16:13:07 +08:00
dcb4ec695f rename and mkdir api 2022-01-18 14:31:52 +08:00
4a21b6fe1d 🔥 close res.body for proxy 2022-01-18 11:29:32 +08:00
96a237902b 🐛 close #379 fix google drive can't get text file 2022-01-17 09:54:19 +08:00
cfb51e9f80 Extract folder 2022-01-16 16:38:41 +08:00
e952f1c243 🔥 Optimize 189 cloud get client 2022-01-16 16:05:32 +08:00
07d6ca27db 🐛 Forbid MediaTrack to create a new folder with the same name 2022-01-16 13:21:29 +08:00
8245da485a 🐛 fix s3 for #362 2022-01-15 20:24:57 +08:00
5c759217cf 🐛 fix some lanzou can't down #360 2022-01-15 20:10:42 +08:00
0648fdebc2 ♻️ solve circular dependency 2022-01-15 19:59:24 +08:00
ed670e528f 🐛 fix 139Yun error phone close #365 2022-01-15 19:44:22 +08:00
2473309a51 🎇 execute save while delete account 2022-01-15 19:36:37 +08:00
21ca2f11b7 🔧 change default config 2022-01-14 21:20:45 +08:00
ccaa28a323 🔧 change default cdn 2022-01-14 20:56:20 +08:00
fea8b376f8 🔧 change config 2022-01-14 20:10:35 +08:00
55d244b726 🐛 fix mediatrack can't mkdir #351 2022-01-14 18:19:58 +08:00
1640a52789 cancel hide file for admin #343 2022-01-14 18:00:47 +08:00
424ec10692 🐛 fix ftp download error 2022-01-13 22:56:07 +08:00
b472c2ee18 🔒 not allowed delete root folder 2022-01-13 21:23:27 +08:00
65a01251e9 🐛 fix proxy not set status 2022-01-13 20:09:11 +08:00
96be6bbbd1 🐛 fix #334 proxy video can't use range 2022-01-13 19:49:02 +08:00
6f7465aab7 🐛 fix 139Yun no user error 2022-01-13 18:01:54 +08:00
c9bc8227bb 💚 switch back to techknowlogick/xgo 2022-01-12 11:50:26 +08:00
4ea9371c00 💚 fix release actions 2022-01-12 01:04:50 +08:00
326c74cdc6 📝 update readme 2022-01-12 01:00:18 +08:00
f11c2efa8c 🐛 fix package download 2022-01-12 00:37:44 +08:00
6dd8102a82 🔥 Optimize delete cache 2022-01-11 23:45:12 +08:00
ddf6a4955f 🎇 139yun family support 2022-01-11 23:43:55 +08:00
602994e213 ✏️ fix typo 2022-01-10 21:14:26 +08:00
7c7306bf96 💚 Optimize github actions build 2022-01-10 19:48:11 +08:00
c24894b5de 💚 add go install version 2022-01-10 18:23:44 +08:00
e10412c530 💚 change xgo 2022-01-10 18:18:11 +08:00
5c2491b6c3 📝 change LICENSE 2022-01-10 18:11:14 +08:00
036373032c 139yun personal support 2022-01-09 22:44:43 +08:00
db58dabd31 local sort bring folders to the front 2022-01-09 19:28:48 +08:00
7ef98c05fa ✏️ fix typo 2022-01-09 17:22:55 +08:00
6351f43b9b 🔥 Delete alist-proxy.js 2022-01-08 22:47:58 +08:00
6613c8a6c1 🎇 file delete 2022-01-08 22:09:27 +08:00
2b97882b42 🎇 Multiple file upload 2022-01-08 21:29:00 +08:00
925f386bed 🚧 remove customize head and body for manage 2022-01-08 21:10:34 +08:00
0fbbd54b0c 🔧 add required for teambition 2022-01-08 20:23:30 +08:00
12ba1fed00 🔧 add cookie required 2022-01-08 20:18:14 +08:00
2826bac53c 📝 update readme 2022-01-08 20:09:12 +08:00
3e7e9f354f 🎇 189cloud upload 2022-01-08 19:31:12 +08:00
86ff80885d 🐛 fix 189 logged in but get error 2022-01-08 17:19:11 +08:00
be03e34406 🐛 fix proxy onlyLocal 2022-01-07 23:30:22 +08:00
83231becba 🎇 update config file while config struct changed 2022-01-07 20:29:20 +08:00
876ee49fb0 🐛 add range header for pdf.js 2022-01-07 20:28:58 +08:00
10bec31033 support flv preview 2022-01-06 19:26:19 +08:00
fc48f29575 🎇 finish MediaTrack upload 2022-01-06 19:23:55 +08:00
f0d9a452bb 🎇 finish MediaTrack upload 2022-01-06 19:21:13 +08:00
c03a4f83d1 🚧 give up 189 family cloud 2022-01-06 18:45:07 +08:00
10f06fde5c 🚧 MediaTrack support 2022-01-06 01:14:32 +08:00
36b533cb16 🐛 remove empty title folder 2022-01-05 23:25:21 +08:00
6ff2cdab98 🎇 teambition support 2022-01-05 22:03:28 +08:00
ef5cad1bf0 🎇 Pagination #257 2022-01-04 21:21:27 +08:00
b60c7ecd9e 🐛 fix #281 set CHARSET=utf8mb4 2022-01-04 15:16:18 +08:00
68aaa8fee2 🚧 Pagination function 2022-01-03 23:30:57 +08:00
67bf14d428 🎨 Pull away link 2022-01-03 22:48:34 +08:00
c22ff77c89 🎨 use S3ForcePathStyle 2022-01-03 22:37:26 +08:00
84fc0ab1bd 🎇 123pan upload 2022-01-03 22:29:32 +08:00
beb06f2f7f 🔥 replace encoding/json with jsoniter 2022-01-03 20:25:22 +08:00
7cf30836bf 🎨 Optimize code structure 2022-01-03 20:06:36 +08:00
e7ba289d06 🐛 fix upload delete cache 2022-01-02 18:17:04 +08:00
d255ff4fd0 🐛 fix home emoji type 2022-01-02 16:29:17 +08:00
bcf19f4f3e 🐛 fix upload get wrong content-type 2022-01-02 14:51:27 +08:00
efeee0e276 🐛 fix alist path error 2022-01-02 14:44:14 +08:00
e789873eca 🚧 add home emoji setting 2022-01-02 14:38:24 +08:00
524 changed files with 51415 additions and 10240 deletions

.github/FUNDING.yml (new file)

@@ -0,0 +1,13 @@
# These are supported funding model platforms
github: # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2]
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
otechie: # Replace with a single Otechie username
lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry
custom: ['https://alist.nn.ci/guide/sponsor.html']

@@ -1,17 +1,51 @@
name: "Bug report"
description: Bug report
labels: [pending triage]
labels: [bug]
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
Thanks for taking the time to fill out this bug report, please **confirm that your issue is not a duplicate issue and not because of your operation or version issues**
感谢您花时间填写此错误报告,请**务必确认您的issue不是重复的且不是因为您的操作或版本问题**
- type: checkboxes
attributes:
label: Please make sure of the following things
description: |
You must check all the following, otherwise your issue may be closed directly. Or you can go to the [discussions](https://github.com/alist-org/alist/discussions)
您必须勾选以下所有内容否则您的issue可能会被直接关闭。或者您可以去[讨论区](https://github.com/alist-org/alist/discussions)
options:
- label: |
I have read the [documentation](https://alist.nn.ci).
我已经阅读了[文档](https://alist.nn.ci)。
- label: |
I'm sure there are no duplicate issues or discussions.
我确定没有重复的issue或讨论。
- label: |
I'm sure it's due to `AList` and not something else(such as `Dependencies` or `Operational`).
我确定是`AList`的问题,而不是其他原因(例如`依赖`或`操作`)。
- label: |
I'm sure this issue is not fixed in the latest version.
我确定这个问题在最新版本中没有被修复。
- type: input
id: version
attributes:
label: Alist Version / Alist 版本
description: What version of our software are you running?
placeholder: v2.0.0
label: AList Version / AList 版本
description: |
What version of our software are you running? Do not use `latest` or `master` as an answer.
您使用的是哪个版本的软件?请不要使用`latest`或`master`作为答案。
placeholder: v3.xx.xx
validations:
required: true
- type: input
id: driver
attributes:
label: Driver used / 使用的存储驱动
description: |
What storage driver are you using?
您使用的是哪个存储驱动?
placeholder: "for example: Onedrive"
validations:
required: true
- type: textarea
@@ -25,15 +59,23 @@ body:
attributes:
label: Reproduction / 复现链接
description: |
Please provide a link to a repo that can reproduce the problem you ran into.
请提供能复现此问题的链接
Please provide a link to a repo that can reproduce the problem you ran into. Please be aware that your issue may be closed directly if you don't provide it.
请提供能复现此问题的链接请知悉如果不提供它你的issue可能会被直接关闭。
validations:
required: false
required: true
- type: textarea
id: config
attributes:
label: Config / 配置
description: |
Please provide the configuration file of your `AList` application and take a screenshot of the relevant storage configuration. (hide privacy field)
请提供您的`AList`应用的配置文件,并截图相关存储配置。(隐藏隐私字段)
validations:
required: true
- type: textarea
id: logs
attributes:
label: 日志 / Logs
label: Logs / 日志
description: |
Please copy and paste any relevant log output.
请复制粘贴错误日志,或者截图
render: shell

@@ -1,5 +1,5 @@
blank_issues_enabled: false
contact_links:
- name: Questions & Discussions & Feature request
- name: Questions & Discussions
url: https://github.com/Xhofe/alist/discussions
about: Use GitHub discussions for message-board style questions and discussions or feature request.
about: Use GitHub discussions for message-board style questions and discussions.

@@ -0,0 +1,33 @@
name: "Feature request"
description: Feature request
labels: [enhancement]
body:
- type: checkboxes
attributes:
label: Please make sure of the following things
description: You may select more than one, even select all.
options:
- label: I have read the [documentation](https://alist.nn.ci).
- label: I'm sure there are no duplicate issues or discussions.
- label: I'm sure this feature is not implemented.
- label: I'm sure it's a reasonable and popular requirement.
- type: textarea
id: feature-description
attributes:
label: Description of the feature / 需求描述
validations:
required: true
- type: textarea
id: suggested-solution
attributes:
label: Suggested solution / 实现思路
description: |
Solutions to achieve this requirement.
实现此需求的解决思路。
- type: textarea
id: additional-context
attributes:
label: Additional context / 附件
description: |
Any other context or screenshots about the feature request here, or information you find helpful.
相关的任何其他上下文或截图,或者你觉得有帮助的信息

.github/config.yml (new file)

@@ -0,0 +1,21 @@
# Configuration for welcome - https://github.com/behaviorbot/welcome
# Configuration for new-issue-welcome - https://github.com/behaviorbot/new-issue-welcome
# Comment to be posted to on first time issues
newIssueWelcomeComment: >
Thanks for opening your first issue here! Be sure to follow the issue template!
# Configuration for new-pr-welcome - https://github.com/behaviorbot/new-pr-welcome
# Comment to be posted to on PRs from first time contributors in your repository
newPRWelcomeComment: >
Thanks for opening this pull request! Please check out our contributing guidelines.
# Configuration for first-pr-merge - https://github.com/behaviorbot/first-pr-merge
# Comment to be posted to on pull requests merged by a first time user
firstPRMergeComment: >
Congrats on merging your first pull request! We here at behavior bot are proud of you!
# It is recommend to include as many gifs and emojis as possible

.github/stale.yml (new file)

@@ -0,0 +1,19 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 44
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 20
# Issues with these labels will never be considered stale
exemptLabels:
- accepted
- security
# Label to use when marking an issue as stale
staleLabel: stale
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
This issue has been automatically marked as stale because it has not had
recent activity. It will be closed if no further activity occurs. Thank you
for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: >
This issue was closed due to inactive more than 52 days. You can reopen or
recreate it if you think it should continue. Thank you for your contributions again.

.github/workflows/auto_lang.yml (new file)

@@ -0,0 +1,67 @@
name: auto_lang
on:
push:
branches:
- 'main'
paths:
- 'drivers/**'
- 'internal/bootstrap/data/setting.go'
- 'internal/conf/const.go'
- 'cmd/lang.go'
workflow_dispatch:
jobs:
auto_lang:
strategy:
matrix:
platform: [ ubuntu-latest ]
go-version: [ '1.20' ]
name: auto generate lang.json
runs-on: ${{ matrix.platform }}
steps:
- name: Setup go
uses: actions/setup-go@v4
with:
go-version: ${{ matrix.go-version }}
- name: Checkout alist
uses: actions/checkout@v3
with:
path: alist
- name: Checkout alist-web
uses: actions/checkout@v3
with:
repository: 'alist-org/alist-web'
ref: main
persist-credentials: false
fetch-depth: 0
path: alist-web
- name: Generate lang
run: |
cd alist
go run ./main.go lang
cd ..
- name: Copy lang file
run: |
cp -f ./alist/lang/*.json ./alist-web/src/lang/en/ 2>/dev/null || :
- name: Commit git
run: |
cd alist-web
git add .
git config --local user.email "bot@nn.ci"
git config --local user.name "IlaBot"
git commit -m "chore: auto update i18n file" -a 2>/dev/null || :
cd ..
- name: Push lang files
uses: ad-m/github-push-action@master
with:
github_token: ${{ secrets.MY_TOKEN }}
branch: main
directory: alist-web
repository: alist-org/alist-web

@@ -2,58 +2,40 @@ name: build
on:
push:
branches: [ v2 ]
branches: [ 'main' ]
pull_request:
branches: [ v2 ]
branches: [ 'main' ]
jobs:
build:
strategy:
matrix:
platform: [ubuntu-latest]
go-version: [1.17]
go-version: [ '1.20' ]
name: Build
runs-on: ${{ matrix.platform }}
steps:
- name: Setup Go
uses: actions/setup-go@v2
uses: actions/setup-go@v4
with:
go-version: ${{ matrix.go-version }}
- name: Set up Node
uses: actions/setup-node@v2
with:
node-version: '16'
# - name: Setup docker
# uses: docker-practice/actions-setup-docker@master
- name: Checkout
uses: actions/checkout@v2
with:
ref: v2
path: alist
uses: actions/checkout@v3
- name: Checkout web repo
uses: actions/checkout@v2
with:
repository: Xhofe/alist-web
ref: v2
path: alist-web
- name: Set up xgo
- name: Install dependencies
run: |
docker pull techknowlogick/xgo:latest
go install src.techknowlogick.com/xgo@latest
sudo snap install zig --classic --beta
docker pull crazymax/xgo:latest
go install github.com/crazy-max/xgo@latest
sudo apt install upx
- name: Build
run: |
mv alist/build.sh .
bash build.sh
bash build.sh dev
- name: Upload artifact
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v3
with:
name: artifact
path: alist/build
name: alist
path: dist

.github/workflows/build_docker.yml (new file)

@@ -0,0 +1,65 @@
name: build_docker
on:
push:
branches: [ main ]
jobs:
build_docker:
name: Build docker
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Docker meta
id: meta
uses: docker/metadata-action@v4
with:
images: xhofe/alist
- name: Replace release with dev
run: |
sed -i 's/release/dev/g' Dockerfile
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: xhofe
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and push
id: docker_build
uses: docker/build-push-action@v4
with:
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
platforms: linux/amd64,linux/arm64
build_docker_with_aria2:
needs: build_docker
name: Build docker with aria2
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v3
with:
repository: alist-org/with_aria2
ref: main
persist-credentials: false
fetch-depth: 0
- name: Commit
run: |
git config --local user.email "bot@nn.ci"
git config --local user.name "IlaBot"
git commit --allow-empty -m "Trigger build for ${{ github.sha }}"
- name: Push commit
uses: ad-m/github-push-action@master
with:
github_token: ${{ secrets.MY_TOKEN }}
branch: main
repository: alist-org/with_aria2

.github/workflows/changelog.yml (new file)

@@ -0,0 +1,19 @@
name: auto changelog
on:
push:
tags:
- '*'
jobs:
changelog:
name: Create Release
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
with:
fetch-depth: 0
- run: npx changelogithub # or changelogithub@0.12 if ensure the stable result
env:
GITHUB_TOKEN: ${{secrets.MY_TOKEN}}

@@ -1,48 +0,0 @@
name: docker
on:
push:
branches:
- 'v2'
tags:
- 'v*'
pull_request:
branches:
- 'v2'
jobs:
docker:
name: Docker
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Docker meta
id: meta
uses: docker/metadata-action@v3
with:
images: xhofe/alist
- name: Set up Node
uses: actions/setup-node@v2
with:
node-version: '16'
- name: Build web
run: bash build.sh web
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: xhofe
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and push
id: docker_build
uses: docker/build-push-action@v2
with:
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/386,linux/arm/v6,linux/s390x

@@ -0,0 +1,22 @@
name: Close need info
on:
schedule:
- cron: "0 0 */1 * *"
workflow_dispatch:
jobs:
close-need-info:
runs-on: ubuntu-latest
steps:
- name: close-issues
uses: actions-cool/issues-helper@v3
with:
actions: 'close-issues'
token: ${{ secrets.GITHUB_TOKEN }}
labels: 'question'
inactive-day: 3
close-reason: 'not_planned'
body: |
Hello @${{ github.event.issue.user.login }}, this issue was closed due to no activities in 3 days.
你好 @${{ github.event.issue.user.login }}此issue因超过3天未回复被关闭。

.github/workflows/issue_close_stale.yml (new file)

@@ -0,0 +1,21 @@
name: Close inactive
on:
schedule:
- cron: "0 0 */7 * *"
workflow_dispatch:
jobs:
close-inactive:
runs-on: ubuntu-latest
steps:
- name: close-issues
uses: actions-cool/issues-helper@v3
with:
actions: 'close-issues'
token: ${{ secrets.GITHUB_TOKEN }}
labels: 'stale'
inactive-day: 8
close-reason: 'not_planned'
body: |
Hello @${{ github.event.issue.user.login }}, this issue was closed due to inactive more than 52 days. You can reopen or recreate it if you think it should continue. Thank you for your contributions again.

.github/workflows/issue_duplicate.yml (new file)

@@ -0,0 +1,25 @@
name: Issue Duplicate
on:
issues:
types: [labeled]
jobs:
create-comment:
runs-on: ubuntu-latest
if: github.event.label.name == 'duplicate'
steps:
- name: Create comment
uses: actions-cool/issues-helper@v3
with:
actions: 'create-comment'
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
body: |
Hello @${{ github.event.issue.user.login }}, your issue is a duplicate and will be closed.
你好 @${{ github.event.issue.user.login }}你的issue是重复的将被关闭。
- name: Close issue
uses: actions-cool/issues-helper@v3
with:
actions: 'close-issue'
token: ${{ secrets.GITHUB_TOKEN }}

.github/workflows/issue_invalid.yml (new file)

@@ -0,0 +1,25 @@
name: Issue Invalid
on:
issues:
types: [labeled]
jobs:
create-comment:
runs-on: ubuntu-latest
if: github.event.label.name == 'invalid'
steps:
- name: Create comment
uses: actions-cool/issues-helper@v3
with:
actions: 'create-comment'
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
body: |
Hello @${{ github.event.issue.user.login }}, your issue is invalid and will be closed.
你好 @${{ github.event.issue.user.login }}你的issue无效将被关闭。
- name: Close issue
uses: actions-cool/issues-helper@v3
with:
actions: 'close-issue'
token: ${{ secrets.GITHUB_TOKEN }}

.github/workflows/issue_question.yml (new file)

@@ -0,0 +1,20 @@
name: Issue Question
on:
issues:
types: [labeled]
jobs:
create-comment:
runs-on: ubuntu-latest
if: github.event.label.name == 'question'
steps:
- name: Create comment
uses: actions-cool/issues-helper@v3.5.1
with:
actions: 'create-comment'
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
body: |
Hello @${{ github.event.issue.user.login }}, please input issue by template and add detail. Issues labeled by `question` will be closed if no activities in 3 days.
你好 @${{ github.event.issue.user.login }}请按照issue模板填写, 并详细说明问题/日志记录/复现步骤/复现链接/实现思路或提供更多信息等, 3天内未回复issue自动关闭。

.github/workflows/issue_rm_working.yml (new file)

@@ -0,0 +1,17 @@
name: Remove working label when issue closed
on:
issues:
types: [closed]
jobs:
rm-working:
runs-on: ubuntu-latest
steps:
- name: Remove working label
uses: actions-cool/issues-helper@v3
with:
actions: 'remove-labels'
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
labels: 'working'

.github/workflows/issue_similarity.yml (new file)

@@ -0,0 +1,19 @@
name: Issues Similarity Analysis
on:
issues:
types: [opened, edited]
jobs:
similarity-analysis:
runs-on: ubuntu-latest
steps:
- name: analysis
uses: actions-cool/issues-similarity-analysis@v1
with:
filter-threshold: 0.5
comment-title: '### See'
comment-body: '${index}. ${similarity} #${number}'
show-footer: false
show-mentioned: true
since-days: 730

.github/workflows/issue_translate.yml (new file)

@@ -0,0 +1,13 @@
name: Translation Helper
on:
pull_request_target:
types: [opened]
issues:
types: [opened]
jobs:
translate:
runs-on: ubuntu-latest
steps:
- uses: actions-cool/translation-helper@v1.2.0

.github/workflows/issue_wontfix.yml (new file)

@@ -0,0 +1,25 @@
name: Issue Wontfix
on:
issues:
types: [labeled]
jobs:
lock-issue:
runs-on: ubuntu-latest
if: github.event.label.name == 'wontfix'
steps:
- name: Create comment
uses: actions-cool/issues-helper@v3
with:
actions: 'create-comment'
token: ${{ secrets.GITHUB_TOKEN }}
issue-number: ${{ github.event.issue.number }}
body: |
Hello @${{ github.event.issue.user.login }}, this issue will not be worked on and will be closed.
你好 @${{ github.event.issue.user.login }},这不会被处理,将被关闭。
- name: Close issue
uses: actions-cool/issues-helper@v3
with:
actions: 'close-issue'
token: ${{ secrets.GITHUB_TOKEN }}

@@ -1,69 +1,75 @@
name: release
on:
push:
tags:
- '*'
release:
types: [ published ]
jobs:
release:
strategy:
matrix:
platform: [ubuntu-latest]
go-version: [1.17]
platform: [ ubuntu-latest ]
go-version: [ '1.20' ]
name: Release
runs-on: ${{ matrix.platform }}
steps:
- name: Prerelease
uses: irongut/EditRelease@v1.2.0
with:
token: ${{ secrets.MY_TOKEN }}
id: ${{ github.event.release.id }}
prerelease: true
- name: Setup Go
uses: actions/setup-go@v2
uses: actions/setup-go@v4
with:
go-version: ${{ matrix.go-version }}
# - name: Setup docker
# uses: docker-practice/actions-setup-docker@master
- name: Setup Node
uses: actions/setup-node@v2
with:
node-version: '16'
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v3
with:
ref: v2
path: alist
persist-credentials: false
fetch-depth: 0
- name: Checkout web repo
uses: actions/checkout@v2
with:
repository: Xhofe/alist-web
ref: v2
path: alist-web
persist-credentials: false
fetch-depth: 0
- name: Set up xgo
- name: Install dependencies
run: |
docker pull techknowlogick/xgo:latest
go install src.techknowlogick.com/xgo@latest
sudo snap install zig --classic --beta
docker pull crazymax/xgo:latest
go install github.com/crazy-max/xgo@latest
sudo apt install upx
- name: Build
run: |
mv alist/build.sh .
bash build.sh release
- name: Upload asserts files
- name: Upload assets
uses: softprops/action-gh-release@v1
with:
files: build/compress/*
prerelease: false
release_desktop:
needs: release
name: Release desktop
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v3
with:
repository: alist-org/desktop-release
ref: main
persist-credentials: false
fetch-depth: 0
- name: Add tag
run: |
git config --local user.email "bot@nn.ci"
git config --local user.name "IlaBot"
version=$(wget -qO- -t1 -T2 "https://api.github.com/repos/alist-org/alist/releases/latest" | grep "tag_name" | head -n 1 | awk -F ":" '{print $2}' | sed 's/\"//g;s/,//g;s/ //g')
git tag -a $version -m "release $version"
- name: Push tags
uses: ad-m/github-push-action@master
with:
github_token: ${{ secrets.MY_TOKEN }}
branch: cdn
directory: alist-web
repository: Xhofe/alist-web
- name: Release
uses: softprops/action-gh-release@v1
with:
files: alist/build/compress/*
branch: main
repository: alist-org/desktop-release

.github/workflows/release_docker.yml (new file)

@@ -0,0 +1,68 @@
name: release_docker
on:
push:
tags:
- '*'
jobs:
release_docker:
name: Release Docker
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Docker meta
id: meta
uses: docker/metadata-action@v4
with:
images: xhofe/alist
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: xhofe
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and push
id: docker_build
uses: docker/build-push-action@v4
with:
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/386,linux/arm/v6,linux/s390x
release_docker_with_aria2:
needs: release_docker
name: Release docker with aria2
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v3
with:
repository: alist-org/with_aria2
ref: main
persist-credentials: false
fetch-depth: 0
- name: Add tag
run: |
git config --local user.email "bot@nn.ci"
git config --local user.name "IlaBot"
git tag -a ${{ github.ref_name }} -m "release ${{ github.ref_name }}"
- name: Push tags
uses: ad-m/github-push-action@master
with:
github_token: ${{ secrets.MY_TOKEN }}
branch: main
repository: alist-org/with_aria2

@@ -0,0 +1,34 @@
name: release_linux_musl_arm
on:
release:
types: [ published ]
jobs:
release_arm:
strategy:
matrix:
platform: [ ubuntu-latest ]
go-version: [ '1.20' ]
name: Release
runs-on: ${{ matrix.platform }}
steps:
- name: Setup Go
uses: actions/setup-go@v4
with:
go-version: ${{ matrix.go-version }}
- name: Checkout
uses: actions/checkout@v3
with:
fetch-depth: 0
- name: Build
run: |
bash build.sh release linux_musl_arm
- name: Upload assets
uses: softprops/action-gh-release@v1
with:
files: build/compress/*

.gitignore

@@ -1,7 +1,7 @@
.idea/
.DS_Store
output/
dist/
/dist/
# Binaries for programs and plugins
*.exe
@@ -20,10 +20,14 @@ dist/
# Dependency directories (remove the comment below to include it)
# vendor/
bin/*
/alist
/bin/*
*.json
public/index.html
public/assets/
public/public/
data/
/build
/data/
/log/
/lang/
/daemon/
/public/dist/*
/!public/dist/README.md
.VSCodeCounter

CODE_OF_CONDUCT.md (new file)

@@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
i@nn.ci.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.

CONTRIBUTING.md (new file)

@@ -0,0 +1,107 @@
# Contributing
## Setup your machine
`alist` is written in [Go](https://golang.org/) and [React](https://reactjs.org/).
Prerequisites:
- [git](https://git-scm.com)
- [Go 1.20+](https://golang.org/doc/install)
- [gcc](https://gcc.gnu.org/)
- [nodejs](https://nodejs.org/)
Clone `alist` and `alist-web` anywhere:
```shell
$ git clone https://github.com/alist-org/alist.git
$ git clone --recurse-submodules https://github.com/alist-org/alist-web.git
```
You should switch to the `main` branch for development.
## Preview your change
### backend
```shell
$ go run main.go
```
### frontend
```shell
$ pnpm dev
```
## Add a new driver
Copy `drivers/template` folder and rename it, and follow the comments in it.
## Create a commit
Commit messages should be well formatted, and to make that "standardized".
### Commit Message Format
Each commit message consists of a **header**, a **body** and a **footer**. The header has a special
format that includes a **type**, a **scope** and a **subject**:
```
<type>(<scope>): <subject>
<BLANK LINE>
<body>
<BLANK LINE>
<footer>
```
The **header** is mandatory and the **scope** of the header is optional.
Any line of the commit message cannot be longer than 100 characters! This allows the message to be easier
to read on GitHub as well as in various git tools.
### Revert
If the commit reverts a previous commit, it should begin with `revert: `, followed by the header
of the reverted commit.
In the body it should say: `This reverts commit <hash>.`, where the hash is the SHA of the commit
being reverted.
### Type
Must be one of the following:
* **feat**: A new feature
* **fix**: A bug fix
* **docs**: Documentation only changes
* **style**: Changes that do not affect the meaning of the code (white-space, formatting, missing
semi-colons, etc)
* **refactor**: A code change that neither fixes a bug nor adds a feature
* **perf**: A code change that improves performance
* **test**: Adding missing or correcting existing tests
* **build**: Affects project builds or dependency modifications
* **revert**: Restore the previous commit
* **ci**: Continuous integration of related file modifications
* **chore**: Changes to the build process or auxiliary tools and libraries such as documentation
generation
* **release**: Release a new version
### Scope
The scope could be anything specifying place of the commit change. For example `$location`,
`$browser`, `$compile`, `$rootScope`, `ngHref`, `ngClick`, `ngView`, etc...
You can use `*` when the change affects more than a single scope.
### Subject
The subject contains succinct description of the change:
* use the imperative, present tense: "change" not "changed" nor "changes"
* don't capitalize first letter
* no dot (.) at the end
### Body
Just as in the **subject**, use the imperative, present tense: "change" not "changed" nor "changes".
The body should include the motivation for the change and contrast this with previous behavior.
### Footer
The footer should contain any information about **Breaking Changes** and is also the place to
[reference GitHub issues that this commit closes](https://help.github.com/articles/closing-issues-via-commit-messages/).
**Breaking Changes** should start with the word `BREAKING CHANGE:` with a space or two newlines.
The rest of the commit message is then used for this.
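For illustration only, a commit message that follows the convention above might look like this; the scope, subject, and issue number are invented:
```
fix(webdav): return 423 for locked resources

Return 423 Locked instead of 500 when a WebDAV resource is locked,
so clients can retry instead of failing outright.

Closes #1234
```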
## Submit a pull request
Push your branch to your `alist` fork and open a pull request against the
`main` branch.

@@ -1,14 +1,18 @@
FROM alpine:edge as builder
FROM alpine:3.18 as builder
LABEL stage=go-builder
WORKDIR /app/
COPY ./ ./
RUN apk add --no-cache bash git go gcc musl-dev; \
sh build.sh docker
RUN apk add --no-cache bash curl gcc git go musl-dev; \
bash build.sh release docker
FROM alpine:edge
FROM alpine:3.18
LABEL MAINTAINER="i@nn.ci"
VOLUME /opt/alist/data/
WORKDIR /opt/alist/
COPY --from=builder /app/bin/alist ./
EXPOSE 5244
CMD [ "./alist" ]
COPY entrypoint.sh /entrypoint.sh
RUN apk add --no-cache bash ca-certificates su-exec tzdata; \
chmod +x /entrypoint.sh
ENV PUID=0 PGID=0 UMASK=022
EXPOSE 5244 5245
CMD [ "/entrypoint.sh" ]

LICENSE

@@ -1,21 +1,661 @@
MIT License
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007
Copyright (c) 2020 Xhofe
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
Preamble
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.
A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.
The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.
An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU Affero General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Remote Network Interaction; Use with the GNU General Public License.
Notwithstanding any other provision of this License, if you modify the
Program, your modified version must prominently offer all users
interacting with it remotely through a computer network (if your version
supports such interaction) an opportunity to receive the Corresponding
Source of your version by providing access to the Corresponding Source
from a network server at no charge, through some standard or customary
means of facilitating copying of software. This Corresponding Source
shall include the Corresponding Source for any work covered by version 3
of the GNU General Public License that is incorporated pursuant to the
following paragraph.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the work with which it is combined will remain governed by version
3 of the GNU General Public License.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU Affero General Public License from time to time. Such new versions
will be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU Affero General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU Affero General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU Affero General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published
by the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If your software can interact with users remotely through a computer
network, you should also make sure that it provides a way for users to
get its source. For example, if your program is a web application, its
interface could display a "Source" link that leads users to an archive
of the code. There are many ways you could offer source, and different
solutions will be better for different programs; see section 13 for the
specific requirements.
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU AGPL, see
<https://www.gnu.org/licenses/>.

README.md

@@ -1,72 +1,138 @@
<div align="center">
<h3><a href="https://alist.nn.ci">Alist</a></h3>
<p><em>🗂Another file list program that supports multiple storage, powered by Gin and React.</em></p>
<a href="https://github.com/Xhofe/alist/releases"><img src="https://img.shields.io/github/release/Xhofe/alist?style=flat-square" alt="latest version"></a>
<a href="https://github.com/Xhofe/alist/discussions"><img src="https://img.shields.io/github/discussions/Xhofe/alist?color=%23ED8936&style=flat-square" alt="discussions"></a>
<a href="https://github.com/Xhofe/alist/actions?query=workflow%3ABuild"><img src="https://img.shields.io/github/workflow/status/Xhofe/alist/build?style=flat-square" alt="Build status"></a>
<a href="https://github.com/Xhofe/alist/releases"><img src="https://img.shields.io/github/downloads/Xhofe/alist/total?style=flat-square&color=%239F7AEA" alt="Downloads"></a>
<a href="https://github.com/Xhofe/alist/blob/v2/LICENSE"><img src="https://img.shields.io/github/license/Xhofe/alist?style=flat-square" alt="License"></a>
<a href="https://pay.xhofe.top">
<img src="https://img.shields.io/badge/%24-donate-ff69b4.svg?style=flat-square" alt="donate">
<a href="https://alist.nn.ci"><img height="100px" alt="logo" src="https://cdn.jsdelivr.net/gh/alist-org/logo@main/logo.svg"/></a>
<p><em>🗂A file list program that supports multiple storages, powered by Gin and Solidjs.</em></p>
<div>
<a href="https://goreportcard.com/report/github.com/alist-org/alist/v3">
<img src="https://goreportcard.com/badge/github.com/alist-org/alist/v3" alt="latest version" />
</a>
<a href="https://github.com/Xhofe/alist/blob/main/LICENSE">
<img src="https://img.shields.io/github/license/Xhofe/alist" alt="License" />
</a>
<a href="https://github.com/Xhofe/alist/actions?query=workflow%3ABuild">
<img src="https://img.shields.io/github/actions/workflow/status/Xhofe/alist/build.yml?branch=main" alt="Build status" />
</a>
<a href="https://github.com/Xhofe/alist/releases">
<img src="https://img.shields.io/github/release/Xhofe/alist" alt="latest version" />
</a>
<a title="Crowdin" target="_blank" href="https://crwd.in/alist">
<img src="https://badges.crowdin.net/alist/localized.svg">
</a>
</div>
<div>
<a href="https://github.com/Xhofe/alist/discussions">
<img src="https://img.shields.io/github/discussions/Xhofe/alist?color=%23ED8936" alt="discussions" />
</a>
<a href="https://discord.gg/F4ymsH4xv2">
<img src="https://img.shields.io/discord/1018870125102895134?logo=discord" alt="discussions" />
</a>
<a href="https://github.com/Xhofe/alist/releases">
<img src="https://img.shields.io/github/downloads/Xhofe/alist/total?color=%239F7AEA&logo=github" alt="Downloads" />
</a>
<a href="https://hub.docker.com/r/xhofe/alist">
<img src="https://img.shields.io/docker/pulls/xhofe/alist?color=%2348BB78&logo=docker&label=pulls" alt="Downloads" />
</a>
<a href="https://alist.nn.ci/guide/sponsor.html">
<img src="https://img.shields.io/badge/%24-sponsor-F87171.svg" alt="sponsor" />
</a>
</div>
</div>
---
English | [中文](./README_cn.md)
English | [中文](./README_cn.md)| [日本語](./README_ja.md) | [Contributing](./CONTRIBUTING.md) | [CODE_OF_CONDUCT](./CODE_OF_CONDUCT.md)
## Features
- [x] multiple storage
- [x] Local storage
- [x] [aliyundrive](https://www.aliyundrive.com/)
- [x] OneDrive / Sharepoint ([global](https://www.office.com/), [cn](https://portal.partner.microsoftonline.cn),de,us
- [x] [189cloud](https://cloud.189.cn)
- [x] [GoogleDrive](https://drive.google.com/)
- [x] [123pan](https://www.123pan.com/)
- [x] [lanzou](https://pc.woozooo.com/)
- [x] [Alist](https://github.com/Xhofe/alist)
- [x] FTP
- [x] [PikPak](https://www.mypikpak.com/)
- [x] [ShandianPan](https://shandianpan.com/)
- [x] [S3](https://aws.amazon.com/s3/)
- [x] WebDav
- [x] Multiple storage
- [x] Local storage
- [x] [Aliyundrive](https://www.aliyundrive.com/)
- [x] OneDrive / Sharepoint ([global](https://www.office.com/), [cn](https://portal.partner.microsoftonline.cn),de,us)
- [x] [189cloud](https://cloud.189.cn) (Personal, Family)
- [x] [GoogleDrive](https://drive.google.com/)
- [x] [123pan](https://www.123pan.com/)
- [x] FTP / SFTP
- [x] [PikPak](https://www.mypikpak.com/)
- [x] [S3](https://aws.amazon.com/s3/)
- [x] [Seafile](https://seafile.com/)
- [x] [UPYUN Storage Service](https://www.upyun.com/products/file-storage)
- [x] WebDav(Support OneDrive/SharePoint without API)
- [x] Teambition([China](https://www.teambition.com/ ),[International](https://us.teambition.com/ ))
- [x] [Mediatrack](https://www.mediatrack.cn/)
- [x] [139yun](https://yun.139.com/) (Personal, Family)
- [x] [YandexDisk](https://disk.yandex.com/)
- [x] [BaiduNetdisk](http://pan.baidu.com/)
- [x] [Terabox](https://www.terabox.com/main)
- [x] [UC](https://drive.uc.cn)
- [x] [Quark](https://pan.quark.cn)
- [x] [Thunder](https://pan.xunlei.com)
- [x] [Lanzou](https://www.lanzou.com/)
- [x] [Aliyundrive share](https://www.aliyundrive.com/)
- [x] [Google photo](https://photos.google.com/)
- [x] [Mega.nz](https://mega.nz)
- [x] [Baidu photo](https://photo.baidu.com/)
- [x] SMB
- [x] [115](https://115.com/)
- [X] Cloudreve
- [x] [Dropbox](https://www.dropbox.com/)
- [x] Easy to deploy and out-of-the-box
- [x] File preview (PDF, markdown, code, plain text, ...)
- [x] Image preview in gallery mode
- [x] Video and audio preview (mp4, mp3, ...)
- [x] Video and audio preview, support lyrics and subtitles
- [x] Office documents preview (docx, pptx, xlsx, ...)
- [x] `README.md` preview rendering
- [x] File permalink copy and direct file download
- [x] Dark mode
- [x] I18n
- [x] Protected routes (password protection and authentication)
- [x] WebDav (A small part readonly, see https://alist-doc.nn.ci/en/docs/intro for details)
- [x] Protected routes (password protection and authentication)
- [x] WebDav (see https://alist.nn.ci/guide/webdav.html for details)
- [x] [Docker Deploy](https://hub.docker.com/r/xhofe/alist)
- [x] Cloudflare workers proxy
- [x] File/Folder package download
- [x] Support video list playback and subtitles(ass,srt,vtt)
- [x] Web upload(Can allow visitors to upload)
## Discussion
Please go to our [discussion forum](https://github.com/Xhofe/alist/discussions) for general questions, **issues are for bug reports only.**
## Demo
Available at: <https://alist.nn.ci>.
![demo](https://inews.gtimg.com/newsapp_ls/0/14256614096/0)
- [x] Web upload(Can allow visitors to upload), delete, mkdir, rename, move and copy
- [x] Offline download
- [x] Copy files between two storage
- [x] Multi-thread downloading acceleration for single-thread download/stream
## Document
<https://alist-doc.nn.ci/en/>
<https://alist.nn.ci/>
## Demo
<https://al.nn.ci>
## Discussion
Please go to our [discussion forum](https://github.com/Xhofe/alist/discussions) for general questions; **issues are for bug reports and feature requests only.**
## Sponsor
AList is open-source software. If you happen to like this project and want me to keep it going, please consider sponsoring me or making a one-time donation! Thanks for all the love and support:
https://alist.nn.ci/guide/sponsor.html
### Special sponsors
- [亚洲云 - 高防服务器|服务器租用|福州高防|广东电信|香港服务器|美国服务器|海外服务器 - 国内靠谱的企业级云计算服务提供商](https://www.asiayun.com/aff/QQCOOQKZ) (sponsored Chinese API server)
- [找资源 - 阿里云盘资源搜索引擎](https://zhaoziyuan.pw/)
- [JetBrains: Essential tools for software developers and teams](https://www.jetbrains.com/)
## Contributors
Thanks goes to these wonderful people:
[![Contributors](http://contributors.nn.ci/api?repo=alist-org/alist&repo=alist-org/alist-web&repo=alist-org/docs)](https://github.com/alist-org/alist/graphs/contributors)
## License
`AList` is open-source software licensed under the MIT license.
`AList` is open-source software licensed under the AGPL-3.0 license.
## Disclaimer
- This program is a free and open-source project. It is designed to share files stored on network drives, make them convenient to download, and serve as a way to learn Golang. Please abide by relevant laws and regulations when using it, and do not abuse it;
- This program is implemented by calling official SDKs/interfaces, without breaking the behavior of the official interfaces;
- This program only performs 302 redirects/traffic forwarding, and does not intercept, store, or tamper with any user data;
- Before using this program, you should understand and bear the corresponding risks, including but not limited to account bans and download speed limits, which are unrelated to this program;
- If there is any infringement, please contact me by [email](mailto:i@nn.ci), and it will be dealt with promptly.
---
> [@Blog](https://www.nn.ci/) · [@GitHub](https://github.com/Xhofe)
> [@Blog](https://nn.ci/) · [@GitHub](https://github.com/Xhofe) · [@TelegramGroup](https://t.me/alist_chat) · [@Discord](https://discord.gg/F4ymsH4xv2)

README_cn.md

@@ -1,71 +1,136 @@
<div align="center">
<h3><a href="https://alist.nn.ci">Alist</a></h3>
<p><em>🗂一个支持多存储的文件列表程序,使用 Gin 和 React </em></p>
<a href="https://github.com/Xhofe/alist/releases"><img src="https://img.shields.io/github/release/Xhofe/alist?style=flat-square" alt="latest version"></a>
<a href="https://github.com/Xhofe/alist/discussions"><img src="https://img.shields.io/github/discussions/Xhofe/alist?color=%23ED8936&style=flat-square" alt="discussions"></a>
<a href="https://github.com/Xhofe/alist/actions?query=workflow%3ABuild"><img src="https://img.shields.io/github/workflow/status/Xhofe/alist/build?style=flat-square" alt="Build status"></a>
<a href="https://github.com/Xhofe/alist/releases"><img src="https://img.shields.io/github/downloads/Xhofe/alist/total?style=flat-square&color=%239F7AEA" alt="Downloads"></a>
<a href="https://github.com/Xhofe/alist/blob/v2/LICENSE"><img src="https://img.shields.io/github/license/Xhofe/alist?style=flat-square" alt="License"></a>
<a href="https://pay.xhofe.top">
<img src="https://img.shields.io/badge/%24-donate-ff69b4.svg?style=flat-square" alt="donate">
<a href="https://alist.nn.ci"><img height="100px" alt="logo" src="https://cdn.jsdelivr.net/gh/alist-org/logo@main/logo.svg"/></a>
<p><em>🗂一个支持多存储的文件列表程序,使用 Gin 和 Solidjs</em></p>
<div>
<a href="https://goreportcard.com/report/github.com/alist-org/alist/v3">
<img src="https://goreportcard.com/badge/github.com/alist-org/alist/v3" alt="latest version" />
</a>
<a href="https://github.com/Xhofe/alist/blob/main/LICENSE">
<img src="https://img.shields.io/github/license/Xhofe/alist" alt="License" />
</a>
<a href="https://github.com/Xhofe/alist/actions?query=workflow%3ABuild">
<img src="https://img.shields.io/github/actions/workflow/status/Xhofe/alist/build.yml?branch=main" alt="Build status" />
</a>
<a href="https://github.com/Xhofe/alist/releases">
<img src="https://img.shields.io/github/release/Xhofe/alist" alt="latest version" />
</a>
<a title="Crowdin" target="_blank" href="https://crwd.in/alist">
<img src="https://badges.crowdin.net/alist/localized.svg">
</a>
</div>
<div>
<a href="https://github.com/Xhofe/alist/discussions">
<img src="https://img.shields.io/github/discussions/Xhofe/alist?color=%23ED8936" alt="discussions" />
</a>
<a href="https://discord.gg/F4ymsH4xv2">
<img src="https://img.shields.io/discord/1018870125102895134?logo=discord" alt="discussions" />
</a>
<a href="https://github.com/Xhofe/alist/releases">
<img src="https://img.shields.io/github/downloads/Xhofe/alist/total?color=%239F7AEA&logo=github" alt="Downloads" />
</a>
<a href="https://hub.docker.com/r/xhofe/alist">
<img src="https://img.shields.io/docker/pulls/xhofe/alist?color=%2348BB78&logo=docker&label=pulls" alt="Downloads" />
</a>
<a href="https://alist.nn.ci/zh/guide/sponsor.html">
<img src="https://img.shields.io/badge/%24-sponsor-F87171.svg" alt="sponsor" />
</a>
</div>
</div>
---
[English](./README.md) | 中文
[English](./README.md) | 中文 | [日本語](./README_ja.md) | [Contributing](./CONTRIBUTING.md) | [CODE_OF_CONDUCT](./CODE_OF_CONDUCT.md)
## 支持
## 功能
- [x] 多种存储
- [x] 本地存储
- [x] [阿里云盘](https://www.aliyundrive.com/)
- [x] OneDrive / Sharepoint[国际版](https://www.office.com/), [世纪互联](https://portal.partner.microsoftonline.cn),de,us
- [x] [天翼云盘](https://cloud.189.cn)
- [x] [GoogleDrive](https://drive.google.com/)
- [x] [123云盘](https://www.123pan.com/)
- [x] [蓝奏云](https://pc.woozooo.com/)
- [x] [Alist](https://github.com/Xhofe/alist)
- [x] FTP
- [x] [PikPak](https://www.mypikpak.com/)
- [x] [闪电盘](https://shandianpan.com/)
- [x] [S3](https://aws.amazon.com/cn/s3/)
- [x] WebDav
- [x] 本地存储
- [x] [阿里云盘](https://www.aliyundrive.com/)
- [x] OneDrive / Sharepoint[国际版](https://www.office.com/), [世纪互联](https://portal.partner.microsoftonline.cn),de,us
- [x] [天翼云盘](https://cloud.189.cn) (个人云, 家庭云)
- [x] [GoogleDrive](https://drive.google.com/)
- [x] [123云盘](https://www.123pan.com/)
- [x] FTP / SFTP
- [x] [PikPak](https://www.mypikpak.com/)
- [x] [S3](https://aws.amazon.com/cn/s3/)
- [x] [Seafile](https://seafile.com/)
- [x] [又拍云对象存储](https://www.upyun.com/products/file-storage)
- [x] WebDav(支持无API的OneDrive/SharePoint)
- [x] Teambition[中国](https://www.teambition.com/ )[国际](https://us.teambition.com/ )
- [x] [分秒帧](https://www.mediatrack.cn/)
- [x] [和彩云](https://yun.139.com/) (个人云, 家庭云)
- [x] [Yandex.Disk](https://disk.yandex.com/)
- [x] [百度网盘](http://pan.baidu.com/)
- [x] [UC网盘](https://drive.uc.cn)
- [x] [夸克网盘](https://pan.quark.cn)
- [x] [迅雷网盘](https://pan.xunlei.com)
- [x] [蓝奏云](https://www.lanzou.com/)
- [x] [阿里云盘分享](https://www.aliyundrive.com/)
- [x] [谷歌相册](https://photos.google.com/)
- [x] [Mega.nz](https://mega.nz)
- [x] [一刻相册](https://photo.baidu.com/)
- [x] SMB
- [x] [115](https://115.com/)
- [X] Cloudreve
- [x] [Dropbox](https://www.dropbox.com/)
- [x] 部署方便,开箱即用
- [x] 文件预览(PDF、markdown、代码、纯文本……)
- [x] 画廊模式下的图像预览
- [x] 视频和音频预览(mp4、mp3 等)
- [x] 视频和音频预览,支持歌词和字幕
- [x] Office 文档预览(docx、pptx、xlsx、...)
- [x] `README.md` 预览渲染
- [x] 文件永久链接复制和直接文件下载
- [x] 黑暗模式
- [x] 国际化
- [x] 受保护的路由(密码保护和身份验证)
- [x] WebDav(少部分只读,具体见 https://alist-doc.nn.ci/docs/intro)
- [x] WebDav (具体见 https://alist.nn.ci/zh/guide/webdav.html)
- [x] [Docker 部署](https://hub.docker.com/r/xhofe/alist)
- [x] Cloudflare workers 中转
- [x] 文件/文件夹打包下载
- [x] 支持视频列表播放和字幕(ass,srt,vtt)
- [x] 网页上传(可以允许访客上传)
## 讨论
一般问题请到[讨论论坛](https://github.com/Xhofe/alist/discussions) **issue仅针对错误报告。**
## 演示
<https://alist.nn.ci>
![演示](https://inews.gtimg.com/newsapp_ls/0/14256614096/0)
- [x] 网页上传(可以允许访客上传),删除,新建文件夹,重命名,移动,复制
- [x] 离线下载
- [x] 跨存储复制文件
- [x] 单线程下载/串流的多线程下载加速
## 文档
<https://alist-doc.nn.ci/>
<https://alist.nn.ci/zh/>
## 许可
## Demo
`AList` 是在 MIT 许可下许可的开源软件。
<https://al.nn.ci>
## 讨论
一般问题请到[讨论论坛](https://github.com/Xhofe/alist/discussions) **issue仅针对错误报告和功能请求。**
## 赞助
AList 是一个开源软件如果你碰巧喜欢这个项目并希望我继续下去请考虑赞助我或提供一个单一的捐款感谢所有的爱和支持https://alist.nn.ci/zh/guide/sponsor.html
### 特别赞助
- [亚洲云 - 高防服务器|服务器租用|福州高防|广东电信|香港服务器|美国服务器|海外服务器 - 国内靠谱的企业级云计算服务提供商](https://www.asiayun.com/aff/QQCOOQKZ) (国内API服务器赞助)
- [找资源 - 阿里云盘资源搜索引擎](https://zhaoziyuan.pw/)
- [JetBrains: Essential tools for software developers and teams](https://www.jetbrains.com/)
## 贡献者
Thanks goes to these wonderful people:
[![Contributors](http://contributors.nn.ci/api?repo=alist-org/alist&repo=alist-org/alist-web&repo=alist-org/docs)](https://github.com/alist-org/alist/graphs/contributors)
## 许可
`AList` 是在 AGPL-3.0 许可下许可的开源软件。
## 免责声明
- 本程序为免费开源项目旨在分享网盘文件方便下载以及学习golang使用时请遵守相关法律法规请勿滥用
- 本程序通过调用官方sdk/接口实现,无破坏官方接口行为;
- 本程序仅做302重定向/流量转发,不拦截、存储、篡改任何用户数据;
- 在使用本程序之前你应了解并承担相应的风险包括但不限于账号被ban下载限速等与本程序无关
- 如有侵权,请通过[邮件](mailto:i@nn.ci)与我联系,会及时处理。
---
> [@Blog](https://www.nn.ci/) · [@GitHub](https://github.com/Xhofe)
> [@博客](https://nn.ci/) · [@GitHub](https://github.com/Xhofe) · [@Telegram群](https://t.me/alist_chat) · [@Discord](https://discord.gg/F4ymsH4xv2)

README_ja.md (new file)

@@ -0,0 +1,138 @@
<div align="center">
<a href="https://alist.nn.ci"><img height="100px" alt="logo" src="https://cdn.jsdelivr.net/gh/alist-org/logo@main/logo.svg"/></a>
<p><em>🗂Gin と Solidjs による、複数のストレージをサポートするファイルリストプログラム。</em></p>
<div>
<a href="https://goreportcard.com/report/github.com/alist-org/alist/v3">
<img src="https://goreportcard.com/badge/github.com/alist-org/alist/v3" alt="latest version" />
</a>
<a href="https://github.com/Xhofe/alist/blob/main/LICENSE">
<img src="https://img.shields.io/github/license/Xhofe/alist" alt="License" />
</a>
<a href="https://github.com/Xhofe/alist/actions?query=workflow%3ABuild">
<img src="https://img.shields.io/github/actions/workflow/status/Xhofe/alist/build.yml?branch=main" alt="Build status" />
</a>
<a href="https://github.com/Xhofe/alist/releases">
<img src="https://img.shields.io/github/release/Xhofe/alist" alt="latest version" />
</a>
<a title="Crowdin" target="_blank" href="https://crwd.in/alist">
<img src="https://badges.crowdin.net/alist/localized.svg">
</a>
</div>
<div>
<a href="https://github.com/Xhofe/alist/discussions">
<img src="https://img.shields.io/github/discussions/Xhofe/alist?color=%23ED8936" alt="discussions" />
</a>
<a href="https://discord.gg/F4ymsH4xv2">
<img src="https://img.shields.io/discord/1018870125102895134?logo=discord" alt="discussions" />
</a>
<a href="https://github.com/Xhofe/alist/releases">
<img src="https://img.shields.io/github/downloads/Xhofe/alist/total?color=%239F7AEA&logo=github" alt="Downloads" />
</a>
<a href="https://hub.docker.com/r/xhofe/alist">
<img src="https://img.shields.io/docker/pulls/xhofe/alist?color=%2348BB78&logo=docker&label=pulls" alt="Downloads" />
</a>
<a href="https://alist.nn.ci/guide/sponsor.html">
<img src="https://img.shields.io/badge/%24-sponsor-F87171.svg" alt="sponsor" />
</a>
</div>
</div>
---
[English](./README.md) | [中文](./README_cn.md) | 日本語 | [Contributing](./CONTRIBUTING.md) | [CODE_OF_CONDUCT](./CODE_OF_CONDUCT.md)
## 特徴
- [x] マルチストレージ
- [x] ローカルストレージ
- [x] [Aliyundrive](https://www.aliyundrive.com/)
- [x] OneDrive / Sharepoint ([グローバル](https://www.office.com/), [cn](https://portal.partner.microsoftonline.cn),de,us)
- [x] [189cloud](https://cloud.189.cn) (Personal, Family)
- [x] [GoogleDrive](https://drive.google.com/)
- [x] [123pan](https://www.123pan.com/)
- [x] FTP / SFTP
- [x] [PikPak](https://www.mypikpak.com/)
- [x] [S3](https://aws.amazon.com/s3/)
- [x] [Seafile](https://seafile.com/)
- [x] [UPYUN Storage Service](https://www.upyun.com/products/file-storage)
- [x] WebDav(Support OneDrive/SharePoint without API)
- [x] Teambition([China](https://www.teambition.com/ ),[International](https://us.teambition.com/ ))
- [x] [Mediatrack](https://www.mediatrack.cn/)
- [x] [139yun](https://yun.139.com/) (Personal, Family)
- [x] [YandexDisk](https://disk.yandex.com/)
- [x] [BaiduNetdisk](http://pan.baidu.com/)
- [x] [Terabox](https://www.terabox.com/main)
- [x] [UC](https://drive.uc.cn)
- [x] [Quark](https://pan.quark.cn)
- [x] [Thunder](https://pan.xunlei.com)
- [x] [Lanzou](https://www.lanzou.com/)
- [x] [Aliyundrive share](https://www.aliyundrive.com/)
- [x] [Google photo](https://photos.google.com/)
- [x] [Mega.nz](https://mega.nz)
- [x] [Baidu photo](https://photo.baidu.com/)
- [x] SMB
- [x] [115](https://115.com/)
- [X] Cloudreve
- [x] [Dropbox](https://www.dropbox.com/)
- [x] デプロイが簡単で、すぐに使える
- [x] ファイルプレビュー (PDF, マークダウン, コード, プレーンテキスト, ...)
- [x] ギャラリーモードでの画像プレビュー
- [x] ビデオとオーディオのプレビュー、歌詞と字幕のサポート
- [x] Office ドキュメントのプレビュー (docx, pptx, xlsx, ...)
- [x] `README.md` のプレビューレンダリング
- [x] ファイルのパーマリンクコピーと直接ダウンロード
- [x] ダークモード
- [x] 国際化
- [x] 保護されたルート (パスワード保護と認証)
- [x] WebDav (詳細は https://alist.nn.ci/guide/webdav.html を参照)
- [x] [Docker デプロイ](https://hub.docker.com/r/xhofe/alist)
- [x] Cloudflare ワーカープロキシ
- [x] ファイル/フォルダパッケージのダウンロード
- [x] ウェブアップロード(訪問者にアップロードを許可できる), 削除, mkdir, 名前変更, 移動, コピー
- [x] オフラインダウンロード
- [x] 二つのストレージ間でファイルをコピー
- [x] シングルスレッドのダウンロード/ストリーム向けのマルチスレッド ダウンロード アクセラレーション
## ドキュメント
<https://alist.nn.ci/>
## デモ
<https://al.nn.ci>
## ディスカッション
一般的なご質問は[ディスカッションフォーラム](https://github.com/Xhofe/alist/discussions)をご利用ください。**問題はバグレポートと機能リクエストのみです。**
## スポンサー
AList はオープンソースのソフトウェアです。もしあなたがこのプロジェクトを気に入ってくださり、続けて欲しいと思ってくださるなら、ぜひスポンサーになってくださるか、1口でも寄付をしてくださるようご検討くださいすべての愛とサポートに感謝します:
https://alist.nn.ci/guide/sponsor.html
### スペシャルスポンサー
- [亚洲云 - 高防服务器|服务器租用|福州高防|广东电信|香港服务器|美国服务器|海外服务器 - 国内靠谱的企业级云计算服务提供商](https://www.asiayun.com/aff/QQCOOQKZ) (sponsored Chinese API server)
- [找资源 - 阿里云盘资源搜索引擎](https://zhaoziyuan.pw/)
- [JetBrains: Essential tools for software developers and teams](https://www.jetbrains.com/)
## コントリビューター
これらの素晴らしい人々に感謝します:
[![Contributors](http://contributors.nn.ci/api?repo=alist-org/alist&repo=alist-org/alist-web&repo=alist-org/docs)](https://github.com/alist-org/alist/graphs/contributors)
## ライセンス
`AList` は AGPL-3.0 ライセンスの下でライセンスされたオープンソースソフトウェアです。
## 免責事項
- このプログラムはフリーでオープンソースのプロジェクトです。ネットワークディスク上でファイルを共有するように設計されており、golang のダウンロードや学習に便利です。利用にあたっては関連法規を遵守し、悪用しないようお願いします;
- このプログラムは、公式インターフェースの動作を破壊することなく、公式 sdk/インターフェースを呼び出すことで実装されています;
- このプログラムは、302リダイレクト/トラフィック転送のみを行い、いかなるユーザーデータも傍受、保存、改ざんしません;
- このプログラムを使用する前に、アカウントの禁止、ダウンロード速度の制限など、対応するリスクを理解し、負担する必要があります;
- もし侵害があれば、[メール](mailto:i@nn.ci)で私に連絡してください。
---
> [@Blog](https://nn.ci/) · [@GitHub](https://github.com/Xhofe) · [@TelegramGroup](https://t.me/alist_chat) · [@Discord](https://discord.gg/F4ymsH4xv2)

(deleted file: Cloudflare Workers proxy script)

@@ -1,430 +0,0 @@
const HOST = "YOUR_HOST";
const TOKEN = "YOUR_TOKEN";
addEventListener("fetch", (event) => {
const request = event.request;
const url = new URL(request.url);
const sign = url.searchParams.get("sign");
if (request.method === "OPTIONS") {
// Handle CORS preflight requests
event.respondWith(handleOptions(request));
} else if (sign && sign.length === 16) {
// Handle requests to the Down server
event.respondWith(handleDownload(request));
} else {
// Handle requests to the API server
event.respondWith(handleRequest(event));
}
});
const corsHeaders = {
"Access-Control-Allow-Origin": "*",
"Access-Control-Allow-Methods": "GET,HEAD,POST,OPTIONS",
"Access-Control-Max-Age": "86400",
};
async function handleDownload(request) {
const origin = request.headers.get("origin");
const url = new URL(request.url);
const path = decodeURI(url.pathname);
const sign = url.searchParams.get("sign");
const name = path.split("/").pop();
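// The expected signature is the middle 16 hex characters (offsets 8-24) of
// md5("alist-<TOKEN>-<file name>"), matching the 16-character length check
// used when dispatching requests above.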
const right = md5(`alist-${TOKEN}-${name}`).slice(8, 24);
if (sign !== right) {
const resp = new Response(
JSON.stringify({
code: 401,
message: `sign mismatch`,
}),
{
headers: {
"content-type": "application/json;charset=UTF-8",
},
}
);
resp.headers.set("Access-Control-Allow-Origin", origin);
return resp;
}
let resp = await fetch(`${HOST}/api/admin/link`, {
method: "POST",
headers: {
"content-type": "application/json;charset=UTF-8",
Authorization: TOKEN,
},
body: JSON.stringify({
path: path,
}),
});
let res = await resp.json();
if (res.code !== 200) {
return new Response(JSON.stringify(res));
}
request = new Request(res.data.url, request);
if (res.data.headers) {
for (const header of res.data.headers) {
request.headers.set(header.name, header.value);
}
}
let response = await fetch(request);
// Recreate the response so we can modify the headers
response = new Response(response.body, response);
// Set CORS headers
response.headers.set("Access-Control-Allow-Origin", origin);
// Append to/Add Vary header so browser will cache response correctly
response.headers.append("Vary", "Origin");
return response;
}
/**
 * Respond to the request
 * @param {FetchEvent} event
 */
async function handleRequest(event) {
const { request } = event;
// Incoming request headers and the pieces of the outgoing response
let reqHeaders = new Headers(request.headers),
outBody,
outStatus = 200,
outStatusText = "OK",
outCt = null,
outHeaders = new Headers({
"Access-Control-Allow-Origin": reqHeaders.get("Origin"),
"Access-Control-Allow-Methods": "GET, POST, PUT, PATCH, DELETE, OPTIONS",
"Access-Control-Allow-Headers":
reqHeaders.get("Access-Control-Allow-Headers") ||
"Accept, Authorization, Cache-Control, Content-Type, DNT, If-Modified-Since, Keep-Alive, Origin, User-Agent, X-Requested-With, Token, x-access-token, Notion-Version",
});
try {
// Everything after the first slash of the URL is treated as the link to proxy
let url = request.url.substr(8);
url = decodeURIComponent(url.substr(url.indexOf("/") + 1));
// Requests that should not be proxied
if (
request.method == "OPTIONS" &&
reqHeaders.has("access-control-request-headers")
) {
// Return the preflight response
return new Response(null, PREFLIGHT_INIT);
} else if (
url.length < 3 ||
url.indexOf(".") == -1 ||
url == "favicon.ico" ||
url == "robots.txt"
) {
return Response.redirect("https://baidu.com", 301);
}
// Blocked targets
else if (blocker.check(url)) {
return Response.redirect("https://baidu.com", 301);
} else {
// Add the missing http:// prefix
url = url
.replace(/https:(\/)*/, "https://")
.replace(/http:(\/)*/, "http://");
if (url.indexOf("://") == -1) {
url = "http://" + url;
}
// Build the fetch parameters
let fp = {
method: request.method,
headers: {},
};
// Keep the rest of the request headers
let he = reqHeaders.entries();
for (let h of he) {
if (!["content-length"].includes(h[0])) {
fp.headers[h[0]] = h[1];
}
}
// Attach a body for methods that can carry one
if (["POST", "PUT", "PATCH", "DELETE"].indexOf(request.method) >= 0) {
const ct = (reqHeaders.get("content-type") || "").toLowerCase();
if (ct.includes("application/json")) {
let requestJSON = await request.json();
console.log(typeof requestJSON);
fp.body = JSON.stringify(requestJSON);
} else if (
ct.includes("application/text") ||
ct.includes("text/html")
) {
fp.body = await request.text();
} else if (ct.includes("form")) {
// fp.body = await request.formData();
fp.body = await request.text();
} else {
fp.body = await request.blob();
}
}
// Issue the proxied fetch
let fr = await fetch(url, fp);
outCt = fr.headers.get("content-type");
if (outCt && (outCt.includes("application/text") || outCt.includes("text/html"))) {
try {
// Inject a <base> tag so relative URLs resolve against the proxied origin
let newFr = new HTMLRewriter()
.on("head", {
element(element) {
element.prepend(`<base href="${url}" />`, {
html: true,
});
},
})
.transform(fr);
fr = newFr;
} catch (e) {}
}
outStatus = fr.status;
outStatusText = fr.statusText;
outBody = fr.body;
}
} catch (err) {
outCt = "application/json";
outBody = JSON.stringify({
code: -1,
msg: JSON.stringify(err.stack) || err,
});
}
// Set the content type
if (outCt && outCt != "") {
outHeaders.set("content-type", outCt);
}
let response = new Response(outBody, {
status: outStatus,
statusText: outStatusText,
headers: outHeaders,
});
return response;
}
const blocker = {
keys: [],
check: function (url) {
url = url.toLowerCase();
let len = blocker.keys.filter((x) => url.includes(x)).length;
return len != 0;
},
};
function handleOptions(request) {
// Make sure the necessary headers are present
// for this to be a valid pre-flight request
let headers = request.headers;
if (
headers.get("Origin") !== null &&
headers.get("Access-Control-Request-Method") !== null
// && headers.get("Access-Control-Request-Headers") !== null
) {
// Handle CORS pre-flight request.
// If you want to check or reject the requested method + headers
// you can do that here.
let respHeaders = {
...corsHeaders,
// Allow all future content Request headers to go back to browser
// such as Authorization (Bearer) or X-Client-Name-Version
"Access-Control-Allow-Headers": request.headers.get(
"Access-Control-Request-Headers"
),
};
return new Response(null, {
headers: respHeaders,
});
} else {
// Handle standard OPTIONS request.
// If you want to allow other HTTP Methods, you can do that here.
return new Response(null, {
headers: {
Allow: "GET, HEAD, POST, OPTIONS",
},
});
}
}
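// Minified MD5 implementation; it exposes a global md5() which is used above to verify the download sign.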
!(function (a) {
"use strict";
function b(a, b) {
var c = (65535 & a) + (65535 & b),
d = (a >> 16) + (b >> 16) + (c >> 16);
return (d << 16) | (65535 & c);
}
function c(a, b) {
return (a << b) | (a >>> (32 - b));
}
function d(a, d, e, f, g, h) {
return b(c(b(b(d, a), b(f, h)), g), e);
}
function e(a, b, c, e, f, g, h) {
return d((b & c) | (~b & e), a, b, f, g, h);
}
function f(a, b, c, e, f, g, h) {
return d((b & e) | (c & ~e), a, b, f, g, h);
}
function g(a, b, c, e, f, g, h) {
return d(b ^ c ^ e, a, b, f, g, h);
}
function h(a, b, c, e, f, g, h) {
return d(c ^ (b | ~e), a, b, f, g, h);
}
function i(a, c) {
(a[c >> 5] |= 128 << c % 32), (a[(((c + 64) >>> 9) << 4) + 14] = c);
var d,
i,
j,
k,
l,
m = 1732584193,
n = -271733879,
o = -1732584194,
p = 271733878;
for (d = 0; d < a.length; d += 16)
(i = m),
(j = n),
(k = o),
(l = p),
(m = e(m, n, o, p, a[d], 7, -680876936)),
(p = e(p, m, n, o, a[d + 1], 12, -389564586)),
(o = e(o, p, m, n, a[d + 2], 17, 606105819)),
(n = e(n, o, p, m, a[d + 3], 22, -1044525330)),
(m = e(m, n, o, p, a[d + 4], 7, -176418897)),
(p = e(p, m, n, o, a[d + 5], 12, 1200080426)),
(o = e(o, p, m, n, a[d + 6], 17, -1473231341)),
(n = e(n, o, p, m, a[d + 7], 22, -45705983)),
(m = e(m, n, o, p, a[d + 8], 7, 1770035416)),
(p = e(p, m, n, o, a[d + 9], 12, -1958414417)),
(o = e(o, p, m, n, a[d + 10], 17, -42063)),
(n = e(n, o, p, m, a[d + 11], 22, -1990404162)),
(m = e(m, n, o, p, a[d + 12], 7, 1804603682)),
(p = e(p, m, n, o, a[d + 13], 12, -40341101)),
(o = e(o, p, m, n, a[d + 14], 17, -1502002290)),
(n = e(n, o, p, m, a[d + 15], 22, 1236535329)),
(m = f(m, n, o, p, a[d + 1], 5, -165796510)),
(p = f(p, m, n, o, a[d + 6], 9, -1069501632)),
(o = f(o, p, m, n, a[d + 11], 14, 643717713)),
(n = f(n, o, p, m, a[d], 20, -373897302)),
(m = f(m, n, o, p, a[d + 5], 5, -701558691)),
(p = f(p, m, n, o, a[d + 10], 9, 38016083)),
(o = f(o, p, m, n, a[d + 15], 14, -660478335)),
(n = f(n, o, p, m, a[d + 4], 20, -405537848)),
(m = f(m, n, o, p, a[d + 9], 5, 568446438)),
(p = f(p, m, n, o, a[d + 14], 9, -1019803690)),
(o = f(o, p, m, n, a[d + 3], 14, -187363961)),
(n = f(n, o, p, m, a[d + 8], 20, 1163531501)),
(m = f(m, n, o, p, a[d + 13], 5, -1444681467)),
(p = f(p, m, n, o, a[d + 2], 9, -51403784)),
(o = f(o, p, m, n, a[d + 7], 14, 1735328473)),
(n = f(n, o, p, m, a[d + 12], 20, -1926607734)),
(m = g(m, n, o, p, a[d + 5], 4, -378558)),
(p = g(p, m, n, o, a[d + 8], 11, -2022574463)),
(o = g(o, p, m, n, a[d + 11], 16, 1839030562)),
(n = g(n, o, p, m, a[d + 14], 23, -35309556)),
(m = g(m, n, o, p, a[d + 1], 4, -1530992060)),
(p = g(p, m, n, o, a[d + 4], 11, 1272893353)),
(o = g(o, p, m, n, a[d + 7], 16, -155497632)),
(n = g(n, o, p, m, a[d + 10], 23, -1094730640)),
(m = g(m, n, o, p, a[d + 13], 4, 681279174)),
(p = g(p, m, n, o, a[d], 11, -358537222)),
(o = g(o, p, m, n, a[d + 3], 16, -722521979)),
(n = g(n, o, p, m, a[d + 6], 23, 76029189)),
(m = g(m, n, o, p, a[d + 9], 4, -640364487)),
(p = g(p, m, n, o, a[d + 12], 11, -421815835)),
(o = g(o, p, m, n, a[d + 15], 16, 530742520)),
(n = g(n, o, p, m, a[d + 2], 23, -995338651)),
(m = h(m, n, o, p, a[d], 6, -198630844)),
(p = h(p, m, n, o, a[d + 7], 10, 1126891415)),
(o = h(o, p, m, n, a[d + 14], 15, -1416354905)),
(n = h(n, o, p, m, a[d + 5], 21, -57434055)),
(m = h(m, n, o, p, a[d + 12], 6, 1700485571)),
(p = h(p, m, n, o, a[d + 3], 10, -1894986606)),
(o = h(o, p, m, n, a[d + 10], 15, -1051523)),
(n = h(n, o, p, m, a[d + 1], 21, -2054922799)),
(m = h(m, n, o, p, a[d + 8], 6, 1873313359)),
(p = h(p, m, n, o, a[d + 15], 10, -30611744)),
(o = h(o, p, m, n, a[d + 6], 15, -1560198380)),
(n = h(n, o, p, m, a[d + 13], 21, 1309151649)),
(m = h(m, n, o, p, a[d + 4], 6, -145523070)),
(p = h(p, m, n, o, a[d + 11], 10, -1120210379)),
(o = h(o, p, m, n, a[d + 2], 15, 718787259)),
(n = h(n, o, p, m, a[d + 9], 21, -343485551)),
(m = b(m, i)),
(n = b(n, j)),
(o = b(o, k)),
(p = b(p, l));
return [m, n, o, p];
}
function j(a) {
var b,
c = "";
for (b = 0; b < 32 * a.length; b += 8)
c += String.fromCharCode((a[b >> 5] >>> b % 32) & 255);
return c;
}
function k(a) {
var b,
c = [];
for (c[(a.length >> 2) - 1] = void 0, b = 0; b < c.length; b += 1) c[b] = 0;
for (b = 0; b < 8 * a.length; b += 8)
c[b >> 5] |= (255 & a.charCodeAt(b / 8)) << b % 32;
return c;
}
function l(a) {
return j(i(k(a), 8 * a.length));
}
function m(a, b) {
var c,
d,
e = k(a),
f = [],
g = [];
for (
f[15] = g[15] = void 0, e.length > 16 && (e = i(e, 8 * a.length)), c = 0;
16 > c;
c += 1
)
(f[c] = 909522486 ^ e[c]), (g[c] = 1549556828 ^ e[c]);
return (d = i(f.concat(k(b)), 512 + 8 * b.length)), j(i(g.concat(d), 640));
}
function n(a) {
var b,
c,
d = "0123456789abcdef",
e = "";
for (c = 0; c < a.length; c += 1)
(b = a.charCodeAt(c)), (e += d.charAt((b >>> 4) & 15) + d.charAt(15 & b));
return e;
}
function o(a) {
return unescape(encodeURIComponent(a));
}
function p(a) {
return l(o(a));
}
function q(a) {
return n(p(a));
}
function r(a, b) {
return m(o(a), o(b));
}
function s(a, b) {
return n(r(a, b));
}
function t(a, b, c) {
return b ? (c ? r(b, a) : s(b, a)) : c ? p(a) : q(a);
}
"function" == typeof define && define.amd
? define(function () {
return t;
})
: (a.md5 = t);
})(this);
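For reference, the sign accepted by handleDownload is the middle 16 hex characters of md5("alist-<TOKEN>-<filename>"). A minimal sketch of computing the same value outside the worker, assuming a POSIX shell with md5sum available (TOKEN and name are placeholders that must match the worker's configuration):

# Hypothetical values for illustration only
TOKEN="your-alist-token"
name="example.mp4"
# slice(8, 24) of the 32-character hex digest corresponds to characters 9-24
sign=$(printf 'alist-%s-%s' "$TOKEN" "$name" | md5sum | cut -c 9-24)
echo "?sign=$sign"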

View File

@ -1,59 +0,0 @@
package main
import (
"fmt"
"github.com/Xhofe/alist/bootstrap"
"github.com/Xhofe/alist/conf"
_ "github.com/Xhofe/alist/drivers"
"github.com/Xhofe/alist/model"
"github.com/Xhofe/alist/server"
"github.com/gin-gonic/gin"
log "github.com/sirupsen/logrus"
)
func Init() bool {
//bootstrap.InitLog()
bootstrap.InitConf()
bootstrap.InitCron()
bootstrap.InitModel()
if conf.Password {
pass, err := model.GetSettingByKey("password")
if err != nil {
log.Errorf(err.Error())
return false
}
log.Infof("current password: %s", pass.Value)
return false
}
server.InitIndex()
bootstrap.InitSettings()
bootstrap.InitAccounts()
bootstrap.InitCache()
return true
}
func main() {
if conf.Version {
fmt.Printf("Built At: %s\nGo Version: %s\nAuthor: %s\nCommit ID: %s\nVersion: %s\n", conf.BuiltAt, conf.GoVersion, conf.GitAuthor, conf.GitCommit, conf.GitTag)
return
}
if !Init() {
return
}
if !conf.Debug {
gin.SetMode(gin.ReleaseMode)
}
r := gin.Default()
server.InitApiRouter(r)
base := fmt.Sprintf("%s:%d", conf.Conf.Address, conf.Conf.Port)
log.Infof("start server @ %s", base)
var err error
if conf.Conf.Scheme.Https {
err = r.RunTLS(base, conf.Conf.Scheme.CertFile, conf.Conf.Scheme.KeyFile)
} else {
err = r.Run(base)
}
if err != nil {
log.Errorf("failed to start: %s", err.Error())
}
}

View File

@ -1,12 +0,0 @@
package main
import (
"fmt"
"net/url"
"testing"
)
func TestUrl(t *testing.T) {
s,_ := url.QueryUnescape("/ali/%E7%8C%AA%E5%A4%B4%E7%9A%84%E6%96%87%E4%BB%B6%5B%E5%98%BF%E5%98%BF%5D/%E9%82%B9%E9%82%B9%E7%9A%84%E6%96%87%E4%BB%B6/%E6%A1%8C%E9%9D%A2%E5%A3%81%E7%BA%B8/v2-e8f266ba17ae387eefed1cb22b2b5e4e_r.jpg")
fmt.Print(s)
}

View File

@ -1,30 +0,0 @@
package bootstrap
import (
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/drivers/base"
"github.com/Xhofe/alist/model"
log "github.com/sirupsen/logrus"
)
func InitAccounts() {
log.Infof("init accounts...")
var accounts []model.Account
if err := conf.DB.Find(&accounts).Error; err != nil {
log.Fatalf("failed sync init accounts")
}
for i, account := range accounts {
model.RegisterAccount(account)
driver, ok := base.GetDriver(account.Type)
if !ok {
log.Errorf("no [%s] driver", account.Type)
} else {
err := driver.Save(&accounts[i], nil)
if err != nil {
log.Errorf("init account [%s] error:[%s]", account.Name, err.Error())
} else {
log.Infof("success init account: %s, type: %s", account.Name, account.Type)
}
}
}
}

View File

@ -1,22 +0,0 @@
package bootstrap
import (
"github.com/Xhofe/alist/conf"
"github.com/eko/gocache/v2/cache"
"github.com/eko/gocache/v2/store"
goCache "github.com/patrickmn/go-cache"
log "github.com/sirupsen/logrus"
"time"
)
// InitCache init cache
func InitCache() {
log.Infof("init cache...")
c := conf.Conf.Cache
if c.Expiration == 0 {
c.Expiration, c.CleanupInterval = 60, 120
}
goCacheClient := goCache.New(time.Duration(c.Expiration)*time.Minute, time.Duration(c.CleanupInterval)*time.Minute)
goCacheStore := store.NewGoCache(goCacheClient, nil)
conf.Cache = cache.New(goCacheStore)
}

View File

@ -1,36 +0,0 @@
package bootstrap
import (
"encoding/json"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/utils"
log "github.com/sirupsen/logrus"
"io/ioutil"
)
// InitConf init config
func InitConf() {
log.Infof("reading config file: %s", conf.ConfigFile)
if !utils.Exists(conf.ConfigFile) {
log.Infof("config file not exists, creating default config file")
_, err := utils.CreatNestedFile(conf.ConfigFile)
if err != nil {
log.Fatalf("failed to create config file")
}
conf.Conf = conf.DefaultConfig()
if !utils.WriteToJson(conf.ConfigFile, conf.Conf) {
log.Fatalf("failed to create default config file")
}
return
}
config, err := ioutil.ReadFile(conf.ConfigFile)
if err != nil {
log.Fatalf("reading config file error:%s", err.Error())
}
conf.Conf = new(conf.Config)
err = json.Unmarshal(config, conf.Conf)
if err != nil {
log.Fatalf("load config error: %s", err.Error())
}
log.Debugf("config:%+v", conf.Conf)
}

View File

@ -1,14 +0,0 @@
package bootstrap
import (
"github.com/Xhofe/alist/conf"
"github.com/robfig/cron/v3"
log "github.com/sirupsen/logrus"
)
// InitCron init cron
func InitCron() {
log.Infof("init cron...")
conf.Cron = cron.New()
conf.Cron.Start()
}

View File

@ -1,32 +0,0 @@
package bootstrap
import (
"flag"
"github.com/Xhofe/alist/conf"
log "github.com/sirupsen/logrus"
)
// InitLog init log
func InitLog() {
if conf.Debug {
log.SetLevel(log.DebugLevel)
log.SetReportCaller(true)
}
log.SetFormatter(&log.TextFormatter{
//DisableColors: true,
ForceColors: true,
EnvironmentOverrideColors: true,
TimestampFormat: "2006-01-02 15:04:05",
FullTimestamp: true,
})
log.Infof("init log...")
}
func init() {
flag.StringVar(&conf.ConfigFile, "conf", "data/config.json", "config file")
flag.BoolVar(&conf.Debug, "debug", false, "start with debug mode")
flag.BoolVar(&conf.Version, "version", false, "print version info")
flag.BoolVar(&conf.Password, "password", false, "print current password")
flag.Parse()
InitLog()
}

View File

@ -1,80 +0,0 @@
package bootstrap
import (
"fmt"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/model"
log "github.com/sirupsen/logrus"
"gorm.io/driver/mysql"
"gorm.io/driver/postgres"
"gorm.io/driver/sqlite"
"gorm.io/gorm"
"gorm.io/gorm/logger"
"gorm.io/gorm/schema"
log2 "log"
"os"
"strings"
"time"
)
func InitModel() {
log.Infof("init model...")
databaseConfig := conf.Conf.Database
newLogger := logger.New(
log2.New(os.Stdout, "\r\n", log2.LstdFlags),
logger.Config{
SlowThreshold: time.Second,
LogLevel: logger.Silent,
IgnoreRecordNotFoundError: true,
Colorful: true,
},
)
gormConfig := &gorm.Config{
NamingStrategy: schema.NamingStrategy{
TablePrefix: databaseConfig.TablePrefix,
},
Logger: newLogger,
}
switch databaseConfig.Type {
case "sqlite3":
{
if !(strings.HasSuffix(databaseConfig.DBFile, ".db") && len(databaseConfig.DBFile) > 3) {
log.Fatalf("db name error.")
}
db, err := gorm.Open(sqlite.Open(databaseConfig.DBFile), gormConfig)
if err != nil {
log.Fatalf("failed to connect database:%s", err.Error())
}
conf.DB = db
}
case "mysql":
{
dsn := fmt.Sprintf("%s:%s@tcp(%s:%d)/%s?charset=utf8mb4&parseTime=True&loc=Local",
databaseConfig.User, databaseConfig.Password, databaseConfig.Host, databaseConfig.Port, databaseConfig.Name)
db, err := gorm.Open(mysql.Open(dsn), gormConfig)
if err != nil {
log.Fatalf("failed to connect database:%s", err.Error())
}
conf.DB = db
}
case "postgres":
{
dsn := fmt.Sprintf("host=%s user=%s password=%s dbname=%s port=%d sslmode=disable TimeZone=Asia/Shanghai",
databaseConfig.Host, databaseConfig.User, databaseConfig.Password, databaseConfig.Name, databaseConfig.Port)
db, err := gorm.Open(postgres.Open(dsn), gormConfig)
if err != nil {
log.Errorf("failed to connect database:%s", err.Error())
}
conf.DB = db
}
default:
log.Fatalf("not supported database type: %s", databaseConfig.Type)
}
log.Infof("auto migrate model...")
err := conf.DB.AutoMigrate(&model.SettingItem{}, &model.Account{}, &model.Meta{})
if err != nil {
log.Fatalf("failed to auto migrate")
}
}

View File

@ -1,223 +0,0 @@
package bootstrap
import (
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/model"
log "github.com/sirupsen/logrus"
"gorm.io/gorm"
"strings"
)
func InitSettings() {
log.Infof("init settings...")
err := model.SaveSetting(model.Version)
if err != nil {
log.Fatalf("failed write setting: %s", err.Error())
}
settings := []model.SettingItem{
{
Key: "title",
Value: "Alist",
Description: "title",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "password",
Value: "alist",
Description: "password",
Type: "string",
Access: model.PRIVATE,
Group: model.BACK,
},
{
Key: "logo",
Value: "https://store.heytapimage.com/cdo-portal/feedback/202112/05/1542f45f86b8609495b69c5380753135.png",
Description: "logo",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "favicon",
Value: "https://store.heytapimage.com/cdo-portal/feedback/202112/05/1542f45f86b8609495b69c5380753135.png",
Description: "favicon",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "icon color",
Value: "#1890ff",
Description: "icon's color",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "text types",
Value: strings.Join(conf.TextTypes, ","),
Type: "string",
Description: "text type extensions",
Group: model.FRONT,
},
{
Key: "hide readme file",
Value: "true",
Type: "bool",
Description: "hide readme file? ",
Group: model.FRONT,
},
{
Key: "music cover",
Value: "https://store.heytapimage.com/cdo-portal/feedback/202110/30/d43c41c5d257c9bc36366e310374fb19.png",
Description: "music cover image",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "site beian",
Description: "chinese beian info",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "home readme url",
Description: "when have multiple, the readme file to show",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "autoplay video",
Value: "false",
Type: "bool",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "autoplay audio",
Value: "false",
Type: "bool",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "check parent folder",
Value: "false",
Type: "bool",
Description: "check parent folder password",
Access: model.PRIVATE,
Group: model.BACK,
},
{
Key: "customize head",
Value: "",
Type: "text",
Description: "Customize head, placed at the beginning of the head",
Access: model.PRIVATE,
Group: model.FRONT,
},
{
Key: "customize body",
Value: "",
Type: "text",
Description: "Customize script, placed at the end of the body",
Access: model.PRIVATE,
Group: model.FRONT,
},
{
Key: "animation",
Value: "true",
Type: "bool",
Description: "when there are a lot of files, the animation will freeze when opening",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "check down link",
Value: "false",
Type: "bool",
Description: "check down link password, your link will be 'https://alist.com/d/filename?pw=xxx'",
Access: model.PUBLIC,
Group: model.BACK,
},
{
Key: "WebDAV username",
Value: "alist_admin",
Description: "WebDAV username",
Type: "string",
Access: model.PRIVATE,
Group: model.BACK,
},
{
Key: "WebDAV password",
Value: "alist_admin",
Description: "WebDAV password",
Type: "string",
Access: model.PRIVATE,
Group: model.BACK,
},
{
Key: "artplayer whitelist",
Value: "*",
Description: "refer to https://artplayer.org/document/options#whitelist",
Type: "string",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "artplayer autoSize",
Value: "true",
Description: "refer to https://artplayer.org/document/options#autosize",
Type: "bool",
Access: model.PUBLIC,
Group: model.FRONT,
},
{
Key: "Visitor WebDAV username",
Value: "alist_visitor",
Description: "Visitor WebDAV username",
Type: "string",
Access: model.PRIVATE,
Group: model.BACK,
},
{
Key: "Visitor WebDAV password",
Value: "alist_visitor",
Description: "Visitor WebDAV password",
Type: "string",
Access: model.PRIVATE,
Group: model.BACK,
},
}
for i := range settings {
v := settings[i]
v.Version = conf.GitTag
o, err := model.GetSettingByKey(v.Key)
if err != nil {
if err == gorm.ErrRecordNotFound {
err = model.SaveSetting(v)
if err != nil {
log.Fatalf("failed write setting: %s", err.Error())
}
} else {
log.Fatal("can't get setting: %s", err.Error())
}
} else {
//o.Version = conf.GitTag
//err = model.SaveSetting(*o)
v.Value = o.Value
err = model.SaveSetting(v)
if err != nil {
log.Fatalf("failed write setting: %s", err.Error())
}
}
}
model.LoadSettings()
}

289
build.sh
View File

@ -1,110 +1,201 @@
#!/bin/bash
if [ "$1" == "web" ]; then
git clone https://github.com/Xhofe/alist-web.git
cd alist-web || exit
yarn
yarn build
mv dist/* ../public
cd ..
exit 0
fi
go env -w GOPROXY=https://goproxy.cn,https://mirrors.aliyun.com/goproxy/,https://goproxy.io,direct
if [ "$1" == "docker" ]; then
appName="alist"
builtAt="$(date +'%F %T %z')"
goVersion=$(go version | sed 's/go version //')
gitAuthor=$(git show -s --format='format:%aN <%ae>' HEAD)
gitCommit=$(git log --pretty=format:"%h" -1)
gitTag=$(git describe --long --tags --dirty --always)
ldflags="\
-w -s \
-X 'github.com/Xhofe/alist/conf.BuiltAt=$builtAt' \
-X 'github.com/Xhofe/alist/conf.GoVersion=$goVersion' \
-X 'github.com/Xhofe/alist/conf.GitAuthor=$gitAuthor' \
-X 'github.com/Xhofe/alist/conf.GitCommit=$gitCommit' \
-X 'github.com/Xhofe/alist/conf.GitTag=$gitTag' \
"
go build -o ./bin/alist -ldflags="$ldflags" alist.go
exit 0
fi
cd alist-web || exit
webCommit=$(git log --pretty=format:"%h" -1)
echo "web commit id: $webCommit"
yarn
if [ "$1" == "release" ]; then
yarn build --base="https://cdn.jsdelivr.net/gh/Xhofe/alist-web@cdn/v2/$webCommit"
mv dist/assets ..
mv dist/index.html ../alist/public
# build the local (non-CDN) version
yarn build
mv dist/index.html dist/local.html
mv dist/* ../alist/public
else
yarn build
mv dist/* ../alist/public
fi
cd ..
cd alist
appName="alist"
builtAt="$(date +'%F %T %z')"
goVersion=$(go version | sed 's/go version //')
gitAuthor=$(git show -s --format='format:%aN <%ae>' HEAD)
gitAuthor="Xhofe <i@nn.ci>"
gitCommit=$(git log --pretty=format:"%h" -1)
gitTag=$(git describe --long --tags --dirty --always)
echo "build version: $gitTag"
if [ "$1" = "dev" ]; then
version="dev"
webVersion="dev"
else
version=$(git describe --abbrev=0 --tags)
webVersion=$(wget -qO- -t1 -T2 "https://api.github.com/repos/alist-org/alist-web/releases/latest" | grep "tag_name" | head -n 1 | awk -F ":" '{print $2}' | sed 's/\"//g;s/,//g;s/ //g')
fi
echo "backend version: $version"
echo "frontend version: $webVersion"
ldflags="\
-w -s \
-X 'github.com/Xhofe/alist/conf.BuiltAt=$builtAt' \
-X 'github.com/Xhofe/alist/conf.GoVersion=$goVersion' \
-X 'github.com/Xhofe/alist/conf.GitAuthor=$gitAuthor' \
-X 'github.com/Xhofe/alist/conf.GitCommit=$gitCommit' \
-X 'github.com/Xhofe/alist/conf.GitTag=$gitTag' \
-X 'github.com/alist-org/alist/v3/internal/conf.BuiltAt=$builtAt' \
-X 'github.com/alist-org/alist/v3/internal/conf.GoVersion=$goVersion' \
-X 'github.com/alist-org/alist/v3/internal/conf.GitAuthor=$gitAuthor' \
-X 'github.com/alist-org/alist/v3/internal/conf.GitCommit=$gitCommit' \
-X 'github.com/alist-org/alist/v3/internal/conf.Version=$version' \
-X 'github.com/alist-org/alist/v3/internal/conf.WebVersion=$webVersion' \
"
if [ "$1" == "release" ]; then
xgo -out alist -ldflags="$ldflags" .
else
xgo -targets=linux/amd64,windows/amd64 -out alist -ldflags="$ldflags" .
fi
mkdir "build"
mv alist-* build
cd build || exit
upx -9 ./*
find . -type f -print0 | xargs -0 md5sum > md5.txt
cat md5.txt
# compress file (release)
if [ "$1" == "release" ]; then
mkdir compress
mv md5.txt compress
for i in `find . -type f -name "$appName-linux-*"`
do
tar -czvf compress/"$i".tar.gz "$i"
done
for i in `find . -type f -name "$appName-darwin-*"`
do
tar -czvf compress/"$i".tar.gz "$i"
done
for i in `find . -type f -name "$appName-windows-*"`
do
zip compress/$(echo $i | sed 's/\.[^.]*$//').zip "$i"
done
fi
cd ../..
FetchWebDev() {
curl -L https://codeload.github.com/alist-org/web-dist/tar.gz/refs/heads/dev -o web-dist-dev.tar.gz
tar -zxvf web-dist-dev.tar.gz
rm -rf public/dist
mv -f web-dist-dev/dist public
rm -rf web-dist-dev web-dist-dev.tar.gz
}
if [ "$1" == "release" ]; then
cd alist-web
git checkout cdn
mkdir "v2/$webCommit"
mv ../assets/ v2/$webCommit
git add .
git config --local user.email "i@nn.ci"
git config --local user.name "Xhofe"
git commit --allow-empty -m "upload $webCommit assets files" -a
cd ..
fi
FetchWebRelease() {
curl -L https://github.com/alist-org/alist-web/releases/latest/download/dist.tar.gz -o dist.tar.gz
tar -zxvf dist.tar.gz
rm -rf public/dist
mv -f dist public
rm -rf dist.tar.gz
}
BuildWinArm64() {
echo building for windows-arm64
chmod +x ./wrapper/zcc-arm64
chmod +x ./wrapper/zcxx-arm64
export GOOS=windows
export GOARCH=arm64
export CC=$(pwd)/wrapper/zcc-arm64
export CXX=$(pwd)/wrapper/zcxx-arm64
go build -o "$1" -ldflags="$ldflags" -tags=jsoniter .
}
BuildDev() {
rm -rf .git/
mkdir -p "dist"
muslflags="--extldflags '-static -fpic' $ldflags"
BASE="https://musl.nn.ci/"
FILES=(x86_64-linux-musl-cross aarch64-linux-musl-cross)
for i in "${FILES[@]}"; do
url="${BASE}${i}.tgz"
curl -L -o "${i}.tgz" "${url}"
sudo tar xf "${i}.tgz" --strip-components 1 -C /usr/local
done
OS_ARCHES=(linux-musl-amd64 linux-musl-arm64)
CGO_ARGS=(x86_64-linux-musl-gcc aarch64-linux-musl-gcc)
for i in "${!OS_ARCHES[@]}"; do
os_arch=${OS_ARCHES[$i]}
cgo_cc=${CGO_ARGS[$i]}
echo building for ${os_arch}
export GOOS=${os_arch%%-*}
export GOARCH=${os_arch##*-}
export CC=${cgo_cc}
export CGO_ENABLED=1
go build -o ./dist/$appName-$os_arch -ldflags="$muslflags" -tags=jsoniter .
done
xgo -targets=windows/amd64,darwin/amd64 -out "$appName" -ldflags="$ldflags" -tags=jsoniter .
mv alist-* dist
cd dist
cp ./alist-windows-amd64.exe ./alist-windows-amd64-upx.exe
upx -9 ./alist-windows-amd64-upx.exe
find . -type f -print0 | xargs -0 md5sum >md5.txt
cat md5.txt
}
BuildDocker() {
go build -o ./bin/alist -ldflags="$ldflags" -tags=jsoniter .
}
BuildRelease() {
rm -rf .git/
mkdir -p "build"
muslflags="--extldflags '-static -fpic' $ldflags"
BASE="https://musl.nn.ci/"
FILES=(x86_64-linux-musl-cross aarch64-linux-musl-cross mips-linux-musl-cross mips64-linux-musl-cross mips64el-linux-musl-cross mipsel-linux-musl-cross powerpc64le-linux-musl-cross s390x-linux-musl-cross)
for i in "${FILES[@]}"; do
url="${BASE}${i}.tgz"
curl -L -o "${i}.tgz" "${url}"
sudo tar xf "${i}.tgz" --strip-components 1 -C /usr/local
rm -f "${i}.tgz"
done
OS_ARCHES=(linux-musl-amd64 linux-musl-arm64 linux-musl-mips linux-musl-mips64 linux-musl-mips64le linux-musl-mipsle linux-musl-ppc64le linux-musl-s390x)
CGO_ARGS=(x86_64-linux-musl-gcc aarch64-linux-musl-gcc mips-linux-musl-gcc mips64-linux-musl-gcc mips64el-linux-musl-gcc mipsel-linux-musl-gcc powerpc64le-linux-musl-gcc s390x-linux-musl-gcc)
for i in "${!OS_ARCHES[@]}"; do
os_arch=${OS_ARCHES[$i]}
cgo_cc=${CGO_ARGS[$i]}
echo building for ${os_arch}
export GOOS=${os_arch%%-*}
export GOARCH=${os_arch##*-}
export CC=${cgo_cc}
export CGO_ENABLED=1
go build -o ./build/$appName-$os_arch -ldflags="$muslflags" -tags=jsoniter .
done
BuildWinArm64 ./build/alist-windows-arm64.exe
xgo -out "$appName" -ldflags="$ldflags" -tags=jsoniter .
# why? Because some target platforms seem to have issues with upx compression
upx -9 ./alist-linux-amd64
cp ./alist-windows-amd64.exe ./alist-windows-amd64-upx.exe
upx -9 ./alist-windows-amd64-upx.exe
mv alist-* build
}
BuildReleaseLinuxMuslArm() {
rm -rf .git/
mkdir -p "build"
muslflags="--extldflags '-static -fpic' $ldflags"
BASE="https://musl.nn.ci/"
# FILES=(arm-linux-musleabi-cross arm-linux-musleabihf-cross armeb-linux-musleabi-cross armeb-linux-musleabihf-cross armel-linux-musleabi-cross armel-linux-musleabihf-cross armv5l-linux-musleabi-cross armv5l-linux-musleabihf-cross armv6-linux-musleabi-cross armv6-linux-musleabihf-cross armv7l-linux-musleabihf-cross armv7m-linux-musleabi-cross armv7r-linux-musleabihf-cross)
FILES=(arm-linux-musleabi-cross arm-linux-musleabihf-cross armel-linux-musleabi-cross armel-linux-musleabihf-cross armv5l-linux-musleabi-cross armv5l-linux-musleabihf-cross armv6-linux-musleabi-cross armv6-linux-musleabihf-cross armv7l-linux-musleabihf-cross armv7m-linux-musleabi-cross armv7r-linux-musleabihf-cross)
for i in "${FILES[@]}"; do
url="${BASE}${i}.tgz"
curl -L -o "${i}.tgz" "${url}"
sudo tar xf "${i}.tgz" --strip-components 1 -C /usr/local
rm -f "${i}.tgz"
done
# OS_ARCHES=(linux-musleabi-arm linux-musleabihf-arm linux-musleabi-armeb linux-musleabihf-armeb linux-musleabi-armel linux-musleabihf-armel linux-musleabi-armv5l linux-musleabihf-armv5l linux-musleabi-armv6 linux-musleabihf-armv6 linux-musleabihf-armv7l linux-musleabi-armv7m linux-musleabihf-armv7r)
# CGO_ARGS=(arm-linux-musleabi-gcc arm-linux-musleabihf-gcc armeb-linux-musleabi-gcc armeb-linux-musleabihf-gcc armel-linux-musleabi-gcc armel-linux-musleabihf-gcc armv5l-linux-musleabi-gcc armv5l-linux-musleabihf-gcc armv6-linux-musleabi-gcc armv6-linux-musleabihf-gcc armv7l-linux-musleabihf-gcc armv7m-linux-musleabi-gcc armv7r-linux-musleabihf-gcc)
# GOARMS=('' '' '' '' '' '' '5' '5' '6' '6' '7' '7' '7')
OS_ARCHES=(linux-musleabi-arm linux-musleabihf-arm linux-musleabi-armel linux-musleabihf-armel linux-musleabi-armv5l linux-musleabihf-armv5l linux-musleabi-armv6 linux-musleabihf-armv6 linux-musleabihf-armv7l linux-musleabi-armv7m linux-musleabihf-armv7r)
CGO_ARGS=(arm-linux-musleabi-gcc arm-linux-musleabihf-gcc armel-linux-musleabi-gcc armel-linux-musleabihf-gcc armv5l-linux-musleabi-gcc armv5l-linux-musleabihf-gcc armv6-linux-musleabi-gcc armv6-linux-musleabihf-gcc armv7l-linux-musleabihf-gcc armv7m-linux-musleabi-gcc armv7r-linux-musleabihf-gcc)
GOARMS=('' '' '' '' '5' '5' '6' '6' '7' '7' '7')
for i in "${!OS_ARCHES[@]}"; do
os_arch=${OS_ARCHES[$i]}
cgo_cc=${CGO_ARGS[$i]}
arm=${GOARMS[$i]}
echo building for ${os_arch}
export GOOS=linux
export GOARCH=arm
export CC=${cgo_cc}
export CGO_ENABLED=1
export GOARM=${arm}
go build -o ./build/$appName-$os_arch -ldflags="$muslflags" -tags=jsoniter .
done
}
MakeRelease() {
cd build
mkdir compress
for i in $(find . -type f -name "$appName-linux-*"); do
cp "$i" alist
tar -czvf compress/"$i".tar.gz alist
rm -f alist
done
for i in $(find . -type f -name "$appName-darwin-*"); do
cp "$i" alist
tar -czvf compress/"$i".tar.gz alist
rm -f alist
done
for i in $(find . -type f -name "$appName-windows-*"); do
cp "$i" alist.exe
zip compress/$(echo $i | sed 's/\.[^.]*$//').zip alist.exe
rm -f alist.exe
done
cd compress
find . -type f -print0 | xargs -0 md5sum >"$1"
cat "$1"
cd ../..
}
if [ "$1" = "dev" ]; then
FetchWebDev
if [ "$2" = "docker" ]; then
BuildDocker
else
BuildDev
fi
elif [ "$1" = "release" ]; then
FetchWebRelease
if [ "$2" = "docker" ]; then
BuildDocker
elif [ "$2" = "linux_musl_arm" ]; then
BuildReleaseLinuxMuslArm
MakeRelease "md5-linux-musl-arm.txt"
else
BuildRelease
MakeRelease "md5.txt"
fi
else
echo -e "Parameter error"
fi
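Going by the dispatch at the end of the script, build.sh is driven entirely by its positional arguments; a quick usage sketch:

bash build.sh dev                      # fetch dev web assets, cross-compile dev binaries into ./dist
bash build.sh dev docker               # dev web assets plus a single binary for the Docker image
bash build.sh release                  # release web assets plus the full multi-arch build into ./build
bash build.sh release docker           # release web assets plus a single binary for the Docker image
bash build.sh release linux_musl_arm   # release web assets plus the linux musl ARM variants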

97
cmd/admin.go Normal file
View File

@ -0,0 +1,97 @@
/*
Copyright © 2022 NAME HERE <EMAIL ADDRESS>
*/
package cmd
import (
"github.com/alist-org/alist/v3/internal/conf"
"github.com/alist-org/alist/v3/internal/op"
"github.com/alist-org/alist/v3/internal/setting"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/alist-org/alist/v3/pkg/utils/random"
"github.com/spf13/cobra"
)
// AdminCmd represents the password command
var AdminCmd = &cobra.Command{
Use: "admin",
Aliases: []string{"password"},
Short: "Show admin user's info and some operations about admin user's password",
Run: func(cmd *cobra.Command, args []string) {
Init()
admin, err := op.GetAdmin()
if err != nil {
utils.Log.Errorf("failed get admin user: %+v", err)
} else {
utils.Log.Infof("Admin user's username: %s", admin.Username)
utils.Log.Infof("The password can only be output at the first startup, and then stored as a hash value, which cannot be reversed")
utils.Log.Infof("You can reset the password with a random string by running [alist admin random]")
utils.Log.Infof("You can also set a new password by running [alist admin set NEW_PASSWORD]")
}
},
}
var RandomPasswordCmd = &cobra.Command{
Use: "random",
Short: "Reset admin user's password to a random string",
Run: func(cmd *cobra.Command, args []string) {
newPwd := random.String(8)
setAdminPassword(newPwd)
},
}
var SetPasswordCmd = &cobra.Command{
Use: "set",
Short: "Set admin user's password",
Run: func(cmd *cobra.Command, args []string) {
if len(args) == 0 {
utils.Log.Errorf("Please enter the new password")
return
}
setAdminPassword(args[0])
},
}
var ShowTokenCmd = &cobra.Command{
Use: "token",
Short: "Show admin token",
Run: func(cmd *cobra.Command, args []string) {
Init()
token := setting.GetStr(conf.Token)
utils.Log.Infof("Admin token: %s", token)
},
}
func setAdminPassword(pwd string) {
Init()
admin, err := op.GetAdmin()
if err != nil {
utils.Log.Errorf("failed get admin user: %+v", err)
return
}
admin.SetPassword(pwd)
if err := op.UpdateUser(admin); err != nil {
utils.Log.Errorf("failed update admin user: %+v", err)
return
}
utils.Log.Infof("admin user has been updated:")
utils.Log.Infof("username: %s", admin.Username)
utils.Log.Infof("password: %s", pwd)
DelAdminCacheOnline()
}
func init() {
RootCmd.AddCommand(AdminCmd)
AdminCmd.AddCommand(RandomPasswordCmd)
AdminCmd.AddCommand(SetPasswordCmd)
AdminCmd.AddCommand(ShowTokenCmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// passwordCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// passwordCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}
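Taken together with the subcommands registered in init() above, typical invocations look like the following (NEW_PASSWORD is a placeholder):

./alist admin                    # show the admin username and password hints
./alist admin random             # reset the admin password to a random 8-character string
./alist admin set NEW_PASSWORD   # set an explicit admin password
./alist admin token              # print the admin token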

45
cmd/cancel2FA.go Normal file
View File

@ -0,0 +1,45 @@
/*
Copyright © 2022 NAME HERE <EMAIL ADDRESS>
*/
package cmd
import (
"github.com/alist-org/alist/v3/internal/op"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/spf13/cobra"
)
// Cancel2FACmd represents the delete2fa command
var Cancel2FACmd = &cobra.Command{
Use: "cancel2fa",
Short: "Delete 2FA of admin user",
Run: func(cmd *cobra.Command, args []string) {
Init()
admin, err := op.GetAdmin()
if err != nil {
utils.Log.Errorf("failed to get admin user: %+v", err)
} else {
err := op.Cancel2FAByUser(admin)
if err != nil {
utils.Log.Errorf("failed to cancel 2FA: %+v", err)
} else {
utils.Log.Info("2FA canceled")
DelAdminCacheOnline()
}
}
},
}
func init() {
RootCmd.AddCommand(Cancel2FACmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// cancel2FACmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// cancel2FACmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}

44
cmd/common.go Normal file
View File

@ -0,0 +1,44 @@
package cmd
import (
"os"
"path/filepath"
"strconv"
"github.com/alist-org/alist/v3/internal/bootstrap"
"github.com/alist-org/alist/v3/internal/bootstrap/data"
"github.com/alist-org/alist/v3/pkg/utils"
log "github.com/sirupsen/logrus"
)
func Init() {
bootstrap.InitConfig()
bootstrap.Log()
bootstrap.InitDB()
data.InitData()
bootstrap.InitIndex()
}
var pid = -1
var pidFile string
func initDaemon() {
ex, err := os.Executable()
if err != nil {
log.Fatal(err)
}
exPath := filepath.Dir(ex)
_ = os.MkdirAll(filepath.Join(exPath, "daemon"), 0700)
pidFile = filepath.Join(exPath, "daemon/pid")
if utils.Exists(pidFile) {
bytes, err := os.ReadFile(pidFile)
if err != nil {
log.Fatal("failed to read pid file", err)
}
id, err := strconv.Atoi(string(bytes))
if err != nil {
log.Fatal("failed to parse pid data", err)
}
pid = id
}
}

10
cmd/flags/config.go Normal file
View File

@ -0,0 +1,10 @@
package flags
var (
DataDir string
Debug bool
NoPrefix bool
Dev bool
ForceBinDir bool
LogStd bool
)

161
cmd/lang.go Normal file
View File

@ -0,0 +1,161 @@
/*
Package cmd
Copyright © 2022 Noah Hsu<i@nn.ci>
*/
package cmd
import (
"fmt"
"io"
"os"
"reflect"
"strings"
_ "github.com/alist-org/alist/v3/drivers"
"github.com/alist-org/alist/v3/internal/bootstrap/data"
"github.com/alist-org/alist/v3/internal/conf"
"github.com/alist-org/alist/v3/internal/op"
"github.com/alist-org/alist/v3/pkg/utils"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
)
type KV[V any] map[string]V
type Drivers KV[KV[interface{}]]
func firstUpper(s string) string {
if s == "" {
return ""
}
return strings.ToUpper(s[:1]) + s[1:]
}
func convert(s string) string {
ss := strings.Split(s, "_")
ans := strings.Join(ss, " ")
return firstUpper(ans)
}
func writeFile(name string, data interface{}) {
f, err := os.Open(fmt.Sprintf("../alist-web/src/lang/en/%s.json", name))
if err != nil {
log.Errorf("failed to open %s.json: %+v", name, err)
return
}
defer f.Close()
content, err := io.ReadAll(f)
if err != nil {
log.Errorf("failed to read %s.json: %+v", name, err)
return
}
oldData := make(map[string]interface{})
newData := make(map[string]interface{})
err = utils.Json.Unmarshal(content, &oldData)
if err != nil {
log.Errorf("failed to unmarshal %s.json: %+v", name, err)
return
}
content, err = utils.Json.Marshal(data)
if err != nil {
log.Errorf("failed to marshal json: %+v", err)
return
}
err = utils.Json.Unmarshal(content, &newData)
if err != nil {
log.Errorf("failed to unmarshal json: %+v", err)
return
}
if reflect.DeepEqual(oldData, newData) {
log.Infof("%s.json no changed, skip", name)
} else {
log.Infof("%s.json changed, update file", name)
//log.Infof("old: %+v\nnew:%+v", oldData, data)
utils.WriteJsonToFile(fmt.Sprintf("lang/%s.json", name), newData, true)
}
}
func generateDriversJson() {
drivers := make(Drivers)
drivers["drivers"] = make(KV[interface{}])
drivers["config"] = make(KV[interface{}])
driverInfoMap := op.GetDriverInfoMap()
for k, v := range driverInfoMap {
drivers["drivers"][k] = convert(k)
items := make(KV[interface{}])
config := map[string]string{}
if v.Config.Alert != "" {
alert := strings.SplitN(v.Config.Alert, "|", 2)
if len(alert) > 1 {
config["alert"] = alert[1]
}
}
drivers["config"][k] = config
for i := range v.Additional {
item := v.Additional[i]
items[item.Name] = convert(item.Name)
if item.Help != "" {
items[fmt.Sprintf("%s-tips", item.Name)] = item.Help
}
if item.Type == conf.TypeSelect && len(item.Options) > 0 {
options := make(KV[string])
_options := strings.Split(item.Options, ",")
for _, o := range _options {
options[o] = convert(o)
}
items[fmt.Sprintf("%ss", item.Name)] = options
}
}
drivers[k] = items
}
writeFile("drivers", drivers)
}
func generateSettingsJson() {
settings := data.InitialSettings()
settingsLang := make(KV[any])
for _, setting := range settings {
settingsLang[setting.Key] = convert(setting.Key)
if setting.Help != "" {
settingsLang[fmt.Sprintf("%s-tips", setting.Key)] = setting.Help
}
if setting.Type == conf.TypeSelect && len(setting.Options) > 0 {
options := make(KV[string])
_options := strings.Split(setting.Options, ",")
for _, o := range _options {
options[o] = convert(o)
}
settingsLang[fmt.Sprintf("%ss", setting.Key)] = options
}
}
writeFile("settings", settingsLang)
//utils.WriteJsonToFile("lang/settings.json", settingsLang)
}
// LangCmd represents the lang command
var LangCmd = &cobra.Command{
Use: "lang",
Short: "Generate language json file",
Run: func(cmd *cobra.Command, args []string) {
err := os.MkdirAll("lang", 0777)
if err != nil {
utils.Log.Fatal("failed create folder: %s", err.Error())
}
generateDriversJson()
generateSettingsJson()
},
}
func init() {
RootCmd.AddCommand(LangCmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// langCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// langCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}
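As a usage note, the command reads the existing English files from ../alist-web and writes its output under ./lang in the working directory, so it is expected to be run from an alist checkout sitting next to alist-web:

./alist lang    # regenerates lang/drivers.json and lang/settings.json when they have changed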

32
cmd/restart.go Normal file
View File

@ -0,0 +1,32 @@
/*
Copyright © 2022 NAME HERE <EMAIL ADDRESS>
*/
package cmd
import (
"github.com/spf13/cobra"
)
// RestartCmd represents the restart command
var RestartCmd = &cobra.Command{
Use: "restart",
Short: "Restart alist server by daemon/pid file",
Run: func(cmd *cobra.Command, args []string) {
stop()
start()
},
}
func init() {
RootCmd.AddCommand(RestartCmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// restartCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// restartCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}

33
cmd/root.go Normal file
View File

@ -0,0 +1,33 @@
package cmd
import (
"fmt"
"os"
"github.com/alist-org/alist/v3/cmd/flags"
"github.com/spf13/cobra"
)
var RootCmd = &cobra.Command{
Use: "alist",
Short: "A file list program that supports multiple storage.",
Long: `A file list program that supports multiple storage,
built with love by Xhofe and friends in Go/Solid.js.
Complete documentation is available at https://alist.nn.ci/`,
}
func Execute() {
if err := RootCmd.Execute(); err != nil {
fmt.Fprintln(os.Stderr, err)
os.Exit(1)
}
}
func init() {
RootCmd.PersistentFlags().StringVar(&flags.DataDir, "data", "data", "data folder")
RootCmd.PersistentFlags().BoolVar(&flags.Debug, "debug", false, "start with debug mode")
RootCmd.PersistentFlags().BoolVar(&flags.NoPrefix, "no-prefix", false, "disable env prefix")
RootCmd.PersistentFlags().BoolVar(&flags.Dev, "dev", false, "start with dev mode")
RootCmd.PersistentFlags().BoolVar(&flags.ForceBinDir, "force-bin-dir", false, "Force to use the directory where the binary file is located as data directory")
RootCmd.PersistentFlags().BoolVar(&flags.LogStd, "log-std", false, "Force to log to std")
}
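The persistent flags above apply to every subcommand; a few illustrative combinations (paths are placeholders):

./alist server --data /opt/alist/data   # use a custom data folder
./alist server --debug --log-std        # debug logging forced to stdout
./alist server --force-bin-dir          # keep data next to the binary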

160
cmd/server.go Normal file
View File

@ -0,0 +1,160 @@
package cmd
import (
"context"
"fmt"
"net"
"net/http"
"os"
"os/signal"
"strconv"
"sync"
"syscall"
"time"
"github.com/alist-org/alist/v3/cmd/flags"
_ "github.com/alist-org/alist/v3/drivers"
"github.com/alist-org/alist/v3/internal/bootstrap"
"github.com/alist-org/alist/v3/internal/conf"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/alist-org/alist/v3/server"
"github.com/gin-gonic/gin"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
)
// ServerCmd represents the server command
var ServerCmd = &cobra.Command{
Use: "server",
Short: "Start the server at the specified address",
Long: `Start the server at the specified address
the address is defined in config file`,
Run: func(cmd *cobra.Command, args []string) {
Init()
if conf.Conf.DelayedStart != 0 {
utils.Log.Infof("delayed start for %d seconds", conf.Conf.DelayedStart)
time.Sleep(time.Duration(conf.Conf.DelayedStart) * time.Second)
}
bootstrap.InitAria2()
bootstrap.InitQbittorrent()
bootstrap.LoadStorages()
if !flags.Debug && !flags.Dev {
gin.SetMode(gin.ReleaseMode)
}
r := gin.New()
r.Use(gin.LoggerWithWriter(log.StandardLogger().Out), gin.RecoveryWithWriter(log.StandardLogger().Out))
server.Init(r)
var httpSrv, httpsSrv, unixSrv *http.Server
if conf.Conf.Scheme.HttpPort != -1 {
httpBase := fmt.Sprintf("%s:%d", conf.Conf.Scheme.Address, conf.Conf.Scheme.HttpPort)
utils.Log.Infof("start HTTP server @ %s", httpBase)
httpSrv = &http.Server{Addr: httpBase, Handler: r}
go func() {
err := httpSrv.ListenAndServe()
if err != nil && err != http.ErrServerClosed {
utils.Log.Fatalf("failed to start http: %s", err.Error())
}
}()
}
if conf.Conf.Scheme.HttpsPort != -1 {
httpsBase := fmt.Sprintf("%s:%d", conf.Conf.Scheme.Address, conf.Conf.Scheme.HttpsPort)
utils.Log.Infof("start HTTPS server @ %s", httpsBase)
httpsSrv = &http.Server{Addr: httpsBase, Handler: r}
go func() {
err := httpsSrv.ListenAndServeTLS(conf.Conf.Scheme.CertFile, conf.Conf.Scheme.KeyFile)
if err != nil && err != http.ErrServerClosed {
utils.Log.Fatalf("failed to start https: %s", err.Error())
}
}()
}
if conf.Conf.Scheme.UnixFile != "" {
utils.Log.Infof("start unix server @ %s", conf.Conf.Scheme.UnixFile)
unixSrv = &http.Server{Handler: r}
go func() {
listener, err := net.Listen("unix", conf.Conf.Scheme.UnixFile)
if err != nil {
utils.Log.Fatalf("failed to listen unix: %+v", err)
}
// set socket file permission
mode, err := strconv.ParseUint(conf.Conf.Scheme.UnixFilePerm, 8, 32)
if err != nil {
utils.Log.Errorf("failed to parse socket file permission: %+v", err)
} else {
err = os.Chmod(conf.Conf.Scheme.UnixFile, os.FileMode(mode))
if err != nil {
utils.Log.Errorf("failed to chmod socket file: %+v", err)
}
}
err = unixSrv.Serve(listener)
if err != nil && err != http.ErrServerClosed {
utils.Log.Fatalf("failed to start unix: %s", err.Error())
}
}()
}
// Wait for interrupt signal to gracefully shutdown the server with
// a timeout of 1 second.
quit := make(chan os.Signal, 1)
// kill (no params) sends syscall.SIGTERM by default
// kill -2 is syscall.SIGINT
// kill -9 is syscall.SIGKILL, which cannot be caught, so there is no need to handle it
signal.Notify(quit, syscall.SIGINT, syscall.SIGTERM)
<-quit
utils.Log.Println("Shutdown server...")
ctx, cancel := context.WithTimeout(context.Background(), 1*time.Second)
defer cancel()
var wg sync.WaitGroup
if conf.Conf.Scheme.HttpPort != -1 {
wg.Add(1)
go func() {
defer wg.Done()
if err := httpSrv.Shutdown(ctx); err != nil {
utils.Log.Fatal("HTTP server shutdown err: ", err)
}
}()
}
if conf.Conf.Scheme.HttpsPort != -1 {
wg.Add(1)
go func() {
defer wg.Done()
if err := httpsSrv.Shutdown(ctx); err != nil {
utils.Log.Fatal("HTTPS server shutdown err: ", err)
}
}()
}
if conf.Conf.Scheme.UnixFile != "" {
wg.Add(1)
go func() {
defer wg.Done()
if err := unixSrv.Shutdown(ctx); err != nil {
utils.Log.Fatal("Unix server shutdown err: ", err)
}
}()
}
wg.Wait()
utils.Log.Println("Server exit")
},
}
func init() {
RootCmd.AddCommand(ServerCmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// serverCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// serverCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}
// OutAlistInit exposes a function that lets external code start the server
func OutAlistInit() {
var (
cmd *cobra.Command
args []string
)
ServerCmd.Run(cmd, args)
}

71
cmd/start.go Normal file
View File

@ -0,0 +1,71 @@
/*
Copyright © 2022 NAME HERE <EMAIL ADDRESS>
*/
package cmd
import (
"os"
"os/exec"
"path/filepath"
"strconv"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
)
// StartCmd represents the start command
var StartCmd = &cobra.Command{
Use: "start",
Short: "Silent start alist server with `--force-bin-dir`",
Run: func(cmd *cobra.Command, args []string) {
start()
},
}
func start() {
initDaemon()
if pid != -1 {
_, err := os.FindProcess(pid)
if err == nil {
log.Info("alist already started, pid ", pid)
return
}
}
args := os.Args
args[1] = "server"
args = append(args, "--force-bin-dir")
cmd := &exec.Cmd{
Path: args[0],
Args: args,
Env: os.Environ(),
}
stdout, err := os.OpenFile(filepath.Join(filepath.Dir(pidFile), "start.log"), os.O_WRONLY|os.O_APPEND|os.O_CREATE, 0666)
if err != nil {
log.Fatal(os.Getpid(), ": failed to open start log file:", err)
}
cmd.Stderr = stdout
cmd.Stdout = stdout
err = cmd.Start()
if err != nil {
log.Fatal("failed to start children process: ", err)
}
log.Infof("success start pid: %d", cmd.Process.Pid)
err = os.WriteFile(pidFile, []byte(strconv.Itoa(cmd.Process.Pid)), 0666)
if err != nil {
log.Warn("failed to record pid, you may not be able to stop the program with `./alist stop`")
}
}
func init() {
RootCmd.AddCommand(StartCmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// startCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// startCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}

58
cmd/stop.go Normal file
View File

@ -0,0 +1,58 @@
/*
Copyright © 2022 NAME HERE <EMAIL ADDRESS>
*/
package cmd
import (
"os"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
)
// StopCmd represents the stop command
var StopCmd = &cobra.Command{
Use: "stop",
Short: "Stop alist server by daemon/pid file",
Run: func(cmd *cobra.Command, args []string) {
stop()
},
}
func stop() {
initDaemon()
if pid == -1 {
log.Info("Seems not have been started. Try use `alist start` to start server.")
return
}
process, err := os.FindProcess(pid)
if err != nil {
log.Errorf("failed to find process by pid: %d, reason: %v", pid, process)
return
}
err = process.Kill()
if err != nil {
log.Errorf("failed to kill process %d: %v", pid, err)
} else {
log.Info("killed process: ", pid)
}
err = os.Remove(pidFile)
if err != nil {
log.Errorf("failed to remove pid file")
}
pid = -1
}
func init() {
RootCmd.AddCommand(StopCmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// stopCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// stopCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}
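start, stop and restart manage the server through the pid file written under daemon/ next to the binary; a quick sketch:

./alist start      # spawn "alist server --force-bin-dir" in the background and record its pid
./alist stop       # kill the process recorded in daemon/pid
./alist restart    # stop, then start again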

52
cmd/storage.go Normal file
View File

@ -0,0 +1,52 @@
/*
Copyright © 2023 NAME HERE <EMAIL ADDRESS>
*/
package cmd
import (
"github.com/alist-org/alist/v3/internal/db"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/spf13/cobra"
)
// storageCmd represents the storage command
var storageCmd = &cobra.Command{
Use: "storage",
Short: "Manage storage",
}
func init() {
var mountPath string
var disable = &cobra.Command{
Use: "disable",
Short: "Disable a storage",
Run: func(cmd *cobra.Command, args []string) {
Init()
storage, err := db.GetStorageByMountPath(mountPath)
if err != nil {
utils.Log.Errorf("failed to query storage: %+v", err)
} else {
storage.Disabled = true
err = db.UpdateStorage(storage)
if err != nil {
utils.Log.Errorf("failed to update storage: %+v", err)
} else {
utils.Log.Infof("Storage with mount path [%s] have been disabled", mountPath)
}
}
},
}
disable.Flags().StringVarP(&mountPath, "mount-path", "m", "", "The mountPath of storage")
RootCmd.AddCommand(storageCmd)
storageCmd.AddCommand(disable)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// storageCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// storageCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}
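For example, to disable a storage by its mount path (the path is a placeholder):

./alist storage disable --mount-path "/115"
./alist storage disable -m "/115"    # short form of the same flag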

52
cmd/user.go Normal file
View File

@ -0,0 +1,52 @@
package cmd
import (
"crypto/tls"
"fmt"
"time"
"github.com/alist-org/alist/v3/internal/conf"
"github.com/alist-org/alist/v3/internal/op"
"github.com/alist-org/alist/v3/internal/setting"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/go-resty/resty/v2"
)
func DelAdminCacheOnline() {
admin, err := op.GetAdmin()
if err != nil {
utils.Log.Errorf("[del_admin_cache] get admin error: %+v", err)
return
}
DelUserCacheOnline(admin.Username)
}
func DelUserCacheOnline(username string) {
client := resty.New().SetTimeout(1 * time.Second).SetTLSClientConfig(&tls.Config{InsecureSkipVerify: conf.Conf.TlsInsecureSkipVerify})
token := setting.GetStr(conf.Token)
port := conf.Conf.Scheme.HttpPort
u := fmt.Sprintf("http://localhost:%d/api/admin/user/del_cache", port)
if port == -1 {
if conf.Conf.Scheme.HttpsPort == -1 {
utils.Log.Warnf("[del_user_cache] no open port")
return
}
u = fmt.Sprintf("https://localhost:%d/api/admin/user/del_cache", conf.Conf.Scheme.HttpsPort)
}
res, err := client.R().SetHeader("Authorization", token).SetQueryParam("username", username).Post(u)
if err != nil {
utils.Log.Warnf("[del_user_cache_online] failed: %+v", err)
return
}
if res.StatusCode() != 200 {
utils.Log.Warnf("[del_user_cache_online] failed: %+v", res.String())
return
}
code := utils.Json.Get(res.Body(), "code").ToInt()
msg := utils.Json.Get(res.Body(), "message").ToString()
if code != 200 {
utils.Log.Errorf("[del_user_cache_online] error: %s", msg)
return
}
utils.Log.Debugf("[del_user_cache_online] del user [%s] cache success", username)
}

43
cmd/version.go Normal file
View File

@ -0,0 +1,43 @@
/*
Copyright © 2022 NAME HERE <EMAIL ADDRESS>
*/
package cmd
import (
"fmt"
"os"
"github.com/alist-org/alist/v3/internal/conf"
"github.com/spf13/cobra"
)
// VersionCmd represents the version command
var VersionCmd = &cobra.Command{
Use: "version",
Short: "Show current version of AList",
Run: func(cmd *cobra.Command, args []string) {
fmt.Printf(`Built At: %s
Go Version: %s
Author: %s
Commit ID: %s
Version: %s
WebVersion: %s
`,
conf.BuiltAt, conf.GoVersion, conf.GitAuthor, conf.GitCommit, conf.Version, conf.WebVersion)
os.Exit(0)
},
}
func init() {
RootCmd.AddCommand(VersionCmd)
// Here you will define your flags and configuration settings.
// Cobra supports Persistent Flags which will work for this command
// and all subcommands, e.g.:
// versionCmd.PersistentFlags().String("foo", "", "A help for foo")
// Cobra supports local flags which will only run when this command
// is called directly, e.g.:
// versionCmd.Flags().BoolP("toggle", "t", false, "Help message for toggle")
}

View File

@ -1,49 +0,0 @@
package conf
type Database struct {
Type string `json:"type"`
User string `json:"user"`
Password string `json:"password"`
Host string `json:"host"`
Port int `json:"port"`
Name string `json:"name"`
TablePrefix string `json:"table_prefix"`
DBFile string `json:"db_file"`
}
type Scheme struct {
Https bool `json:"https"`
CertFile string `json:"cert_file"`
KeyFile string `json:"key_file"`
}
type CacheConfig struct {
Expiration int64 `json:"expiration"`
CleanupInterval int64 `json:"cleanup_interval"`
}
type Config struct {
Address string `json:"address"`
Port int `json:"port"`
Local bool `json:"local"`
Database Database `json:"database"`
Scheme Scheme `json:"scheme"`
Cache CacheConfig `json:"cache"`
}
func DefaultConfig() *Config {
return &Config{
Address: "0.0.0.0",
Port: 5244,
Database: Database{
Type: "sqlite3",
Port: 0,
TablePrefix: "x_",
DBFile: "data/data.db",
},
Cache: CacheConfig{
Expiration: 60,
CleanupInterval: 120,
},
}
}

View File

@ -1,11 +0,0 @@
package conf
const (
UNKNOWN = iota
FOLDER
OFFICE
VIDEO
AUDIO
TEXT
IMAGE
)

View File

@ -1,53 +0,0 @@
package conf
import (
"context"
"github.com/eko/gocache/v2/cache"
"github.com/robfig/cron/v3"
"gorm.io/gorm"
)
var (
BuiltAt string
GoVersion string
GitAuthor string
GitCommit string
GitTag string = "dev"
)
var (
ConfigFile string // config file
Conf *Config
Debug bool
Version bool
Password bool
DB *gorm.DB
Cache *cache.Cache
Ctx = context.TODO()
Cron *cron.Cron
)
var (
TextTypes = []string{"txt", "htm", "html", "xml", "java", "properties", "sql",
"js", "md", "json", "conf", "ini", "vue", "php", "py", "bat", "gitignore", "yml",
"go", "sh", "c", "cpp", "h", "hpp", "tsx", "vtt", "srt", "ass"}
OfficeTypes = []string{"doc", "docx", "xls", "xlsx", "ppt", "pptx", "pdf"}
VideoTypes = []string{"mp4", "mkv", "avi", "mov", "rmvb", "webm"}
AudioTypes = []string{"mp3", "flac", "ogg", "m4a", "wav"}
ImageTypes = []string{"jpg", "tiff", "jpeg", "png", "gif", "bmp", "svg", "ico"}
)
// settings
var (
RawIndexHtml string
IndexHtml string
CheckParent bool
CheckDown bool
Token string
DavUsername string
DavPassword string
VisitorDavUsername string
VisitorDavPassword string
)

16
docker-compose.yml Normal file
View File

@ -0,0 +1,16 @@
version: '3.3'
services:
alist:
restart: always
volumes:
- '/etc/alist:/opt/alist/data'
ports:
- '5244:5244'
- '5245:5245'
environment:
- PUID=0
- PGID=0
- UMASK=022
- TZ=UTC
container_name: alist
image: 'xhofe/alist:latest'
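A minimal way to bring this compose file up and inspect the first-start output (plain Docker CLI commands):

docker compose up -d    # start the alist service defined above
docker logs alist       # first-start output, including the initial admin password, appears here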

97
drivers/115/driver.go Normal file
View File

@ -0,0 +1,97 @@
package _115
import (
"context"
"os"
driver115 "github.com/SheltonZhu/115driver/pkg/driver"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/pkg/errors"
)
type Pan115 struct {
model.Storage
Addition
client *driver115.Pan115Client
}
func (d *Pan115) Config() driver.Config {
return config
}
func (d *Pan115) GetAddition() driver.Additional {
return &d.Addition
}
func (d *Pan115) Init(ctx context.Context) error {
return d.login()
}
func (d *Pan115) Drop(ctx context.Context) error {
return nil
}
func (d *Pan115) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
files, err := d.getFiles(dir.GetID())
if err != nil && !errors.Is(err, driver115.ErrNotExist) {
return nil, err
}
return utils.SliceConvert(files, func(src driver115.File) (model.Obj, error) {
return src, nil
})
}
func (d *Pan115) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
downloadInfo, err := d.client.
SetUserAgent(driver115.UA115Browser).
Download(file.(driver115.File).PickCode)
// recover for upload
d.client.SetUserAgent(driver115.UA115Desktop)
if err != nil {
return nil, err
}
link := &model.Link{
URL: downloadInfo.Url.Url,
Header: downloadInfo.Header,
}
return link, nil
}
func (d *Pan115) MakeDir(ctx context.Context, parentDir model.Obj, dirName string) error {
if _, err := d.client.Mkdir(parentDir.GetID(), dirName); err != nil {
return err
}
return nil
}
func (d *Pan115) Move(ctx context.Context, srcObj, dstDir model.Obj) error {
return d.client.Move(dstDir.GetID(), srcObj.GetID())
}
func (d *Pan115) Rename(ctx context.Context, srcObj model.Obj, newName string) error {
return d.client.Rename(srcObj.GetID(), newName)
}
func (d *Pan115) Copy(ctx context.Context, srcObj, dstDir model.Obj) error {
return d.client.Copy(dstDir.GetID(), srcObj.GetID())
}
func (d *Pan115) Remove(ctx context.Context, obj model.Obj) error {
return d.client.Delete(obj.GetID())
}
func (d *Pan115) Put(ctx context.Context, dstDir model.Obj, stream model.FileStreamer, up driver.UpdateProgress) error {
tempFile, err := utils.CreateTempFile(stream.GetReadCloser(), stream.GetSize())
if err != nil {
return err
}
defer func() {
_ = tempFile.Close()
_ = os.Remove(tempFile.Name())
}()
return d.client.UploadFastOrByMultipart(dstDir.GetID(), stream.GetName(), stream.GetSize(), tempFile)
}
var _ driver.Driver = (*Pan115)(nil)

27
drivers/115/meta.go Normal file
View File

@ -0,0 +1,27 @@
package _115
import (
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/op"
)
type Addition struct {
Cookie string `json:"cookie" type:"text" help:"one of QR code token and cookie required"`
QRCodeToken string `json:"qrcode_token" type:"text" help:"one of QR code token and cookie required"`
PageSize int64 `json:"page_size" type:"number" default:"56" help:"list api per page size of 115 driver"`
driver.RootID
}
var config = driver.Config{
Name: "115 Cloud",
DefaultRoot: "0",
OnlyProxy: true,
OnlyLocal: true,
NoOverwriteUpload: true,
}
func init() {
op.RegisterDriver(func() driver.Driver {
return &Pan115{}
})
}
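
init registers a constructor with op.RegisterDriver so the core can build the driver on demand. A standalone sketch of that registry pattern, using invented types rather than the real op package:

package main

import "fmt"

type Driver interface {
    Name() string
}

// registry collects driver factories, much like op.RegisterDriver does.
var registry []func() Driver

func RegisterDriver(create func() Driver) {
    registry = append(registry, create)
}

type demoDriver struct{}

func (demoDriver) Name() string { return "demo" }

func init() {
    RegisterDriver(func() Driver { return &demoDriver{} })
}

func main() {
    for _, create := range registry {
        fmt.Println("registered driver:", create().Name())
    }
}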

8
drivers/115/types.go Normal file
View File

@@ -0,0 +1,8 @@
package _115
import (
"github.com/SheltonZhu/115driver/pkg/driver"
"github.com/alist-org/alist/v3/internal/model"
)
var _ model.Obj = (*driver.File)(nil)

57
drivers/115/util.go Normal file
View File

@@ -0,0 +1,57 @@
package _115
import (
"crypto/tls"
"fmt"
"github.com/SheltonZhu/115driver/pkg/driver"
"github.com/alist-org/alist/v3/internal/conf"
"github.com/pkg/errors"
)
var UserAgent = driver.UA115Desktop
func (d *Pan115) login() error {
var err error
opts := []driver.Option{
driver.UA(UserAgent),
func(c *driver.Pan115Client) {
c.Client.SetTLSClientConfig(&tls.Config{InsecureSkipVerify: conf.Conf.TlsInsecureSkipVerify})
},
}
d.client = driver.New(opts...)
cr := &driver.Credential{}
if d.Addition.QRCodeToken != "" {
s := &driver.QRCodeSession{
UID: d.Addition.QRCodeToken,
}
if cr, err = d.client.QRCodeLogin(s); err != nil {
return errors.Wrap(err, "failed to login by qrcode")
}
d.Addition.Cookie = fmt.Sprintf("UID=%s;CID=%s;SEID=%s", cr.UID, cr.CID, cr.SEID)
d.Addition.QRCodeToken = ""
} else if d.Addition.Cookie != "" {
if err = cr.FromCookie(d.Addition.Cookie); err != nil {
return errors.Wrap(err, "failed to login by cookies")
}
d.client.ImportCredential(cr)
} else {
return errors.New("missing cookie or qrcode account")
}
return d.client.LoginCheck()
}
func (d *Pan115) getFiles(fileId string) ([]driver.File, error) {
res := make([]driver.File, 0)
if d.PageSize <= 0 {
d.PageSize = driver.FileListLimit
}
files, err := d.client.ListWithLimit(fileId, d.PageSize)
if err != nil {
return nil, err
}
for _, file := range *files {
res = append(res, file)
}
return res, nil
}
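
login assembles the 115 client from a slice of driver.Option values, one of which is an inline function that tweaks the TLS settings. A standalone sketch of that functional-options pattern with hypothetical types (not the real 115driver API):

package main

import "fmt"

type Client struct {
    UserAgent string
    Insecure  bool
}

// Option mutates a Client during construction, mirroring driver.Option above.
type Option func(*Client)

func UA(ua string) Option {
    return func(c *Client) { c.UserAgent = ua }
}

func New(opts ...Option) *Client {
    c := &Client{}
    for _, opt := range opts {
        opt(c)
    }
    return c
}

func main() {
    c := New(
        UA("Desktop/1.0"),
        func(c *Client) { c.Insecure = true }, // inline option, like the TLS tweak in login
    )
    fmt.Printf("%+v\n", *c)
}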

View File

@@ -1,232 +0,0 @@
package _23
import (
"crypto/hmac"
"crypto/sha256"
"errors"
"fmt"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/drivers/base"
"github.com/Xhofe/alist/model"
"github.com/Xhofe/alist/utils"
"github.com/go-resty/resty/v2"
jsoniter "github.com/json-iterator/go"
log "github.com/sirupsen/logrus"
"math/rand"
"path/filepath"
"strconv"
"time"
)
type BaseResp struct {
Code int `json:"code"`
Message string `json:"message"`
}
type Pan123TokenResp struct {
BaseResp
Data struct {
Token string `json:"token"`
} `json:"data"`
}
type Pan123File struct {
FileName string `json:"FileName"`
Size int64 `json:"Size"`
UpdateAt *time.Time `json:"UpdateAt"`
FileId int64 `json:"FileId"`
Type int `json:"Type"`
Etag string `json:"Etag"`
S3KeyFlag string `json:"S3KeyFlag"`
}
type Pan123Files struct {
BaseResp
Data struct {
InfoList []Pan123File `json:"InfoList"`
Next string `json:"Next"`
} `json:"data"`
}
type Pan123DownResp struct {
BaseResp
Data struct {
DownloadUrl string `json:"DownloadUrl"`
} `json:"data"`
}
func (driver Pan123) Login(account *model.Account) error {
url := "https://www.123pan.com/api/user/sign_in"
if account.APIProxyUrl != "" {
url = fmt.Sprintf("%s/%s", account.APIProxyUrl, url)
}
var resp Pan123TokenResp
_, err := base.RestyClient.R().
SetResult(&resp).
SetBody(base.Json{
"passport": account.Username,
"password": account.Password,
}).Post(url)
if err != nil {
return err
}
if resp.Code != 200 {
err = fmt.Errorf(resp.Message)
account.Status = resp.Message
} else {
account.Status = "work"
account.AccessToken = resp.Data.Token
}
_ = model.SaveAccount(account)
return err
}
func (driver Pan123) FormatFile(file *Pan123File) *model.File {
f := &model.File{
Id: strconv.FormatInt(file.FileId, 10),
Name: file.FileName,
Size: file.Size,
Driver: driver.Config().Name,
UpdatedAt: file.UpdateAt,
}
if file.Type == 1 {
f.Type = conf.FOLDER
} else {
f.Type = utils.GetFileType(filepath.Ext(file.FileName))
}
return f
}
func (driver Pan123) GetFiles(parentId string, account *model.Account) ([]Pan123File, error) {
next := "0"
res := make([]Pan123File, 0)
for next != "-1" {
var resp Pan123Files
query := map[string]string{
"driveId": "0",
"limit": "100",
"next": next,
"orderBy": account.OrderBy,
"orderDirection": account.OrderDirection,
"parentFileId": parentId,
"trashed": "false",
}
_, err := driver.Request("https://www.123pan.com/api/file/list",
base.Get, nil, query, nil, &resp, false, account)
if err != nil {
return nil, err
}
next = resp.Data.Next
res = append(res, resp.Data.InfoList...)
}
return res, nil
}
func (driver Pan123) Request(url string, method int, headers, query map[string]string, data *base.Json, resp interface{}, proxy bool, account *model.Account) ([]byte, error) {
rawUrl := url
if account.APIProxyUrl != "" && proxy {
url = fmt.Sprintf("%s/%s", account.APIProxyUrl, url)
}
req := base.RestyClient.R()
req.SetHeader("Authorization", "Bearer "+account.AccessToken)
if headers != nil {
req.SetHeaders(headers)
}
if query != nil {
req.SetQueryParams(query)
}
if data != nil {
req.SetBody(data)
}
if resp != nil {
req.SetResult(resp)
}
var res *resty.Response
var err error
switch method {
case base.Get:
res, err = req.Get(url)
case base.Post:
res, err = req.Post(url)
default:
return nil, base.ErrNotSupport
}
if err != nil {
return nil, err
}
log.Debug(res.String())
body := res.Body()
code := jsoniter.Get(body, "code").ToInt()
if code != 0 {
if code == 401 {
err := driver.Login(account)
if err != nil {
return nil, err
}
return driver.Request(rawUrl, method, headers, query, data, resp, proxy, account)
}
return nil, errors.New(jsoniter.Get(body, "message").ToString())
}
return body, nil
}
//func (driver Pan123) Post(url string, data base.Json, account *model.Account) ([]byte, error) {
// res, err := pan123Client.R().
// SetHeader("authorization", "Bearer "+account.AccessToken).
// SetBody(data).Post(url)
// if err != nil {
// return nil, err
// }
// body := res.Body()
// if jsoniter.Get(body, "code").ToInt() != 0 {
// return nil, errors.New(jsoniter.Get(body, "message").ToString())
// }
// return body, nil
//}
func (driver Pan123) GetFile(path string, account *model.Account) (*Pan123File, error) {
dir, name := filepath.Split(path)
dir = utils.ParsePath(dir)
_, err := driver.Files(dir, account)
if err != nil {
return nil, err
}
parentFiles_, _ := base.GetCache(dir, account)
parentFiles, _ := parentFiles_.([]Pan123File)
for _, file := range parentFiles {
if file.FileName == name {
if file.Type != conf.FOLDER {
return &file, err
} else {
return nil, base.ErrNotFile
}
}
}
return nil, base.ErrPathNotFound
}
func RandStr(length int) string {
str := "123456789abcdefghijklmnopqrstuvwxyz"
bytes := []byte(str)
var result []byte
rand.Seed(time.Now().UnixNano() + int64(rand.Intn(100)))
for i := 0; i < length; i++ {
result = append(result, bytes[rand.Intn(len(bytes))])
}
return string(result)
}
func HMAC(message string, secret string) string {
key := []byte(secret)
h := hmac.New(sha256.New, key)
h.Write([]byte(message))
// fmt.Println(h.Sum(nil))
//sha := hex.EncodeToString(h.Sum(nil))
// fmt.Println(sha)
//return sha
return string(h.Sum(nil))
}
func init() {
base.RegisterDriver(&Pan123{})
}

View File

@@ -1,355 +1,256 @@
package _23
package _123
import (
"context"
"crypto/md5"
"encoding/base64"
"encoding/hex"
"encoding/xml"
"fmt"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/drivers/base"
"github.com/Xhofe/alist/model"
"github.com/Xhofe/alist/utils"
"github.com/gin-gonic/gin"
jsoniter "github.com/json-iterator/go"
log "github.com/sirupsen/logrus"
"io"
"net/http"
"net/url"
"path/filepath"
"strconv"
"strings"
"time"
"os"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/errs"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/aws/aws-sdk-go/aws"
"github.com/aws/aws-sdk-go/aws/credentials"
"github.com/aws/aws-sdk-go/aws/session"
"github.com/aws/aws-sdk-go/service/s3/s3manager"
"github.com/go-resty/resty/v2"
log "github.com/sirupsen/logrus"
)
type Pan123 struct{}
func (driver Pan123) Config() base.DriverConfig {
return base.DriverConfig{
Name: "123Pan",
}
type Pan123 struct {
model.Storage
Addition
}
func (driver Pan123) Items() []base.Item {
return []base.Item{
{
Name: "username",
Label: "username",
Type: base.TypeString,
Required: true,
Description: "account username/phone number",
},
{
Name: "password",
Label: "password",
Type: base.TypeString,
Required: true,
Description: "account password",
},
{
Name: "root_folder",
Label: "root folder file_id",
Type: base.TypeString,
Required: false,
},
{
Name: "order_by",
Label: "order_by",
Type: base.TypeSelect,
Values: "name,fileId,updateAt,createAt",
Required: true,
},
{
Name: "order_direction",
Label: "order_direction",
Type: base.TypeSelect,
Values: "asc,desc",
Required: true,
},
}
func (d *Pan123) Config() driver.Config {
return config
}
func (driver Pan123) Save(account *model.Account, old *model.Account) error {
if account.RootFolder == "" {
account.RootFolder = "0"
}
err := driver.Login(account)
func (d *Pan123) GetAddition() driver.Additional {
return &d.Addition
}
func (d *Pan123) Init(ctx context.Context) error {
_, err := d.request(UserInfo, http.MethodGet, nil, nil)
return err
}
func (driver Pan123) File(path string, account *model.Account) (*model.File, error) {
path = utils.ParsePath(path)
if path == "/" {
return &model.File{
Id: account.RootFolder,
Name: account.Name,
Size: 0,
Type: conf.FOLDER,
Driver: driver.Config().Name,
UpdatedAt: account.UpdatedAt,
}, nil
}
dir, name := filepath.Split(path)
files, err := driver.Files(dir, account)
func (d *Pan123) Drop(ctx context.Context) error {
_, _ = d.request(Logout, http.MethodPost, func(req *resty.Request) {
req.SetBody(base.Json{})
}, nil)
return nil
}
func (d *Pan123) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
files, err := d.getFiles(dir.GetID())
if err != nil {
return nil, err
}
for _, file := range files {
if file.Name == name {
return &file, nil
}
}
return nil, base.ErrPathNotFound
return utils.SliceConvert(files, func(src File) (model.Obj, error) {
return src, nil
})
}
func (driver Pan123) Files(path string, account *model.Account) ([]model.File, error) {
path = utils.ParsePath(path)
var rawFiles []Pan123File
cache, err := base.GetCache(path, account)
if err == nil {
rawFiles, _ = cache.([]Pan123File)
func (d *Pan123) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
if f, ok := file.(File); ok {
//var resp DownResp
var headers map[string]string
if !utils.IsLocalIPAddr(args.IP) {
headers = map[string]string{
//"X-Real-IP": "1.1.1.1",
"X-Forwarded-For": args.IP,
}
}
data := base.Json{
"driveId": 0,
"etag": f.Etag,
"fileId": f.FileId,
"fileName": f.FileName,
"s3keyFlag": f.S3KeyFlag,
"size": f.Size,
"type": f.Type,
}
resp, err := d.request(DownloadInfo, http.MethodPost, func(req *resty.Request) {
req.SetBody(data).SetHeaders(headers)
}, nil)
if err != nil {
return nil, err
}
downloadUrl := utils.Json.Get(resp, "data", "DownloadUrl").ToString()
u, err := url.Parse(downloadUrl)
if err != nil {
return nil, err
}
nu := u.Query().Get("params")
if nu != "" {
du, _ := base64.StdEncoding.DecodeString(nu)
u, err = url.Parse(string(du))
if err != nil {
return nil, err
}
}
u_ := u.String()
log.Debug("download url: ", u_)
res, err := base.NoRedirectClient.R().SetHeader("Referer", "https://www.123pan.com/").Get(u_)
if err != nil {
return nil, err
}
log.Debug(res.String())
link := model.Link{
URL: u_,
}
log.Debugln("res code: ", res.StatusCode())
if res.StatusCode() == 302 {
link.URL = res.Header().Get("location")
} else if res.StatusCode() < 300 {
link.URL = utils.Json.Get(res.Body(), "data", "redirect_url").ToString()
}
link.Header = http.Header{
"Referer": []string{"https://www.123pan.com/"},
}
return &link, nil
} else {
file, err := driver.File(path, account)
if err != nil {
return nil, err
}
rawFiles, err = driver.GetFiles(file.Id, account)
if err != nil {
return nil, err
}
if len(rawFiles) > 0 {
_ = base.SetCache(path, rawFiles, account)
}
return nil, fmt.Errorf("can't convert obj")
}
files := make([]model.File, 0)
for _, file := range rawFiles {
files = append(files, *driver.FormatFile(&file))
}
return files, nil
}
func (driver Pan123) Link(args base.Args, account *model.Account) (*base.Link, error) {
log.Debugf("%+v", args)
file, err := driver.GetFile(utils.ParsePath(args.Path), account)
if err != nil {
return nil, err
}
var resp Pan123DownResp
var headers map[string]string
if args.IP != "" && args.IP != "::1" {
headers = map[string]string{
//"X-Real-IP": "1.1.1.1",
"X-Forwarded-For": args.IP,
}
}
data := base.Json{
"driveId": 0,
"etag": file.Etag,
"fileId": file.FileId,
"fileName": file.FileName,
"s3keyFlag": file.S3KeyFlag,
"size": file.Size,
"type": file.Type,
}
_, err = driver.Request("https://www.123pan.com/api/file/download_info",
base.Post, headers, nil, &data, &resp, false, account)
//_, err = pan123Client.R().SetResult(&resp).SetHeader("authorization", "Bearer "+account.AccessToken).
// SetBody().Post("https://www.123pan.com/api/file/download_info")
if err != nil {
return nil, err
}
u, err := url.Parse(resp.Data.DownloadUrl)
if err != nil {
return nil, err
}
u_ := fmt.Sprintf("https://%s%s", u.Host, u.Path)
res, err := base.NoRedirectClient.R().SetQueryParamsFromValues(u.Query()).Get(u_)
if err != nil {
return nil, err
}
log.Debug(res.String())
link := base.Link{
Url: resp.Data.DownloadUrl,
}
if res.StatusCode() == 302 {
link.Url = res.Header().Get("location")
}
return &link, nil
}
func (driver Pan123) Path(path string, account *model.Account) (*model.File, []model.File, error) {
path = utils.ParsePath(path)
log.Debugf("pan123 path: %s", path)
file, err := driver.File(path, account)
if err != nil {
return nil, nil, err
}
if !file.IsDir() {
return file, nil, nil
}
files, err := driver.Files(path, account)
if err != nil {
return nil, nil, err
}
return nil, files, nil
}
func (driver Pan123) Proxy(c *gin.Context, account *model.Account) {
c.Request.Header.Del("origin")
}
func (driver Pan123) Preview(path string, account *model.Account) (interface{}, error) {
return nil, base.ErrNotSupport
}
func (driver Pan123) MakeDir(path string, account *model.Account) error {
dir, name := filepath.Split(path)
parentFile, err := driver.File(dir, account)
if err != nil {
return err
}
if !parentFile.IsDir() {
return base.ErrNotFolder
}
parentFileId, _ := strconv.Atoi(parentFile.Id)
func (d *Pan123) MakeDir(ctx context.Context, parentDir model.Obj, dirName string) error {
data := base.Json{
"driveId": 0,
"etag": "",
"fileName": name,
"parentFileId": parentFileId,
"fileName": dirName,
"parentFileId": parentDir.GetID(),
"size": 0,
"type": 1,
}
_, err = driver.Request("https://www.123pan.com/api/file/upload_request",
base.Post, nil, nil, &data, nil, false, account)
//_, err = driver.Post("https://www.123pan.com/api/file/upload_request", data, account)
if err == nil {
_ = base.DeleteCache(dir, account)
}
_, err := d.request(Mkdir, http.MethodPost, func(req *resty.Request) {
req.SetBody(data)
}, nil)
return err
}
func (driver Pan123) Move(src string, dst string, account *model.Account) error {
srcDir, _ := filepath.Split(src)
dstDir, dstName := filepath.Split(dst)
srcFile, err := driver.File(src, account)
func (d *Pan123) Move(ctx context.Context, srcObj, dstDir model.Obj) error {
data := base.Json{
"fileIdList": []base.Json{{"FileId": srcObj.GetID()}},
"parentFileId": dstDir.GetID(),
}
_, err := d.request(Move, http.MethodPost, func(req *resty.Request) {
req.SetBody(data)
}, nil)
return err
}
func (d *Pan123) Rename(ctx context.Context, srcObj model.Obj, newName string) error {
data := base.Json{
"driveId": 0,
"fileId": srcObj.GetID(),
"fileName": newName,
}
_, err := d.request(Rename, http.MethodPost, func(req *resty.Request) {
req.SetBody(data)
}, nil)
return err
}
func (d *Pan123) Copy(ctx context.Context, srcObj, dstDir model.Obj) error {
return errs.NotSupport
}
func (d *Pan123) Remove(ctx context.Context, obj model.Obj) error {
if f, ok := obj.(File); ok {
data := base.Json{
"driveId": 0,
"operation": true,
"fileTrashInfoList": []File{f},
}
_, err := d.request(Trash, http.MethodPost, func(req *resty.Request) {
req.SetBody(data)
}, nil)
return err
} else {
return fmt.Errorf("can't convert obj")
}
}
func (d *Pan123) Put(ctx context.Context, dstDir model.Obj, stream model.FileStreamer, up driver.UpdateProgress) error {
// const DEFAULT int64 = 10485760
h := md5.New()
// need to calculate md5 of the full content
tempFile, err := utils.CreateTempFile(stream.GetReadCloser(), stream.GetSize())
if err != nil {
return err
}
fileId, _ := strconv.Atoi(srcFile.Id)
// rename
if srcDir == dstDir {
data := base.Json{
"driveId": 0,
"fileId": fileId,
"fileName": dstName,
}
_, err = driver.Request("https://www.123pan.com/api/file/rename",
base.Post, nil, nil, &data, nil, false, account)
//_, err = driver.Post("https://www.123pan.com/api/file/rename", data, account)
defer func() {
_ = tempFile.Close()
_ = os.Remove(tempFile.Name())
}()
if _, err = io.Copy(h, tempFile); err != nil {
return err
}
_, err = tempFile.Seek(0, io.SeekStart)
if err != nil {
return err
}
etag := hex.EncodeToString(h.Sum(nil))
data := base.Json{
"driveId": 0,
"duplicate": 2, // 2->覆盖 1->重命名 0->默认
"etag": etag,
"fileName": stream.GetName(),
"parentFileId": dstDir.GetID(),
"size": stream.GetSize(),
"type": 0,
}
var resp UploadResp
res, err := d.request(UploadRequest, http.MethodPost, func(req *resty.Request) {
req.SetBody(data).SetContext(ctx)
}, &resp)
if err != nil {
return err
}
log.Debugln("upload request res: ", string(res))
if resp.Data.Reuse || resp.Data.Key == "" {
return nil
}
if resp.Data.AccessKeyId == "" || resp.Data.SecretAccessKey == "" || resp.Data.SessionToken == "" {
err = d.newUpload(ctx, &resp, stream, tempFile, up)
return err
} else {
// move
dstDirFile, err := driver.File(dstDir, account)
cfg := &aws.Config{
Credentials: credentials.NewStaticCredentials(resp.Data.AccessKeyId, resp.Data.SecretAccessKey, resp.Data.SessionToken),
Region: aws.String("123pan"),
Endpoint: aws.String(resp.Data.EndPoint),
S3ForcePathStyle: aws.Bool(true),
}
s, err := session.NewSession(cfg)
if err != nil {
return err
}
parentFileId, _ := strconv.Atoi(dstDirFile.Id)
data := base.Json{
"fileId": fileId,
"parentFileId": parentFileId,
uploader := s3manager.NewUploader(s)
input := &s3manager.UploadInput{
Bucket: &resp.Data.Bucket,
Key: &resp.Data.Key,
Body: tempFile,
}
_, err = driver.Request("https://www.123pan.com/api/file/mod_pid",
base.Post, nil, nil, &data, nil, false, account)
//_, err = driver.Post("https://www.123pan.com/api/file/mod_pid", data, account)
_, err = uploader.UploadWithContext(ctx, input)
}
if err != nil {
_ = base.DeleteCache(srcDir, account)
_ = base.DeleteCache(dstDir, account)
return err
}
_, err = d.request(UploadComplete, http.MethodPost, func(req *resty.Request) {
req.SetBody(base.Json{
"fileId": resp.Data.FileId,
}).SetContext(ctx)
}, nil)
return err
}
func (driver Pan123) Copy(src string, dst string, account *model.Account) error {
return base.ErrNotSupport
}
func (driver Pan123) Delete(path string, account *model.Account) error {
file, err := driver.GetFile(path, account)
if err != nil {
return err
}
data := base.Json{
"driveId": 0,
"operation": true,
"fileTrashInfoList": file,
}
_, err = driver.Request("https://www.123pan.com/api/file/trash",
base.Post, nil, nil, &data, nil, false, account)
//_, err = driver.Post("https://www.123pan.com/api/file/trash", data, account)
if err == nil {
_ = base.DeleteCache(utils.Dir(path), account)
}
return err
}
type UploadResp struct {
XMLName xml.Name `xml:"InitiateMultipartUploadResult"`
Bucket string `xml:"Bucket"`
Key string `xml:"Key"`
UploadId string `xml:"UploadId"`
}
// TODO unfinished
func (driver Pan123) Upload(file *model.FileStream, account *model.Account) error {
return base.ErrNotImplement
parentFile, err := driver.File(file.ParentPath, account)
if err != nil {
return err
}
if !parentFile.IsDir() {
return base.ErrNotFolder
}
parentFileId, _ := strconv.Atoi(parentFile.Id)
data := base.Json{
"driveId": 0,
"duplicate": true,
"etag": RandStr(32), //maybe file's md5
"fileName": file.GetFileName(),
"parentFileId": parentFileId,
"size": file.GetSize(),
"type": 0,
}
res, err := driver.Request("https://www.123pan.com/api/file/upload_request",
base.Post, nil, nil, &data, nil, false, account)
//res, err := driver.Post("https://www.123pan.com/api/file/upload_request", data, account)
if err != nil {
return err
}
baseUrl := fmt.Sprintf("https://file.123pan.com/%s/%s", jsoniter.Get(res, "data.Bucket").ToString(), jsoniter.Get(res, "data.Key").ToString())
var resp UploadResp
kSecret := jsoniter.Get(res, "data.SecretAccessKey").ToString()
nowTimeStr := time.Now().String()
Date := strings.ReplaceAll(strings.Split(nowTimeStr, "T")[0], "-", "")
StringToSign := fmt.Sprintf("%s\n%s\n%s\n%s",
"AWS4-HMAC-SHA256",
nowTimeStr,
fmt.Sprintf("%s/us-east-1/s3/aws4_request", Date),
)
kDate := HMAC("AWS4"+kSecret, Date)
kRegion := HMAC(kDate, "us-east-1")
kService := HMAC(kRegion, "s3")
kSigning := HMAC(kService, "aws4_request")
_, err = base.RestyClient.R().SetResult(&resp).SetHeaders(map[string]string{
"Authorization": fmt.Sprintf("AWS4-HMAC-SHA256 Credential=%s/%s/us-east-1/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date;x-amz-security-token;x-amz-user-agent, Signature=%s",
jsoniter.Get(res, "data.AccessKeyId"),
Date,
hex.EncodeToString([]byte(HMAC(StringToSign, kSigning)))),
"X-Amz-Content-Sha256": "UNSIGNED-PAYLOAD",
"X-Amz-Date": nowTimeStr,
"x-amz-security-token": jsoniter.Get(res, "data.SessionToken").ToString(),
}).Post(fmt.Sprintf("%s?uploads", baseUrl))
if err != nil {
return err
}
return base.ErrNotImplement
}
var _ base.Driver = (*Pan123)(nil)
var _ driver.Driver = (*Pan123)(nil)
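
Put above buffers the stream in a temporary file, hashes it to obtain the MD5 etag that the upload request needs before any bytes are sent, and rewinds the file before uploading. A standalone sketch of that buffer-hash-rewind step (payload and temp-file name are hypothetical; TeeReader is used here to combine the copy and the hash):

package main

import (
    "crypto/md5"
    "encoding/hex"
    "fmt"
    "io"
    "os"
    "strings"
)

func main() {
    src := strings.NewReader("example payload")

    tmp, err := os.CreateTemp("", "upload-*")
    if err != nil {
        panic(err)
    }
    defer os.Remove(tmp.Name())
    defer tmp.Close()

    h := md5.New()
    // TeeReader feeds the hash while the bytes are copied into the temp file.
    if _, err := io.Copy(tmp, io.TeeReader(src, h)); err != nil {
        panic(err)
    }
    // Rewind so the same bytes can be uploaded afterwards.
    if _, err := tmp.Seek(0, io.SeekStart); err != nil {
        panic(err)
    }
    fmt.Println("etag:", hex.EncodeToString(h.Sum(nil)))
}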

26
drivers/123/meta.go Normal file
View File

@@ -0,0 +1,26 @@
package _123
import (
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/op"
)
type Addition struct {
Username string `json:"username" required:"true"`
Password string `json:"password" required:"true"`
driver.RootID
OrderBy string `json:"order_by" type:"select" options:"file_name,size,update_at" default:"file_name"`
OrderDirection string `json:"order_direction" type:"select" options:"asc,desc" default:"asc"`
AccessToken string
}
var config = driver.Config{
Name: "123Pan",
DefaultRoot: "0",
}
func init() {
op.RegisterDriver(func() driver.Driver {
return &Pan123{}
})
}

113
drivers/123/types.go Normal file
View File

@@ -0,0 +1,113 @@
package _123
import (
"net/url"
"path"
"strconv"
"strings"
"time"
"github.com/alist-org/alist/v3/internal/model"
)
type File struct {
FileName string `json:"FileName"`
Size int64 `json:"Size"`
UpdateAt time.Time `json:"UpdateAt"`
FileId int64 `json:"FileId"`
Type int `json:"Type"`
Etag string `json:"Etag"`
S3KeyFlag string `json:"S3KeyFlag"`
DownloadUrl string `json:"DownloadUrl"`
}
func (f File) GetPath() string {
return ""
}
func (f File) GetSize() int64 {
return f.Size
}
func (f File) GetName() string {
return f.FileName
}
func (f File) ModTime() time.Time {
return f.UpdateAt
}
func (f File) IsDir() bool {
return f.Type == 1
}
func (f File) GetID() string {
return strconv.FormatInt(f.FileId, 10)
}
func (f File) Thumb() string {
if f.DownloadUrl == "" {
return ""
}
du, err := url.Parse(f.DownloadUrl)
if err != nil {
return ""
}
du.Path = strings.TrimSuffix(du.Path, "_24_24") + "_70_70"
query := du.Query()
query.Set("w", "70")
query.Set("h", "70")
if !query.Has("type") {
query.Set("type", strings.TrimPrefix(path.Base(f.FileName), "."))
}
if !query.Has("trade_key") {
query.Set("trade_key", "123pan-thumbnail")
}
du.RawQuery = query.Encode()
return du.String()
}
var _ model.Obj = (*File)(nil)
var _ model.Thumb = (*File)(nil)
//func (f File) Thumb() string {
//
//}
//var _ model.Thumb = (*File)(nil)
type Files struct {
//BaseResp
Data struct {
InfoList []File `json:"InfoList"`
Next string `json:"Next"`
} `json:"data"`
}
//type DownResp struct {
// //BaseResp
// Data struct {
// DownloadUrl string `json:"DownloadUrl"`
// } `json:"data"`
//}
type UploadResp struct {
//BaseResp
Data struct {
AccessKeyId string `json:"AccessKeyId"`
Bucket string `json:"Bucket"`
Key string `json:"Key"`
SecretAccessKey string `json:"SecretAccessKey"`
SessionToken string `json:"SessionToken"`
FileId int64 `json:"FileId"`
Reuse bool `json:"Reuse"`
EndPoint string `json:"EndPoint"`
StorageNode string `json:"StorageNode"`
UploadId string `json:"UploadId"`
} `json:"data"`
}
type S3PreSignedURLs struct {
Data struct {
PreSignedUrls map[string]string `json:"presignedUrls"`
} `json:"data"`
}
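
File.Thumb above rewrites the stored download URL into a 70x70 thumbnail URL by swapping the path suffix and forcing the w/h query parameters. A standalone sketch of that rewrite with an invented URL:

package main

import (
    "fmt"
    "net/url"
    "strings"
)

func main() {
    raw := "https://cdn.example.com/thumbnail/abc_24_24?type=jpg"

    du, err := url.Parse(raw)
    if err != nil {
        panic(err)
    }
    // Swap the 24x24 suffix for the 70x70 one and pin the size parameters.
    du.Path = strings.TrimSuffix(du.Path, "_24_24") + "_70_70"
    q := du.Query()
    q.Set("w", "70")
    q.Set("h", "70")
    du.RawQuery = q.Encode()

    fmt.Println(du.String())
}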

155
drivers/123/upload.go Normal file
View File

@@ -0,0 +1,155 @@
package _123
import (
"context"
"fmt"
"io"
"math"
"net/http"
"strconv"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/go-resty/resty/v2"
)
func (d *Pan123) getS3PreSignedUrls(ctx context.Context, upReq *UploadResp, start, end int) (*S3PreSignedURLs, error) {
data := base.Json{
"bucket": upReq.Data.Bucket,
"key": upReq.Data.Key,
"partNumberEnd": end,
"partNumberStart": start,
"uploadId": upReq.Data.UploadId,
"StorageNode": upReq.Data.StorageNode,
}
var s3PreSignedUrls S3PreSignedURLs
_, err := d.request(S3PreSignedUrls, http.MethodPost, func(req *resty.Request) {
req.SetBody(data).SetContext(ctx)
}, &s3PreSignedUrls)
if err != nil {
return nil, err
}
return &s3PreSignedUrls, nil
}
func (d *Pan123) getS3Auth(ctx context.Context, upReq *UploadResp, start, end int) (*S3PreSignedURLs, error) {
data := base.Json{
"StorageNode": upReq.Data.StorageNode,
"bucket": upReq.Data.Bucket,
"key": upReq.Data.Key,
"partNumberEnd": end,
"partNumberStart": start,
"uploadId": upReq.Data.UploadId,
}
var s3PreSignedUrls S3PreSignedURLs
_, err := d.request(S3Auth, http.MethodPost, func(req *resty.Request) {
req.SetBody(data).SetContext(ctx)
}, &s3PreSignedUrls)
if err != nil {
return nil, err
}
return &s3PreSignedUrls, nil
}
func (d *Pan123) completeS3(ctx context.Context, upReq *UploadResp, file model.FileStreamer, isMultipart bool) error {
data := base.Json{
"StorageNode": upReq.Data.StorageNode,
"bucket": upReq.Data.Bucket,
"fileId": upReq.Data.FileId,
"fileSize": file.GetSize(),
"isMultipart": isMultipart,
"key": upReq.Data.Key,
"uploadId": upReq.Data.UploadId,
}
_, err := d.request(UploadCompleteV2, http.MethodPost, func(req *resty.Request) {
req.SetBody(data).SetContext(ctx)
}, nil)
return err
}
func (d *Pan123) newUpload(ctx context.Context, upReq *UploadResp, file model.FileStreamer, reader io.Reader, up driver.UpdateProgress) error {
chunkSize := int64(1024 * 1024 * 16)
// fetch s3 pre signed urls
chunkCount := int(math.Ceil(float64(file.GetSize()) / float64(chunkSize)))
// only 1 batch is allowed
isMultipart := chunkCount > 1
batchSize := 1
getS3UploadUrl := d.getS3Auth
if isMultipart {
batchSize = 10
getS3UploadUrl = d.getS3PreSignedUrls
}
for i := 1; i <= chunkCount; i += batchSize {
if utils.IsCanceled(ctx) {
return ctx.Err()
}
start := i
end := i + batchSize
if end > chunkCount+1 {
end = chunkCount + 1
}
s3PreSignedUrls, err := getS3UploadUrl(ctx, upReq, start, end)
if err != nil {
return err
}
// upload each chunk
for j := start; j < end; j++ {
if utils.IsCanceled(ctx) {
return ctx.Err()
}
curSize := chunkSize
if j == chunkCount {
curSize = file.GetSize() - (int64(chunkCount)-1)*chunkSize
}
err = d.uploadS3Chunk(ctx, upReq, s3PreSignedUrls, j, end, io.LimitReader(reader, chunkSize), curSize, false, getS3UploadUrl)
if err != nil {
return err
}
up(j * 100 / chunkCount)
}
}
// complete s3 upload
return d.completeS3(ctx, upReq, file, chunkCount > 1)
}
func (d *Pan123) uploadS3Chunk(ctx context.Context, upReq *UploadResp, s3PreSignedUrls *S3PreSignedURLs, cur, end int, reader io.Reader, curSize int64, retry bool, getS3UploadUrl func(ctx context.Context, upReq *UploadResp, start int, end int) (*S3PreSignedURLs, error)) error {
uploadUrl := s3PreSignedUrls.Data.PreSignedUrls[strconv.Itoa(cur)]
if uploadUrl == "" {
return fmt.Errorf("upload url is empty, s3PreSignedUrls: %+v", s3PreSignedUrls)
}
req, err := http.NewRequest("PUT", uploadUrl, reader)
if err != nil {
return err
}
req = req.WithContext(ctx)
req.ContentLength = curSize
//req.Header.Set("Content-Length", strconv.FormatInt(curSize, 10))
res, err := base.HttpClient.Do(req)
if err != nil {
return err
}
defer res.Body.Close()
if res.StatusCode == http.StatusForbidden {
if retry {
return fmt.Errorf("upload s3 chunk %d failed, status code: %d", cur, res.StatusCode)
}
// refresh s3 pre signed urls
newS3PreSignedUrls, err := getS3UploadUrl(ctx, upReq, cur, end)
if err != nil {
return err
}
s3PreSignedUrls.Data.PreSignedUrls = newS3PreSignedUrls.Data.PreSignedUrls
// retry
return d.uploadS3Chunk(ctx, upReq, s3PreSignedUrls, cur, end, reader, curSize, true, getS3UploadUrl)
}
if res.StatusCode != http.StatusOK {
body, err := io.ReadAll(res.Body)
if err != nil {
return err
}
return fmt.Errorf("upload s3 chunk %d failed, status code: %d, body: %s", cur, res.StatusCode, body)
}
return nil
}
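
newUpload above cuts the file into 16 MiB chunks and fetches pre-signed URLs in batches: a single URL for one-chunk uploads, ten per request for multipart. A standalone sketch of that batching arithmetic with a hypothetical file size:

package main

import (
    "fmt"
    "math"
)

func main() {
    const chunkSize = int64(16 * 1024 * 1024) // 16 MiB, as in newUpload
    fileSize := int64(100 * 1024 * 1024)      // hypothetical 100 MiB file

    chunkCount := int(math.Ceil(float64(fileSize) / float64(chunkSize)))
    batchSize := 1
    if chunkCount > 1 {
        batchSize = 10
    }
    for start := 1; start <= chunkCount; start += batchSize {
        end := start + batchSize
        if end > chunkCount+1 {
            end = chunkCount + 1
        }
        fmt.Printf("fetch pre-signed URLs for parts [%d, %d)\n", start, end)
    }
    // The final chunk is whatever is left after the full-size chunks.
    lastChunk := fileSize - int64(chunkCount-1)*chunkSize
    fmt.Println("last chunk bytes:", lastChunk)
}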

159
drivers/123/util.go Normal file
View File

@@ -0,0 +1,159 @@
package _123
import (
"errors"
"fmt"
"net/http"
"strconv"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/go-resty/resty/v2"
jsoniter "github.com/json-iterator/go"
)
// do others that are not defined in the Driver interface
const (
Api = "https://www.123pan.com/api"
AApi = "https://www.123pan.com/a/api"
BApi = "https://www.123pan.com/b/api"
MainApi = Api
SignIn = MainApi + "/user/sign_in"
Logout = MainApi + "/user/logout"
UserInfo = MainApi + "/user/info"
FileList = MainApi + "/file/list/new"
DownloadInfo = MainApi + "/file/download_info"
Mkdir = MainApi + "/file/upload_request"
Move = MainApi + "/file/mod_pid"
Rename = MainApi + "/file/rename"
Trash = MainApi + "/file/trash"
UploadRequest = MainApi + "/file/upload_request"
UploadComplete = MainApi + "/file/upload_complete"
S3PreSignedUrls = MainApi + "/file/s3_repare_upload_parts_batch"
S3Auth = MainApi + "/file/s3_upload_object/auth"
UploadCompleteV2 = MainApi + "/file/upload_complete/v2"
S3Complete = MainApi + "/file/s3_complete_multipart_upload"
//AuthKeySalt = "8-8D$sL8gPjom7bk#cY"
)
func (d *Pan123) login() error {
var body base.Json
if utils.IsEmailFormat(d.Username) {
body = base.Json{
"mail": d.Username,
"password": d.Password,
"type": 2,
}
} else {
body = base.Json{
"passport": d.Username,
"password": d.Password,
"remember": true,
}
}
res, err := base.RestyClient.R().
SetHeaders(map[string]string{
"origin": "https://www.123pan.com",
"referer": "https://www.123pan.com/",
"user-agent": "Dart/2.19(dart:io)",
"platform": "android",
"app-version": "36",
//"user-agent": base.UserAgent,
}).
SetBody(body).Post(SignIn)
if err != nil {
return err
}
if utils.Json.Get(res.Body(), "code").ToInt() != 200 {
err = fmt.Errorf(utils.Json.Get(res.Body(), "message").ToString())
} else {
d.AccessToken = utils.Json.Get(res.Body(), "data", "token").ToString()
}
return err
}
//func authKey(reqUrl string) (*string, error) {
// reqURL, err := url.Parse(reqUrl)
// if err != nil {
// return nil, err
// }
//
// nowUnix := time.Now().Unix()
// random := rand.Intn(0x989680)
//
// p4 := fmt.Sprintf("%d|%d|%s|%s|%s|%s", nowUnix, random, reqURL.Path, "web", "3", AuthKeySalt)
// authKey := fmt.Sprintf("%d-%d-%x", nowUnix, random, md5.Sum([]byte(p4)))
// return &authKey, nil
//}
func (d *Pan123) request(url string, method string, callback base.ReqCallback, resp interface{}) ([]byte, error) {
req := base.RestyClient.R()
req.SetHeaders(map[string]string{
"origin": "https://www.123pan.com",
"referer": "https://www.123pan.com/",
"authorization": "Bearer " + d.AccessToken,
"user-agent": "Dart/2.19(dart:io)",
"platform": "android",
"app-version": "36",
//"user-agent": base.UserAgent,
})
if callback != nil {
callback(req)
}
if resp != nil {
req.SetResult(resp)
}
//authKey, err := authKey(url)
//if err != nil {
// return nil, err
//}
//req.SetQueryParam("auth-key", *authKey)
res, err := req.Execute(method, url)
if err != nil {
return nil, err
}
body := res.Body()
code := utils.Json.Get(body, "code").ToInt()
if code != 0 {
if code == 401 {
err := d.login()
if err != nil {
return nil, err
}
return d.request(url, method, callback, resp)
}
return nil, errors.New(jsoniter.Get(body, "message").ToString())
}
return body, nil
}
func (d *Pan123) getFiles(parentId string) ([]File, error) {
page := 1
res := make([]File, 0)
for {
var resp Files
query := map[string]string{
"driveId": "0",
"limit": "100",
"next": "0",
"orderBy": d.OrderBy,
"orderDirection": d.OrderDirection,
"parentFileId": parentId,
"trashed": "false",
"Page": strconv.Itoa(page),
}
_, err := d.request(FileList, http.MethodGet, func(req *resty.Request) {
req.SetQueryParams(query)
}, &resp)
if err != nil {
return nil, err
}
page++
res = append(res, resp.Data.InfoList...)
if len(resp.Data.InfoList) == 0 || resp.Data.Next == "-1" {
break
}
}
return res, nil
}
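
request above replays the call once a 401 code comes back: it runs login to refresh the token and then recurses with the same arguments. A standalone sketch of that refresh-and-retry flow with stand-in types (not the real resty client):

package main

import (
    "errors"
    "fmt"
)

type client struct {
    token string
}

func (c *client) login() error {
    c.token = "fresh-token"
    return nil
}

// doRequest stands in for the HTTP round trip and returns the API code field.
func (c *client) doRequest(url string) (int, error) {
    if c.token == "" {
        return 401, nil // token missing or expired
    }
    return 0, nil // 0 means success in this API
}

func (c *client) request(url string) error {
    code, err := c.doRequest(url)
    if err != nil {
        return err
    }
    if code == 401 {
        if err := c.login(); err != nil {
            return err
        }
        return c.request(url) // replay with the refreshed token
    }
    if code != 0 {
        return errors.New("api error")
    }
    return nil
}

func main() {
    c := &client{}
    fmt.Println(c.request("https://example.com/api")) // <nil>
}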

149
drivers/123_share/driver.go Normal file
View File

@@ -0,0 +1,149 @@
package _123Share
import (
"context"
"encoding/base64"
"fmt"
"net/http"
"net/url"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/errs"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/go-resty/resty/v2"
log "github.com/sirupsen/logrus"
)
type Pan123Share struct {
model.Storage
Addition
}
func (d *Pan123Share) Config() driver.Config {
return config
}
func (d *Pan123Share) GetAddition() driver.Additional {
return &d.Addition
}
func (d *Pan123Share) Init(ctx context.Context) error {
// TODO login / refresh token
//op.MustSaveDriverStorage(d)
return nil
}
func (d *Pan123Share) Drop(ctx context.Context) error {
return nil
}
func (d *Pan123Share) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
// TODO return the files list, required
files, err := d.getFiles(dir.GetID())
if err != nil {
return nil, err
}
return utils.SliceConvert(files, func(src File) (model.Obj, error) {
return src, nil
})
}
func (d *Pan123Share) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
// TODO return link of file, required
if f, ok := file.(File); ok {
//var resp DownResp
var headers map[string]string
if !utils.IsLocalIPAddr(args.IP) {
headers = map[string]string{
//"X-Real-IP": "1.1.1.1",
"X-Forwarded-For": args.IP,
}
}
data := base.Json{
"shareKey": d.ShareKey,
"SharePwd": d.SharePwd,
"etag": f.Etag,
"fileId": f.FileId,
"s3keyFlag": f.S3KeyFlag,
"size": f.Size,
}
resp, err := d.request(DownloadInfo, http.MethodPost, func(req *resty.Request) {
req.SetBody(data).SetHeaders(headers)
}, nil)
if err != nil {
return nil, err
}
downloadUrl := utils.Json.Get(resp, "data", "DownloadURL").ToString()
u, err := url.Parse(downloadUrl)
if err != nil {
return nil, err
}
nu := u.Query().Get("params")
if nu != "" {
du, _ := base64.StdEncoding.DecodeString(nu)
u, err = url.Parse(string(du))
if err != nil {
return nil, err
}
}
u_ := u.String()
log.Debug("download url: ", u_)
res, err := base.NoRedirectClient.R().SetHeader("Referer", "https://www.123pan.com/").Get(u_)
if err != nil {
return nil, err
}
log.Debug(res.String())
link := model.Link{
URL: u_,
}
log.Debugln("res code: ", res.StatusCode())
if res.StatusCode() == 302 {
link.URL = res.Header().Get("location")
} else if res.StatusCode() < 300 {
link.URL = utils.Json.Get(res.Body(), "data", "redirect_url").ToString()
}
link.Header = http.Header{
"Referer": []string{"https://www.123pan.com/"},
}
return &link, nil
}
return nil, fmt.Errorf("can't convert obj")
}
func (d *Pan123Share) MakeDir(ctx context.Context, parentDir model.Obj, dirName string) error {
// TODO create folder, optional
return errs.NotSupport
}
func (d *Pan123Share) Move(ctx context.Context, srcObj, dstDir model.Obj) error {
// TODO move obj, optional
return errs.NotSupport
}
func (d *Pan123Share) Rename(ctx context.Context, srcObj model.Obj, newName string) error {
// TODO rename obj, optional
return errs.NotSupport
}
func (d *Pan123Share) Copy(ctx context.Context, srcObj, dstDir model.Obj) error {
// TODO copy obj, optional
return errs.NotSupport
}
func (d *Pan123Share) Remove(ctx context.Context, obj model.Obj) error {
// TODO remove obj, optional
return errs.NotSupport
}
func (d *Pan123Share) Put(ctx context.Context, dstDir model.Obj, stream model.FileStreamer, up driver.UpdateProgress) error {
// TODO upload file, optional
return errs.NotSupport
}
//func (d *Pan123Share) Other(ctx context.Context, args model.OtherArgs) (interface{}, error) {
// return nil, errs.NotSupport
//}
var _ driver.Driver = (*Pan123Share)(nil)
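
Link above checks whether the returned download URL carries a base64-encoded params query value; if it does, the decoded value is itself the real URL to follow. A standalone sketch of that unwrap step with a fabricated URL:

package main

import (
    "encoding/base64"
    "fmt"
    "net/url"
)

func main() {
    // Hypothetical inner URL wrapped the way the download gateway does it.
    inner := "https://download.example.com/file?sign=abc123"
    wrapped := "https://gateway.example.com/redirect?params=" +
        url.QueryEscape(base64.StdEncoding.EncodeToString([]byte(inner)))

    u, err := url.Parse(wrapped)
    if err != nil {
        panic(err)
    }
    if p := u.Query().Get("params"); p != "" {
        if decoded, err := base64.StdEncoding.DecodeString(p); err == nil {
            u, _ = url.Parse(string(decoded))
        }
    }
    fmt.Println(u.String()) // prints the inner URL again
}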

34
drivers/123_share/meta.go Normal file
View File

@@ -0,0 +1,34 @@
package _123Share
import (
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/op"
)
type Addition struct {
ShareKey string `json:"sharekey" required:"true"`
SharePwd string `json:"sharepassword" required:"true"`
driver.RootID
OrderBy string `json:"order_by" type:"select" options:"file_name,size,update_at" default:"file_name"`
OrderDirection string `json:"order_direction" type:"select" options:"asc,desc" default:"asc"`
}
var config = driver.Config{
Name: "123PanShare",
LocalSort: true,
OnlyLocal: false,
OnlyProxy: false,
NoCache: false,
NoUpload: true,
NeedMs: false,
DefaultRoot: "0",
CheckStatus: false,
Alert: "",
NoOverwriteUpload: false,
}
func init() {
op.RegisterDriver(func() driver.Driver {
return &Pan123Share{}
})
}

View File

@@ -0,0 +1,91 @@
package _123Share
import (
"net/url"
"path"
"strconv"
"strings"
"time"
"github.com/alist-org/alist/v3/internal/model"
)
type File struct {
FileName string `json:"FileName"`
Size int64 `json:"Size"`
UpdateAt time.Time `json:"UpdateAt"`
FileId int64 `json:"FileId"`
Type int `json:"Type"`
Etag string `json:"Etag"`
S3KeyFlag string `json:"S3KeyFlag"`
DownloadUrl string `json:"DownloadUrl"`
}
func (f File) GetPath() string {
return ""
}
func (f File) GetSize() int64 {
return f.Size
}
func (f File) GetName() string {
return f.FileName
}
func (f File) ModTime() time.Time {
return f.UpdateAt
}
func (f File) IsDir() bool {
return f.Type == 1
}
func (f File) GetID() string {
return strconv.FormatInt(f.FileId, 10)
}
func (f File) Thumb() string {
if f.DownloadUrl == "" {
return ""
}
du, err := url.Parse(f.DownloadUrl)
if err != nil {
return ""
}
du.Path = strings.TrimSuffix(du.Path, "_24_24") + "_70_70"
query := du.Query()
query.Set("w", "70")
query.Set("h", "70")
if !query.Has("type") {
query.Set("type", strings.TrimPrefix(path.Base(f.FileName), "."))
}
if !query.Has("trade_key") {
query.Set("trade_key", "123pan-thumbnail")
}
du.RawQuery = query.Encode()
return du.String()
}
var _ model.Obj = (*File)(nil)
var _ model.Thumb = (*File)(nil)
//func (f File) Thumb() string {
//
//}
//var _ model.Thumb = (*File)(nil)
type Files struct {
//BaseResp
Data struct {
InfoList []File `json:"InfoList"`
Next string `json:"Next"`
} `json:"data"`
}
//type DownResp struct {
// //BaseResp
// Data struct {
// DownloadUrl string `json:"DownloadUrl"`
// } `json:"data"`
//}

81
drivers/123_share/util.go Normal file
View File

@@ -0,0 +1,81 @@
package _123Share
import (
"errors"
"net/http"
"strconv"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/go-resty/resty/v2"
jsoniter "github.com/json-iterator/go"
)
const (
Api = "https://www.123pan.com/api"
AApi = "https://www.123pan.com/a/api"
BApi = "https://www.123pan.com/b/api"
MainApi = Api
FileList = MainApi + "/share/get"
DownloadInfo = MainApi + "/share/download/info"
//AuthKeySalt = "8-8D$sL8gPjom7bk#cY"
)
func (d *Pan123Share) request(url string, method string, callback base.ReqCallback, resp interface{}) ([]byte, error) {
req := base.RestyClient.R()
req.SetHeaders(map[string]string{
"origin": "https://www.123pan.com",
"referer": "https://www.123pan.com/",
"user-agent": "Dart/2.19(dart:io)",
"platform": "android",
"app-version": "36",
})
if callback != nil {
callback(req)
}
if resp != nil {
req.SetResult(resp)
}
res, err := req.Execute(method, url)
if err != nil {
return nil, err
}
body := res.Body()
code := utils.Json.Get(body, "code").ToInt()
if code != 0 {
return nil, errors.New(jsoniter.Get(body, "message").ToString())
}
return body, nil
}
func (d *Pan123Share) getFiles(parentId string) ([]File, error) {
page := 1
res := make([]File, 0)
for {
var resp Files
query := map[string]string{
"limit": "100",
"next": "0",
"orderBy": d.OrderBy,
"orderDirection": d.OrderDirection,
"parentFileId": parentId,
"Page": strconv.Itoa(page),
"shareKey": d.ShareKey,
"SharePwd": d.SharePwd,
}
_, err := d.request(FileList, http.MethodGet, func(req *resty.Request) {
req.SetQueryParams(query)
}, &resp)
if err != nil {
return nil, err
}
page++
res = append(res, resp.Data.InfoList...)
if len(resp.Data.InfoList) == 0 || resp.Data.Next == "-1" {
break
}
}
return res, nil
}
// do others that are not defined in the Driver interface
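
getFiles above walks the share listing by bumping a Page counter each round and stops when a page comes back empty or Next is "-1". A standalone sketch of that loop shape with a fake fetch function (not the real share/get endpoint):

package main

import (
    "fmt"
    "strconv"
)

// fakeFetch stands in for the share/get request; it returns one page of items
// plus the Next marker ("-1" once everything has been listed).
func fakeFetch(page int) ([]string, string) {
    if page > 2 {
        return nil, "-1"
    }
    return []string{"item-" + strconv.Itoa(page)}, strconv.Itoa(page + 1)
}

func main() {
    page := 1
    var all []string
    for {
        items, next := fakeFetch(page)
        page++
        all = append(all, items...)
        if len(items) == 0 || next == "-1" {
            break
        }
    }
    fmt.Println(all) // [item-1 item-2]
}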

347
drivers/139/driver.go Normal file
View File

@@ -0,0 +1,347 @@
package _139
import (
"context"
"encoding/base64"
"fmt"
"io"
"net/http"
"strconv"
"strings"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/errs"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
log "github.com/sirupsen/logrus"
)
type Yun139 struct {
model.Storage
Addition
Account string
}
func (d *Yun139) Config() driver.Config {
return config
}
func (d *Yun139) GetAddition() driver.Additional {
return &d.Addition
}
func (d *Yun139) Init(ctx context.Context) error {
if d.Authorization == "" {
return fmt.Errorf("authorization is empty")
}
decode, err := base64.StdEncoding.DecodeString(d.Authorization)
if err != nil {
return err
}
decodeStr := string(decode)
splits := strings.Split(decodeStr, ":")
if len(splits) < 2 {
return fmt.Errorf("authorization is invalid, splits < 2")
}
d.Account = splits[1]
_, err = d.post("/orchestration/personalCloud/user/v1.0/qryUserExternInfo", base.Json{
"qryUserExternInfoReq": base.Json{
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
},
}, nil)
return err
}
func (d *Yun139) Drop(ctx context.Context) error {
return nil
}
func (d *Yun139) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
if d.isFamily() {
return d.familyGetFiles(dir.GetID())
} else {
return d.getFiles(dir.GetID())
}
}
func (d *Yun139) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
u, err := d.getLink(file.GetID())
if err != nil {
return nil, err
}
return &model.Link{URL: u}, nil
}
func (d *Yun139) MakeDir(ctx context.Context, parentDir model.Obj, dirName string) error {
data := base.Json{
"createCatalogExtReq": base.Json{
"parentCatalogID": parentDir.GetID(),
"newCatalogName": dirName,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
},
}
pathname := "/orchestration/personalCloud/catalog/v1.0/createCatalogExt"
if d.isFamily() {
data = base.Json{
"cloudID": d.CloudID,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
"docLibName": dirName,
}
pathname = "/orchestration/familyCloud/cloudCatalog/v1.0/createCloudDoc"
}
_, err := d.post(pathname, data, nil)
return err
}
func (d *Yun139) Move(ctx context.Context, srcObj, dstDir model.Obj) error {
if d.isFamily() {
return errs.NotImplement
}
var contentInfoList []string
var catalogInfoList []string
if srcObj.IsDir() {
catalogInfoList = append(catalogInfoList, srcObj.GetID())
} else {
contentInfoList = append(contentInfoList, srcObj.GetID())
}
data := base.Json{
"createBatchOprTaskReq": base.Json{
"taskType": 3,
"actionType": "304",
"taskInfo": base.Json{
"contentInfoList": contentInfoList,
"catalogInfoList": catalogInfoList,
"newCatalogID": dstDir.GetID(),
},
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
},
}
pathname := "/orchestration/personalCloud/batchOprTask/v1.0/createBatchOprTask"
_, err := d.post(pathname, data, nil)
return err
}
func (d *Yun139) Rename(ctx context.Context, srcObj model.Obj, newName string) error {
if d.isFamily() {
return errs.NotImplement
}
var data base.Json
var pathname string
if srcObj.IsDir() {
data = base.Json{
"catalogID": srcObj.GetID(),
"catalogName": newName,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
}
pathname = "/orchestration/personalCloud/catalog/v1.0/updateCatalogInfo"
} else {
data = base.Json{
"contentID": srcObj.GetID(),
"contentName": newName,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
}
pathname = "/orchestration/personalCloud/content/v1.0/updateContentInfo"
}
_, err := d.post(pathname, data, nil)
return err
}
func (d *Yun139) Copy(ctx context.Context, srcObj, dstDir model.Obj) error {
if d.isFamily() {
return errs.NotImplement
}
var contentInfoList []string
var catalogInfoList []string
if srcObj.IsDir() {
catalogInfoList = append(catalogInfoList, srcObj.GetID())
} else {
contentInfoList = append(contentInfoList, srcObj.GetID())
}
data := base.Json{
"createBatchOprTaskReq": base.Json{
"taskType": 3,
"actionType": 309,
"taskInfo": base.Json{
"contentInfoList": contentInfoList,
"catalogInfoList": catalogInfoList,
"newCatalogID": dstDir.GetID(),
},
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
},
}
pathname := "/orchestration/personalCloud/batchOprTask/v1.0/createBatchOprTask"
_, err := d.post(pathname, data, nil)
return err
}
func (d *Yun139) Remove(ctx context.Context, obj model.Obj) error {
var contentInfoList []string
var catalogInfoList []string
if obj.IsDir() {
catalogInfoList = append(catalogInfoList, obj.GetID())
} else {
contentInfoList = append(contentInfoList, obj.GetID())
}
data := base.Json{
"createBatchOprTaskReq": base.Json{
"taskType": 2,
"actionType": 201,
"taskInfo": base.Json{
"newCatalogID": "",
"contentInfoList": contentInfoList,
"catalogInfoList": catalogInfoList,
},
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
},
}
pathname := "/orchestration/personalCloud/batchOprTask/v1.0/createBatchOprTask"
if d.isFamily() {
data = base.Json{
"catalogList": catalogInfoList,
"contentList": contentInfoList,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
"sourceCatalogType": 1002,
"taskType": 2,
}
pathname = "/orchestration/familyCloud/batchOprTask/v1.0/createBatchOprTask"
}
_, err := d.post(pathname, data, nil)
return err
}
const (
_ = iota //ignore first value by assigning to blank identifier
KB = 1 << (10 * iota)
MB
GB
TB
)
func getPartSize(size int64) int64 {
// the cloud drive caps the number of upload parts
if size/GB > 30 {
return 512 * MB
}
return 100 * MB
}
func (d *Yun139) Put(ctx context.Context, dstDir model.Obj, stream model.FileStreamer, up driver.UpdateProgress) error {
data := base.Json{
"manualRename": 2,
"operation": 0,
"fileCount": 1,
"totalSize": 0, // 去除上传大小限制
"uploadContentList": []base.Json{{
"contentName": stream.GetName(),
"contentSize": 0, // 去除上传大小限制
// "digest": "5a3231986ce7a6b46e408612d385bafa"
}},
"parentCatalogID": dstDir.GetID(),
"newCatalogName": "",
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
}
pathname := "/orchestration/personalCloud/uploadAndDownload/v1.0/pcUploadFileRequest"
if d.isFamily() {
data = d.newJson(base.Json{
"fileCount": 1,
"manualRename": 2,
"operation": 0,
"path": "",
"seqNo": "",
"totalSize": 0,
"uploadContentList": []base.Json{{
"contentName": stream.GetName(),
"contentSize": 0,
// "digest": "5a3231986ce7a6b46e408612d385bafa"
}},
})
pathname = "/orchestration/familyCloud/content/v1.0/getFileUploadURL"
return errs.NotImplement
}
var resp UploadResp
_, err := d.post(pathname, data, &resp)
if err != nil {
return err
}
// Progress
p := driver.NewProgress(stream.GetSize(), up)
var partSize = getPartSize(stream.GetSize())
part := (stream.GetSize() + partSize - 1) / partSize
if part == 0 {
part = 1
}
for i := int64(0); i < part; i++ {
if utils.IsCanceled(ctx) {
return ctx.Err()
}
start := i * partSize
byteSize := stream.GetSize() - start
if byteSize > partSize {
byteSize = partSize
}
limitReader := io.LimitReader(stream, byteSize)
// Update Progress
r := io.TeeReader(limitReader, p)
req, err := http.NewRequest("POST", resp.Data.UploadResult.RedirectionURL, r)
if err != nil {
return err
}
req = req.WithContext(ctx)
req.Header.Set("Content-Type", "text/plain;name="+unicode(stream.GetName()))
req.Header.Set("contentSize", strconv.FormatInt(stream.GetSize(), 10))
req.Header.Set("range", fmt.Sprintf("bytes=%d-%d", start, start+byteSize-1))
req.Header.Set("uploadtaskID", resp.Data.UploadResult.UploadTaskID)
req.Header.Set("rangeType", "0")
req.ContentLength = byteSize
res, err := base.HttpClient.Do(req)
if err != nil {
return err
}
_ = res.Body.Close()
log.Debugf("%+v", res)
if res.StatusCode != http.StatusOK {
return fmt.Errorf("unexpected status code: %d", res.StatusCode)
}
}
return nil
}
var _ driver.Driver = (*Yun139)(nil)
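
Put above streams the file in fixed-size parts: each iteration wraps the stream in io.LimitReader and labels the request with a bytes=start-end range header. A standalone sketch of that slicing loop with a small in-memory payload:

package main

import (
    "fmt"
    "io"
    "strings"
)

func main() {
    payload := strings.NewReader(strings.Repeat("x", 2500))
    size := int64(payload.Len())
    partSize := int64(1000) // getPartSize picks 100 MB or 512 MB in the driver

    parts := (size + partSize - 1) / partSize
    for i := int64(0); i < parts; i++ {
        start := i * partSize
        byteSize := size - start
        if byteSize > partSize {
            byteSize = partSize
        }
        chunk := io.LimitReader(payload, byteSize)
        n, _ := io.Copy(io.Discard, chunk) // stands in for the HTTP request body
        fmt.Printf("range: bytes=%d-%d (sent %d bytes)\n", start, start+byteSize-1, n)
    }
}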

25
drivers/139/meta.go Normal file
View File

@@ -0,0 +1,25 @@
package _139
import (
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/op"
)
type Addition struct {
//Account string `json:"account" required:"true"`
Authorization string `json:"authorization" type:"text" required:"true"`
driver.RootID
Type string `json:"type" type:"select" options:"personal,family" default:"personal"`
CloudID string `json:"cloud_id"`
}
var config = driver.Config{
Name: "139Yun",
LocalSort: true,
}
func init() {
op.RegisterDriver(func() driver.Driver {
return &Yun139{}
})
}

187
drivers/139/types.go Normal file
View File

@@ -0,0 +1,187 @@
package _139
type BaseResp struct {
Success bool `json:"success"`
Code string `json:"code"`
Message string `json:"message"`
}
type Catalog struct {
CatalogID string `json:"catalogID"`
CatalogName string `json:"catalogName"`
//CatalogType int `json:"catalogType"`
//CreateTime string `json:"createTime"`
UpdateTime string `json:"updateTime"`
//IsShared bool `json:"isShared"`
//CatalogLevel int `json:"catalogLevel"`
//ShareDoneeCount int `json:"shareDoneeCount"`
//OpenType int `json:"openType"`
//ParentCatalogID string `json:"parentCatalogId"`
//DirEtag int `json:"dirEtag"`
//Tombstoned int `json:"tombstoned"`
//ProxyID interface{} `json:"proxyID"`
//Moved int `json:"moved"`
//IsFixedDir int `json:"isFixedDir"`
//IsSynced interface{} `json:"isSynced"`
//Owner string `json:"owner"`
//Modifier interface{} `json:"modifier"`
//Path string `json:"path"`
//ShareType int `json:"shareType"`
//SoftLink interface{} `json:"softLink"`
//ExtProp1 interface{} `json:"extProp1"`
//ExtProp2 interface{} `json:"extProp2"`
//ExtProp3 interface{} `json:"extProp3"`
//ExtProp4 interface{} `json:"extProp4"`
//ExtProp5 interface{} `json:"extProp5"`
//ETagOprType int `json:"ETagOprType"`
}
type Content struct {
ContentID string `json:"contentID"`
ContentName string `json:"contentName"`
//ContentSuffix string `json:"contentSuffix"`
ContentSize int64 `json:"contentSize"`
//ContentDesc string `json:"contentDesc"`
//ContentType int `json:"contentType"`
//ContentOrigin int `json:"contentOrigin"`
UpdateTime string `json:"updateTime"`
//CommentCount int `json:"commentCount"`
ThumbnailURL string `json:"thumbnailURL"`
//BigthumbnailURL string `json:"bigthumbnailURL"`
//PresentURL string `json:"presentURL"`
//PresentLURL string `json:"presentLURL"`
//PresentHURL string `json:"presentHURL"`
//ContentTAGList interface{} `json:"contentTAGList"`
//ShareDoneeCount int `json:"shareDoneeCount"`
//Safestate int `json:"safestate"`
//Transferstate int `json:"transferstate"`
//IsFocusContent int `json:"isFocusContent"`
//UpdateShareTime interface{} `json:"updateShareTime"`
//UploadTime string `json:"uploadTime"`
//OpenType int `json:"openType"`
//AuditResult int `json:"auditResult"`
//ParentCatalogID string `json:"parentCatalogId"`
//Channel string `json:"channel"`
//GeoLocFlag string `json:"geoLocFlag"`
//Digest string `json:"digest"`
//Version string `json:"version"`
//FileEtag string `json:"fileEtag"`
//FileVersion string `json:"fileVersion"`
//Tombstoned int `json:"tombstoned"`
//ProxyID string `json:"proxyID"`
//Moved int `json:"moved"`
//MidthumbnailURL string `json:"midthumbnailURL"`
//Owner string `json:"owner"`
//Modifier string `json:"modifier"`
//ShareType int `json:"shareType"`
//ExtInfo struct {
// Uploader string `json:"uploader"`
// Address string `json:"address"`
//} `json:"extInfo"`
//Exif struct {
// CreateTime string `json:"createTime"`
// Longitude interface{} `json:"longitude"`
// Latitude interface{} `json:"latitude"`
// LocalSaveTime interface{} `json:"localSaveTime"`
//} `json:"exif"`
//CollectionFlag interface{} `json:"collectionFlag"`
//TreeInfo interface{} `json:"treeInfo"`
//IsShared bool `json:"isShared"`
//ETagOprType int `json:"ETagOprType"`
}
type GetDiskResp struct {
BaseResp
Data struct {
Result struct {
ResultCode string `json:"resultCode"`
ResultDesc interface{} `json:"resultDesc"`
} `json:"result"`
GetDiskResult struct {
ParentCatalogID string `json:"parentCatalogID"`
NodeCount int `json:"nodeCount"`
CatalogList []Catalog `json:"catalogList"`
ContentList []Content `json:"contentList"`
IsCompleted int `json:"isCompleted"`
} `json:"getDiskResult"`
} `json:"data"`
}
type UploadResp struct {
BaseResp
Data struct {
Result struct {
ResultCode string `json:"resultCode"`
ResultDesc interface{} `json:"resultDesc"`
} `json:"result"`
UploadResult struct {
UploadTaskID string `json:"uploadTaskID"`
RedirectionURL string `json:"redirectionUrl"`
NewContentIDList []struct {
ContentID string `json:"contentID"`
ContentName string `json:"contentName"`
IsNeedUpload string `json:"isNeedUpload"`
FileEtag int64 `json:"fileEtag"`
FileVersion int64 `json:"fileVersion"`
OverridenFlag int `json:"overridenFlag"`
} `json:"newContentIDList"`
CatalogIDList interface{} `json:"catalogIDList"`
IsSlice interface{} `json:"isSlice"`
} `json:"uploadResult"`
} `json:"data"`
}
type CloudContent struct {
ContentID string `json:"contentID"`
//Modifier string `json:"modifier"`
//Nickname string `json:"nickname"`
//CloudNickName string `json:"cloudNickName"`
ContentName string `json:"contentName"`
//ContentType int `json:"contentType"`
//ContentSuffix string `json:"contentSuffix"`
ContentSize int64 `json:"contentSize"`
//ContentDesc string `json:"contentDesc"`
//CreateTime string `json:"createTime"`
//Shottime interface{} `json:"shottime"`
LastUpdateTime string `json:"lastUpdateTime"`
ThumbnailURL string `json:"thumbnailURL"`
//MidthumbnailURL string `json:"midthumbnailURL"`
//BigthumbnailURL string `json:"bigthumbnailURL"`
//PresentURL string `json:"presentURL"`
//PresentLURL string `json:"presentLURL"`
//PresentHURL string `json:"presentHURL"`
//ParentCatalogID string `json:"parentCatalogID"`
//Uploader string `json:"uploader"`
//UploaderNickName string `json:"uploaderNickName"`
//TreeInfo interface{} `json:"treeInfo"`
//UpdateTime interface{} `json:"updateTime"`
//ExtInfo struct {
// Uploader string `json:"uploader"`
//} `json:"extInfo"`
//EtagOprType interface{} `json:"etagOprType"`
}
type CloudCatalog struct {
CatalogID string `json:"catalogID"`
CatalogName string `json:"catalogName"`
//CloudID string `json:"cloudID"`
//CreateTime string `json:"createTime"`
LastUpdateTime string `json:"lastUpdateTime"`
//Creator string `json:"creator"`
//CreatorNickname string `json:"creatorNickname"`
}
type QueryContentListResp struct {
BaseResp
Data struct {
Result struct {
ResultCode string `json:"resultCode"`
ResultDesc string `json:"resultDesc"`
} `json:"result"`
Path string `json:"path"`
CloudContentList []CloudContent `json:"cloudContentList"`
CloudCatalogList []CloudCatalog `json:"cloudCatalogList"`
TotalCount int `json:"totalCount"`
RecallContent interface{} `json:"recallContent"`
} `json:"data"`
}

250
drivers/139/util.go Normal file
View File

@@ -0,0 +1,250 @@
package _139
import (
"encoding/base64"
"errors"
"fmt"
"net/http"
"net/url"
"sort"
"strconv"
"strings"
"time"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/alist-org/alist/v3/pkg/utils/random"
"github.com/go-resty/resty/v2"
jsoniter "github.com/json-iterator/go"
log "github.com/sirupsen/logrus"
)
// do others that are not defined in the Driver interface
func (d *Yun139) isFamily() bool {
return d.Type == "family"
}
func encodeURIComponent(str string) string {
r := url.QueryEscape(str)
r = strings.Replace(r, "+", "%20", -1)
r = strings.Replace(r, "%21", "!", -1)
r = strings.Replace(r, "%27", "'", -1)
r = strings.Replace(r, "%28", "(", -1)
r = strings.Replace(r, "%29", ")", -1)
r = strings.Replace(r, "%2A", "*", -1)
return r
}
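// calSign computes the mcloud-sign value: the URI-encoded body has its characters sorted and base64-encoded,
// then md5(encodedBody) + md5(ts+":"+randStr) is hashed once more with MD5 and returned upper-cased.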
func calSign(body, ts, randStr string) string {
body = encodeURIComponent(body)
strs := strings.Split(body, "")
sort.Strings(strs)
body = strings.Join(strs, "")
body = base64.StdEncoding.EncodeToString([]byte(body))
res := utils.GetMD5EncodeStr(body) + utils.GetMD5EncodeStr(ts+":"+randStr)
res = strings.ToUpper(utils.GetMD5EncodeStr(res))
return res
}
func getTime(t string) time.Time {
stamp, _ := time.ParseInLocation("20060102150405", t, time.Local)
return stamp
}
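// request sends a signed JSON request to yun.139.com: the callback fills the request body, calSign produces the
// mcloud-sign header, and the response is checked against BaseResp before being unmarshalled into resp.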
func (d *Yun139) request(pathname string, method string, callback base.ReqCallback, resp interface{}) ([]byte, error) {
url := "https://yun.139.com" + pathname
req := base.RestyClient.R()
randStr := random.String(16)
ts := time.Now().Format("2006-01-02 15:04:05")
if callback != nil {
callback(req)
}
body, err := utils.Json.Marshal(req.Body)
if err != nil {
return nil, err
}
sign := calSign(string(body), ts, randStr)
svcType := "1"
if d.isFamily() {
svcType = "2"
}
req.SetHeaders(map[string]string{
"Accept": "application/json, text/plain, */*",
"CMS-DEVICE": "default",
"Authorization": "Basic " + d.Authorization,
"mcloud-channel": "1000101",
"mcloud-client": "10701",
//"mcloud-route": "001",
"mcloud-sign": fmt.Sprintf("%s,%s,%s", ts, randStr, sign),
//"mcloud-skey":"",
"mcloud-version": "6.6.0",
"Origin": "https://yun.139.com",
"Referer": "https://yun.139.com/w/",
"x-DeviceInfo": "||9|6.6.0|chrome|95.0.4638.69|uwIy75obnsRPIwlJSd7D9GhUvFwG96ce||macos 10.15.2||zh-CN|||",
"x-huawei-channelSrc": "10000034",
"x-inner-ntwk": "2",
"x-m4c-caller": "PC",
"x-m4c-src": "10002",
"x-SvcType": svcType,
})
var e BaseResp
req.SetResult(&e)
res, err := req.Execute(method, url)
if err != nil {
return nil, err
}
log.Debugln(res.String())
if !e.Success {
return nil, errors.New(e.Message)
}
if resp != nil {
err = utils.Json.Unmarshal(res.Body(), resp)
if err != nil {
return nil, err
}
}
return res.Body(), nil
}
func (d *Yun139) post(pathname string, data interface{}, resp interface{}) ([]byte, error) {
return d.request(pathname, http.MethodPost, func(req *resty.Request) {
req.SetBody(data)
}, resp)
}
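// getFiles pages the personal-cloud getDisk API 100 entries at a time, mapping catalogs to folders and
// contents to files until NodeCount is exhausted.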
func (d *Yun139) getFiles(catalogID string) ([]model.Obj, error) {
start := 0
limit := 100
files := make([]model.Obj, 0)
for {
data := base.Json{
"catalogID": catalogID,
"sortDirection": 1,
"startNumber": start + 1,
"endNumber": start + limit,
"filterType": 0,
"catalogSortType": 0,
"contentSortType": 0,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
}
var resp GetDiskResp
_, err := d.post("/orchestration/personalCloud/catalog/v1.0/getDisk", data, &resp)
if err != nil {
return nil, err
}
for _, catalog := range resp.Data.GetDiskResult.CatalogList {
f := model.Object{
ID: catalog.CatalogID,
Name: catalog.CatalogName,
Size: 0,
Modified: getTime(catalog.UpdateTime),
IsFolder: true,
}
files = append(files, &f)
}
for _, content := range resp.Data.GetDiskResult.ContentList {
f := model.ObjThumb{
Object: model.Object{
ID: content.ContentID,
Name: content.ContentName,
Size: content.ContentSize,
Modified: getTime(content.UpdateTime),
},
Thumbnail: model.Thumbnail{Thumbnail: content.ThumbnailURL},
//Thumbnail: content.BigthumbnailURL,
}
files = append(files, &f)
}
if start+limit >= resp.Data.GetDiskResult.NodeCount {
break
}
start += limit
}
return files, nil
}
func (d *Yun139) newJson(data map[string]interface{}) base.Json {
common := map[string]interface{}{
"catalogType": 3,
"cloudID": d.CloudID,
"cloudType": 1,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
}
return utils.MergeMap(data, common)
}
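// familyGetFiles pages the family-cloud queryContentList API (pageSize 100) until TotalCount is covered.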
func (d *Yun139) familyGetFiles(catalogID string) ([]model.Obj, error) {
pageNum := 1
files := make([]model.Obj, 0)
for {
data := d.newJson(base.Json{
"catalogID": catalogID,
"contentSortType": 0,
"pageInfo": base.Json{
"pageNum": pageNum,
"pageSize": 100,
},
"sortDirection": 1,
})
var resp QueryContentListResp
_, err := d.post("/orchestration/familyCloud/content/v1.0/queryContentList", data, &resp)
if err != nil {
return nil, err
}
for _, catalog := range resp.Data.CloudCatalogList {
f := model.Object{
ID: catalog.CatalogID,
Name: catalog.CatalogName,
Size: 0,
IsFolder: true,
Modified: getTime(catalog.LastUpdateTime),
}
files = append(files, &f)
}
for _, content := range resp.Data.CloudContentList {
f := model.ObjThumb{
Object: model.Object{
ID: content.ContentID,
Name: content.ContentName,
Size: content.ContentSize,
Modified: getTime(content.LastUpdateTime),
},
Thumbnail: model.Thumbnail{Thumbnail: content.ThumbnailURL},
//Thumbnail: content.BigthumbnailURL,
}
files = append(files, &f)
}
if 100*pageNum > resp.Data.TotalCount {
break
}
pageNum++
}
return files, nil
}
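// getLink requests a download URL for the given content ID and extracts data.downloadURL from the response.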
func (d *Yun139) getLink(contentId string) (string, error) {
data := base.Json{
"appName": "",
"contentID": contentId,
"commonAccountInfo": base.Json{
"account": d.Account,
"accountType": 1,
},
}
res, err := d.post("/orchestration/personalCloud/uploadAndDownload/v1.0/downloadRequest",
data, nil)
if err != nil {
return "", err
}
return jsoniter.Get(res, "data", "downloadURL").ToString(), nil
}
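// unicode escapes non-ASCII characters in str as \uXXXX sequences (the ASCII-quoted form without the surrounding quotes).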
func unicode(str string) string {
textQuoted := strconv.QuoteToASCII(str)
textUnquoted := textQuoted[1 : len(textQuoted)-1]
return textUnquoted
}


@ -1,491 +0,0 @@
package _89
import (
"crypto/aes"
"crypto/hmac"
"crypto/md5"
"crypto/rand"
"crypto/rsa"
"crypto/sha1"
"crypto/x509"
"encoding/base64"
"encoding/hex"
"encoding/json"
"encoding/pem"
"errors"
"fmt"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/drivers/base"
"github.com/Xhofe/alist/model"
"github.com/Xhofe/alist/utils"
"github.com/go-resty/resty/v2"
jsoniter "github.com/json-iterator/go"
log "github.com/sirupsen/logrus"
mathRand "math/rand"
"net/url"
"path/filepath"
"regexp"
"strconv"
"strings"
"time"
)
var client189Map map[string]*resty.Client
func (driver Cloud189) FormatFile(file *Cloud189File) *model.File {
f := &model.File{
Id: strconv.FormatInt(file.Id, 10),
Name: file.Name,
Size: file.Size,
Driver: driver.Config().Name,
UpdatedAt: nil,
Thumbnail: file.Icon.SmallUrl,
Url: file.Url,
}
loc, _ := time.LoadLocation("Local")
lastOpTime, err := time.ParseInLocation("2006-01-02 15:04:05", file.LastOpTime, loc)
if err == nil {
f.UpdatedAt = &lastOpTime
}
if file.Size == -1 {
f.Type = conf.FOLDER
f.Size = 0
} else {
f.Type = utils.GetFileType(filepath.Ext(file.Name))
}
return f
}
//func (c Cloud189) GetFile(path string, account *model.Account) (*Cloud189File, error) {
// dir, name := filepath.Split(path)
// dir = utils.ParsePath(dir)
// _, _, err := c.ParentPath(dir, account)
// if err != nil {
// return nil, err
// }
// parentFiles_, _ := conf.Cache.Get(conf.Ctx, fmt.Sprintf("%s%s", account.Name, dir))
// parentFiles, _ := parentFiles_.([]Cloud189File)
// for _, file := range parentFiles {
// if file.Name == name {
// if file.Size != -1 {
// return &file, err
// } else {
// return nil, ErrNotFile
// }
// }
// }
// return nil, ErrPathNotFound
//}
type Cloud189Down struct {
ResCode int `json:"res_code"`
ResMessage string `json:"res_message"`
FileDownloadUrl string `json:"fileDownloadUrl"`
}
type LoginResp struct {
Msg string `json:"msg"`
Result int `json:"result"`
ToUrl string `json:"toUrl"`
}
// Login refers to PanIndex
func (driver Cloud189) Login(account *model.Account) error {
client, ok := client189Map[account.Name]
if !ok {
//cookieJar, _ := cookiejar.New(&cookiejar.Options{PublicSuffixList: publicsuffix.List})
client = resty.New()
//client.SetCookieJar(cookieJar)
client.SetRetryCount(3)
client.SetHeader("Referer", "https://cloud.189.cn/")
}
url := "https://cloud.189.cn/api/portal/loginUrl.action?redirectURL=https%3A%2F%2Fcloud.189.cn%2Fmain.action"
b := ""
lt := ""
ltText := regexp.MustCompile(`lt = "(.+?)"`)
for i := 0; i < 3; i++ {
res, err := client.R().Get(url)
if err != nil {
return err
}
b = res.String()
ltTextArr := ltText.FindStringSubmatch(b)
if len(ltTextArr) > 0 {
lt = ltTextArr[1]
break
} else {
<-time.After(time.Second)
}
}
if lt == "" {
return fmt.Errorf("get empty login page")
}
captchaToken := regexp.MustCompile(`captchaToken' value='(.+?)'`).FindStringSubmatch(b)[1]
returnUrl := regexp.MustCompile(`returnUrl = '(.+?)'`).FindStringSubmatch(b)[1]
paramId := regexp.MustCompile(`paramId = "(.+?)"`).FindStringSubmatch(b)[1]
//reqId := regexp.MustCompile(`reqId = "(.+?)"`).FindStringSubmatch(b)[1]
jRsakey := regexp.MustCompile(`j_rsaKey" value="(\S+)"`).FindStringSubmatch(b)[1]
vCodeID := regexp.MustCompile(`picCaptcha\.do\?token\=([A-Za-z0-9\&\=]+)`).FindStringSubmatch(b)[1]
vCodeRS := ""
if vCodeID != "" {
// need ValidateCode
}
userRsa := RsaEncode([]byte(account.Username), jRsakey)
passwordRsa := RsaEncode([]byte(account.Password), jRsakey)
url = "https://open.e.189.cn/api/logbox/oauth2/loginSubmit.do"
var loginResp LoginResp
res, err := client.R().
SetHeaders(map[string]string{
"lt": lt,
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36",
"Referer": "https://open.e.189.cn/",
"accept": "application/json;charset=UTF-8",
}).SetFormData(map[string]string{
"appKey": "cloud",
"accountType": "01",
"userName": "{RSA}" + userRsa,
"password": "{RSA}" + passwordRsa,
"validateCode": vCodeRS,
"captchaToken": captchaToken,
"returnUrl": returnUrl,
"mailSuffix": "@pan.cn",
"paramId": paramId,
"clientType": "10010",
"dynamicCheck": "FALSE",
"cb_SaveName": "1",
"isOauth2": "false",
}).Post(url)
if err != nil {
return err
}
err = json.Unmarshal(res.Body(), &loginResp)
if err != nil {
log.Error(err.Error())
return err
}
if loginResp.Result != 0 {
return fmt.Errorf(loginResp.Msg)
}
_, err = client.R().Get(loginResp.ToUrl)
if err != nil {
log.Errorf(err.Error())
return err
}
client189Map[account.Name] = client
return nil
}
type Cloud189Error struct {
ErrorCode string `json:"errorCode"`
ErrorMsg string `json:"errorMsg"`
}
type Cloud189File struct {
Id int64 `json:"id"`
LastOpTime string `json:"lastOpTime"`
Name string `json:"name"`
Size int64 `json:"size"`
Icon struct {
SmallUrl string `json:"smallUrl"`
//LargeUrl string `json:"largeUrl"`
} `json:"icon"`
Url string `json:"url"`
}
type Cloud189Folder struct {
Id int64 `json:"id"`
LastOpTime string `json:"lastOpTime"`
Name string `json:"name"`
}
type Cloud189Files struct {
ResCode int `json:"res_code"`
ResMessage string `json:"res_message"`
FileListAO struct {
Count int `json:"count"`
FileList []Cloud189File `json:"fileList"`
FolderList []Cloud189Folder `json:"folderList"`
} `json:"fileListAO"`
}
func (driver Cloud189) GetFiles(fileId string, account *model.Account) ([]Cloud189File, error) {
client, ok := client189Map[account.Name]
if !ok {
return nil, fmt.Errorf("can't find [%s] client", account.Name)
}
res := make([]Cloud189File, 0)
pageNum := 1
for {
var e Cloud189Error
var resp Cloud189Files
_, err := client.R().SetResult(&resp).SetError(&e).
SetHeader("Accept", "application/json;charset=UTF-8").
SetQueryParams(map[string]string{
"noCache": random(),
"pageSize": "60",
"pageNum": strconv.Itoa(pageNum),
"mediaType": "0",
"folderId": fileId,
"iconOption": "5",
"orderBy": account.OrderBy,
"descending": account.OrderDirection,
}).Get("https://cloud.189.cn/api/open/file/listFiles.action")
if err != nil {
return nil, err
}
if e.ErrorCode != "" {
if e.ErrorCode == "InvalidSessionKey" {
err = driver.Login(account)
if err != nil {
return nil, err
}
return driver.GetFiles(fileId, account)
}
}
if resp.ResCode != 0 {
return nil, fmt.Errorf(resp.ResMessage)
}
if resp.FileListAO.Count == 0 {
break
}
for _, folder := range resp.FileListAO.FolderList {
res = append(res, Cloud189File{
Id: folder.Id,
LastOpTime: folder.LastOpTime,
Name: folder.Name,
Size: -1,
})
}
res = append(res, resp.FileListAO.FileList...)
pageNum++
}
return res, nil
}
func (driver Cloud189) Request(url string, method string, form map[string]string, headers map[string]string, account *model.Account) ([]byte, error) {
client, ok := client189Map[account.Name]
if !ok {
return nil, fmt.Errorf("can't find [%s] client", account.Name)
}
//var resp base.Json
var e Cloud189Error
req := client.R().SetError(&e).
SetHeader("Accept", "application/json;charset=UTF-8").
SetQueryParams(map[string]string{
"noCache": random(),
})
if form != nil {
req = req.SetFormData(form)
}
if headers != nil {
req = req.SetHeaders(headers)
}
var err error
var res *resty.Response
if strings.ToUpper(method) == "GET" {
res, err = req.Get(url)
} else {
res, err = req.Post(url)
}
if err != nil {
return nil, err
}
if e.ErrorCode != "" {
if e.ErrorCode == "InvalidSessionKey" {
err = driver.Login(account)
if err != nil {
return nil, err
}
return driver.Request(url, method, form, nil, account)
}
}
//log.Debug(res, jsoniter.Get(res.Body(),"res_code").ToInt())
if jsoniter.Get(res.Body(), "res_code").ToInt() != 0 {
err = errors.New(jsoniter.Get(res.Body(), "res_message").ToString())
}
return res.Body(), err
}
func (driver Cloud189) GetSessionKey(account *model.Account) (string, error) {
resp, err := driver.Request("https://cloud.189.cn/v2/getUserBriefInfo.action", "GET", nil, nil, account)
if err != nil {
return "", err
}
return jsoniter.Get(resp, "sessionKey").ToString(), nil
}
func (driver Cloud189) GetResKey(account *model.Account) (string, string, error) {
resp, err := driver.Request("https://cloud.189.cn/api/security/generateRsaKey.action", "GET", nil, nil, account)
if err != nil {
return "", "", err
}
return jsoniter.Get(resp, "pubKey").ToString(), jsoniter.Get(resp, "pkId").ToString(), nil
}
func (driver Cloud189) UploadRequest(url string, form map[string]string, account *model.Account) ([]byte, error) {
sessionKey, err := driver.GetSessionKey(account)
if err != nil {
return nil, err
}
pubKey, pkId, err := driver.GetResKey(account)
if err != nil {
return nil, err
}
xRId := "e007e99a-370c-4a14-a143-1b1541972fcf"
pkey := strings.ReplaceAll(xRId, "-", "")
params := aesEncrypt(qs(form), pkey[:16])
date := strconv.FormatInt(time.Now().Unix(), 10)
signature := hmacSha1(fmt.Sprintf("SessionKey=%s&Operate=GET&RequestURI=%s&Date=%s&params=%s", sessionKey, url, date, params), pkey)
encryptionText := RsaEncode([]byte(pkey), pubKey)
res, err := base.RestyClient.R().SetHeaders(map[string]string{
"signature": signature,
"sessionKey": sessionKey,
"encryptionText": encryptionText,
"pkId": pkId,
"x-request-id": xRId,
"x-request-date": date,
"origin": "https://cloud.189.cn",
"referer": "https://cloud.189.cn/",
}).SetQueryParam("params", params).Get("https://upload.cloud.189.cn" + url)
if err != nil {
return nil, err
}
log.Debug(res.String())
data := res.Body()
if jsoniter.Get(data, "code").ToString() != "SUCCESS" {
return nil, errors.New(jsoniter.Get(data, "msg").ToString())
}
return data, nil
}
func random() string {
return fmt.Sprintf("0.%17v", mathRand.New(mathRand.NewSource(time.Now().UnixNano())).Int63n(100000000000000000))
}
func RsaEncode(origData []byte, j_rsakey string) string {
publicKey := []byte("-----BEGIN PUBLIC KEY-----\n" + j_rsakey + "\n-----END PUBLIC KEY-----")
block, _ := pem.Decode(publicKey)
pubInterface, _ := x509.ParsePKIXPublicKey(block.Bytes)
pub := pubInterface.(*rsa.PublicKey)
b, err := rsa.EncryptPKCS1v15(rand.Reader, pub, origData)
if err != nil {
log.Errorf("err: %s", err.Error())
}
return b64tohex(base64.StdEncoding.EncodeToString(b))
}
var b64map = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
var BI_RM = "0123456789abcdefghijklmnopqrstuvwxyz"
func int2char(a int) string {
return strings.Split(BI_RM, "")[a]
}
func b64tohex(a string) string {
d := ""
e := 0
c := 0
for i := 0; i < len(a); i++ {
m := strings.Split(a, "")[i]
if m != "=" {
v := strings.Index(b64map, m)
if 0 == e {
e = 1
d += int2char(v >> 2)
c = 3 & v
} else if 1 == e {
e = 2
d += int2char(c<<2 | v>>4)
c = 15 & v
} else if 2 == e {
e = 3
d += int2char(c)
d += int2char(v >> 2)
c = 3 & v
} else {
e = 0
d += int2char(c<<2 | v>>4)
d += int2char(15 & v)
}
}
}
if e == 1 {
d += int2char(c << 2)
}
return d
}
func qs(form map[string]string) string {
strList := make([]string, 0)
for k, v := range form {
strList = append(strList, fmt.Sprintf("%s=%s", k, url.QueryEscape(v)))
}
return strings.Join(strList, "&")
}
func aesEncrypt(data, key string) string {
encrypted := AesEncryptECB([]byte(data), []byte(key))
//return string(encrypted)
return hex.EncodeToString(encrypted)
}
func hmacSha1(data string, secret string) string {
h := hmac.New(sha1.New, []byte(secret))
h.Write([]byte(data))
return hex.EncodeToString(h.Sum(nil))
}
func AesEncryptECB(origData []byte, key []byte) (encrypted []byte) {
cipher, _ := aes.NewCipher(generateKey(key))
length := (len(origData) + aes.BlockSize) / aes.BlockSize
plain := make([]byte, length*aes.BlockSize)
copy(plain, origData)
pad := byte(len(plain) - len(origData))
for i := len(origData); i < len(plain); i++ {
plain[i] = pad
}
encrypted = make([]byte, len(plain))
// encrypt the data block by block (ECB mode)
for bs, be := 0, cipher.BlockSize(); bs <= len(origData); bs, be = bs+cipher.BlockSize(), be+cipher.BlockSize() {
cipher.Encrypt(encrypted[bs:be], plain[bs:be])
}
return encrypted
}
func AesDecryptECB(encrypted []byte, key []byte) (decrypted []byte) {
cipher, _ := aes.NewCipher(generateKey(key))
decrypted = make([]byte, len(encrypted))
//
for bs, be := 0, cipher.BlockSize(); bs < len(encrypted); bs, be = bs+cipher.BlockSize(), be+cipher.BlockSize() {
cipher.Decrypt(decrypted[bs:be], encrypted[bs:be])
}
trim := 0
if len(decrypted) > 0 {
trim = len(decrypted) - int(decrypted[len(decrypted)-1])
}
return decrypted[:trim]
}
func generateKey(key []byte) (genKey []byte) {
genKey = make([]byte, 16)
copy(genKey, key)
for i := 16; i < len(key); {
for j := 0; j < 16 && i < len(key); j, i = j+1, i+1 {
genKey[j] ^= key[i]
}
}
return genKey
}
func getMd5(data []byte) []byte {
h := md5.New()
h.Write(data)
return h.Sum(nil)
}
func init() {
base.RegisterDriver(&Cloud189{})
client189Map = make(map[string]*resty.Client, 0)
}


@ -1,347 +1,181 @@
package _89
package _189
import (
"bytes"
"crypto/md5"
"encoding/base64"
"encoding/hex"
"encoding/json"
"fmt"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/drivers/base"
"github.com/Xhofe/alist/model"
"github.com/Xhofe/alist/utils"
"github.com/gin-gonic/gin"
jsoniter "github.com/json-iterator/go"
log "github.com/sirupsen/logrus"
"io"
"math"
"context"
"net/http"
"path/filepath"
"strconv"
"strings"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/go-resty/resty/v2"
log "github.com/sirupsen/logrus"
)
type Cloud189 struct{}
func (driver Cloud189) Config() base.DriverConfig {
return base.DriverConfig{
Name: "189Cloud",
}
type Cloud189 struct {
model.Storage
Addition
client *resty.Client
rsa Rsa
sessionKey string
}
func (driver Cloud189) Items() []base.Item {
return []base.Item{
{
Name: "username",
Label: "username",
Type: base.TypeString,
Required: true,
Description: "account username/phone number",
},
{
Name: "password",
Label: "password",
Type: base.TypeString,
Required: true,
Description: "account password",
},
{
Name: "root_folder",
Label: "root folder file_id",
Type: base.TypeString,
Required: true,
},
{
Name: "order_by",
Label: "order_by",
Type: base.TypeSelect,
Values: "name,size,lastOpTime,createdDate",
Required: true,
},
{
Name: "order_direction",
Label: "desc",
Type: base.TypeSelect,
Values: "true,false",
Required: true,
},
}
func (d *Cloud189) Config() driver.Config {
return config
}
func (driver Cloud189) Save(account *model.Account, old *model.Account) error {
if old != nil && old.Name != account.Name {
delete(client189Map, old.Name)
}
if err := driver.Login(account); err != nil {
account.Status = err.Error()
_ = model.SaveAccount(account)
return err
}
account.Status = "work"
err := model.SaveAccount(account)
if err != nil {
return err
}
func (d *Cloud189) GetAddition() driver.Additional {
return &d.Addition
}
func (d *Cloud189) Init(ctx context.Context) error {
d.client = base.NewRestyClient().
SetHeader("Referer", "https://cloud.189.cn/")
return d.newLogin()
}
func (d *Cloud189) Drop(ctx context.Context) error {
return nil
}
func (driver Cloud189) File(path string, account *model.Account) (*model.File, error) {
path = utils.ParsePath(path)
if path == "/" {
return &model.File{
Id: account.RootFolder,
Name: account.Name,
Size: 0,
Type: conf.FOLDER,
Driver: driver.Config().Name,
UpdatedAt: account.UpdatedAt,
}, nil
}
dir, name := filepath.Split(path)
files, err := driver.Files(dir, account)
if err != nil {
return nil, err
}
for _, file := range files {
if file.Name == name {
return &file, nil
}
}
return nil, base.ErrPathNotFound
func (d *Cloud189) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
return d.getFiles(dir.GetID())
}
func (driver Cloud189) Files(path string, account *model.Account) ([]model.File, error) {
path = utils.ParsePath(path)
var rawFiles []Cloud189File
cache, err := base.GetCache(path, account)
if err == nil {
rawFiles, _ = cache.([]Cloud189File)
} else {
file, err := driver.File(path, account)
if err != nil {
return nil, err
}
rawFiles, err = driver.GetFiles(file.Id, account)
if err != nil {
return nil, err
}
if len(rawFiles) > 0 {
_ = base.SetCache(path, rawFiles, account)
}
}
files := make([]model.File, 0)
for _, file := range rawFiles {
files = append(files, *driver.FormatFile(&file))
}
return files, nil
}
func (driver Cloud189) Link(args base.Args, account *model.Account) (*base.Link, error) {
file, err := driver.File(utils.ParsePath(args.Path), account)
func (d *Cloud189) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
var resp DownResp
u := "https://cloud.189.cn/api/portal/getFileInfo.action"
_, err := d.request(u, http.MethodGet, func(req *resty.Request) {
req.SetQueryParam("fileId", file.GetID())
}, &resp)
if err != nil {
return nil, err
}
if file.Type == conf.FOLDER {
return nil, base.ErrNotFile
}
client, ok := client189Map[account.Name]
if !ok {
return nil, fmt.Errorf("can't find [%s] client", account.Name)
}
var e Cloud189Error
var resp Cloud189Down
_, err = client.R().SetResult(&resp).SetError(&e).
SetHeader("Accept", "application/json;charset=UTF-8").
SetQueryParams(map[string]string{
"noCache": random(),
"fileId": file.Id,
}).Get("https://cloud.189.cn/api/open/file/getFileDownloadUrl.action")
client := resty.NewWithClient(d.client.GetClient()).SetRedirectPolicy(
resty.RedirectPolicyFunc(func(req *http.Request, via []*http.Request) error {
return http.ErrUseLastResponse
}))
res, err := client.R().SetHeader("User-Agent", base.UserAgent).Get("https:" + resp.FileDownloadUrl)
if err != nil {
return nil, err
}
if e.ErrorCode != "" {
if e.ErrorCode == "InvalidSessionKey" {
err = driver.Login(account)
if err != nil {
return nil, err
}
return driver.Link(args, account)
}
}
if resp.ResCode != 0 {
return nil, fmt.Errorf(resp.ResMessage)
}
res, err := base.NoRedirectClient.R().Get(resp.FileDownloadUrl)
if err != nil {
return nil, err
}
link := base.Link{}
log.Debugln(res.Status())
log.Debugln(res.String())
link := model.Link{}
log.Debugln("first url:", resp.FileDownloadUrl)
if res.StatusCode() == 302 {
link.Url = res.Header().Get("location")
link.URL = res.Header().Get("location")
log.Debugln("second url:", link.URL)
_, _ = client.R().Get(link.URL)
if res.StatusCode() == 302 {
link.URL = res.Header().Get("location")
}
log.Debugln("third url:", link.URL)
} else {
link.Url = resp.FileDownloadUrl
link.URL = resp.FileDownloadUrl
}
link.URL = strings.Replace(link.URL, "http://", "https://", 1)
return &link, nil
}
func (driver Cloud189) Path(path string, account *model.Account) (*model.File, []model.File, error) {
path = utils.ParsePath(path)
log.Debugf("189 path: %s", path)
file, err := driver.File(path, account)
if err != nil {
return nil, nil, err
}
if !file.IsDir() {
return file, nil, nil
}
files, err := driver.Files(path, account)
if err != nil {
return nil, nil, err
}
return nil, files, nil
}
func (driver Cloud189) Proxy(ctx *gin.Context, account *model.Account) {
ctx.Request.Header.Del("Origin")
}
func (driver Cloud189) Preview(path string, account *model.Account) (interface{}, error) {
return nil, base.ErrNotSupport
}
func (driver Cloud189) MakeDir(path string, account *model.Account) error {
dir, name := filepath.Split(path)
parent, err := driver.File(dir, account)
if err != nil {
return err
}
if !parent.IsDir() {
return base.ErrNotFolder
}
func (d *Cloud189) MakeDir(ctx context.Context, parentDir model.Obj, dirName string) error {
form := map[string]string{
"parentFolderId": parent.Id,
"folderName": name,
}
_, err = driver.Request("https://cloud.189.cn/api/open/file/createFolder.action", "POST", form, nil, account)
if err == nil {
_ = base.DeleteCache(dir, account)
"parentFolderId": parentDir.GetID(),
"folderName": dirName,
}
_, err := d.request("https://cloud.189.cn/api/open/file/createFolder.action", http.MethodPost, func(req *resty.Request) {
req.SetFormData(form)
}, nil)
return err
}
func (driver Cloud189) Move(src string, dst string, account *model.Account) error {
srcDir, _ := filepath.Split(src)
dstDir, dstName := filepath.Split(dst)
srcFile, err := driver.File(src, account)
if err != nil {
return err
}
// rename
if srcDir == dstDir {
url := "https://cloud.189.cn/api/open/file/renameFile.action"
idKey := "fileId"
nameKey := "destFileName"
if srcFile.IsDir() {
url = "https://cloud.189.cn/api/open/file/renameFolder.action"
idKey = "folderId"
nameKey = "destFolderName"
}
form := map[string]string{
idKey: srcFile.Id,
nameKey: dstName,
}
_, err = driver.Request(url, "POST", form, nil, account)
} else {
// move
dstDirFile, err := driver.File(dstDir, account)
if err != nil {
return err
}
isFolder := 0
if srcFile.IsDir() {
isFolder = 1
}
taskInfos := []base.Json{
{
"fileId": srcFile.Id,
"fileName": dstName,
"isFolder": isFolder,
},
}
taskInfosBytes, err := json.Marshal(taskInfos)
if err != nil {
return err
}
form := map[string]string{
"type": "MOVE",
"targetFolderId": dstDirFile.Id,
"taskInfos": string(taskInfosBytes),
}
_, err = driver.Request("https://cloud.189.cn/api/open/batch/createBatchTask.action", "POST", form, nil, account)
}
if err == nil {
_ = base.DeleteCache(srcDir, account)
_ = base.DeleteCache(dstDir, account)
}
return err
}
func (driver Cloud189) Copy(src string, dst string, account *model.Account) error {
dstDir, dstName := filepath.Split(dst)
srcFile, err := driver.File(src, account)
if err != nil {
return err
}
dstDirFile, err := driver.File(dstDir, account)
if err != nil {
return err
}
func (d *Cloud189) Move(ctx context.Context, srcObj, dstDir model.Obj) error {
isFolder := 0
if srcFile.IsDir() {
if srcObj.IsDir() {
isFolder = 1
}
taskInfos := []base.Json{
{
"fileId": srcFile.Id,
"fileName": dstName,
"fileId": srcObj.GetID(),
"fileName": srcObj.GetName(),
"isFolder": isFolder,
},
}
taskInfosBytes, err := json.Marshal(taskInfos)
taskInfosBytes, err := utils.Json.Marshal(taskInfos)
if err != nil {
return err
}
form := map[string]string{
"type": "MOVE",
"targetFolderId": dstDir.GetID(),
"taskInfos": string(taskInfosBytes),
}
_, err = d.request("https://cloud.189.cn/api/open/batch/createBatchTask.action", http.MethodPost, func(req *resty.Request) {
req.SetFormData(form)
}, nil)
return err
}
func (d *Cloud189) Rename(ctx context.Context, srcObj model.Obj, newName string) error {
url := "https://cloud.189.cn/api/open/file/renameFile.action"
idKey := "fileId"
nameKey := "destFileName"
if srcObj.IsDir() {
url = "https://cloud.189.cn/api/open/file/renameFolder.action"
idKey = "folderId"
nameKey = "destFolderName"
}
form := map[string]string{
idKey: srcObj.GetID(),
nameKey: newName,
}
_, err := d.request(url, http.MethodPost, func(req *resty.Request) {
req.SetFormData(form)
}, nil)
return err
}
func (d *Cloud189) Copy(ctx context.Context, srcObj, dstDir model.Obj) error {
isFolder := 0
if srcObj.IsDir() {
isFolder = 1
}
taskInfos := []base.Json{
{
"fileId": srcObj.GetID(),
"fileName": srcObj.GetName(),
"isFolder": isFolder,
},
}
taskInfosBytes, err := utils.Json.Marshal(taskInfos)
if err != nil {
return err
}
form := map[string]string{
"type": "COPY",
"targetFolderId": dstDirFile.Id,
"targetFolderId": dstDir.GetID(),
"taskInfos": string(taskInfosBytes),
}
_, err = driver.Request("https://cloud.189.cn/api/open/batch/createBatchTask.action", "POST", form, nil, account)
if err == nil {
_ = base.DeleteCache(dstDir, account)
}
_, err = d.request("https://cloud.189.cn/api/open/batch/createBatchTask.action", http.MethodPost, func(req *resty.Request) {
req.SetFormData(form)
}, nil)
return err
}
func (driver Cloud189) Delete(path string, account *model.Account) error {
path = utils.ParsePath(path)
file, err := driver.File(path, account)
if err != nil {
return err
}
func (d *Cloud189) Remove(ctx context.Context, obj model.Obj) error {
isFolder := 0
if file.IsDir() {
if obj.IsDir() {
isFolder = 1
}
taskInfos := []base.Json{
{
"fileId": file.Id,
"fileName": file.Name,
"fileId": obj.GetID(),
"fileName": obj.GetName(),
"isFolder": isFolder,
},
}
taskInfosBytes, err := json.Marshal(taskInfos)
taskInfosBytes, err := utils.Json.Marshal(taskInfos)
if err != nil {
return err
}
@ -350,93 +184,14 @@ func (driver Cloud189) Delete(path string, account *model.Account) error {
"targetFolderId": "",
"taskInfos": string(taskInfosBytes),
}
_, err = driver.Request("https://cloud.189.cn/api/open/batch/createBatchTask.action", "POST", form, nil, account)
if err == nil {
_ = base.DeleteCache(utils.Dir(path), account)
}
_, err = d.request("https://cloud.189.cn/api/open/batch/createBatchTask.action", http.MethodPost, func(req *resty.Request) {
req.SetFormData(form)
}, nil)
return err
}
// Upload Error: decrypt encryptionText failed
func (driver Cloud189) Upload(file *model.FileStream, account *model.Account) error {
return base.ErrNotImplement
const DEFAULT uint64 = 10485760
var count = int64(math.Ceil(float64(file.GetSize()) / float64(DEFAULT)))
var finish uint64 = 0
parentFile, err := driver.File(file.ParentPath, account)
if err != nil {
return err
}
if !parentFile.IsDir() {
return base.ErrNotFolder
}
res, err := driver.UploadRequest("/person/initMultiUpload", map[string]string{
"parentFolderId": parentFile.Id,
"fileName": file.Name,
"fileSize": strconv.FormatInt(int64(file.Size), 10),
"sliceSize": strconv.FormatInt(int64(DEFAULT), 10),
"lazyCheck": "1",
}, account)
if err != nil {
return err
}
uploadFileId := jsoniter.Get(res, "data.uploadFileId").ToString()
var i int64
var byteSize uint64
md5s := make([]string, 0)
md5Sum := md5.New()
for i = 1; i <= count; i++ {
byteSize = file.GetSize() - finish
if DEFAULT < byteSize {
byteSize = DEFAULT
}
log.Debugf("%d,%d", byteSize, finish)
byteData := make([]byte, byteSize)
n, err := io.ReadFull(file, byteData)
log.Debug(err, n)
if err != nil {
return err
}
finish += uint64(n)
md5Bytes := getMd5(byteData)
md5Str := hex.EncodeToString(md5Bytes)
md5Base64 := base64.StdEncoding.EncodeToString(md5Bytes)
md5s = append(md5s, md5Str)
md5Sum.Write(byteData)
res, err = driver.UploadRequest("/person/getMultiUploadUrls", map[string]string{
"partInfo": fmt.Sprintf("%s-%s", strconv.FormatInt(i, 10), md5Base64),
"uploadFileId": uploadFileId,
}, account)
if err != nil {
return err
}
uploadData := jsoniter.Get(res, "uploadUrls.partNumber_"+strconv.FormatInt(i, 10))
headers := strings.Split(uploadData.Get("requestHeader").ToString(), "&")
req, err := http.NewRequest("PUT", uploadData.Get("requestURL").ToString(), bytes.NewBuffer(byteData))
if err != nil {
return err
}
for _, header := range headers {
kv := strings.Split(header, "=")
req.Header.Set(kv[0], strings.Join(kv[1:], "="))
}
res, err := base.HttpClient.Do(req)
if err != nil {
return err
}
log.Debugf("%+v", res)
}
id := md5Sum.Sum(nil)
res, err = driver.UploadRequest("/person/commitMultiUploadFile", map[string]string{
"uploadFileId": uploadFileId,
"fileMd5": hex.EncodeToString(id),
"sliceMd5": utils.GetMD5Encode(strings.Join(md5s, "\n")),
"lazyCheck": "1",
}, account)
if err == nil {
_ = base.DeleteCache(file.ParentPath, account)
}
return err
func (d *Cloud189) Put(ctx context.Context, dstDir model.Obj, stream model.FileStreamer, up driver.UpdateProgress) error {
return d.newUpload(ctx, dstDir, stream, up)
}
var _ base.Driver = (*Cloud189)(nil)
var _ driver.Driver = (*Cloud189)(nil)

186
drivers/189/help.go Normal file

@ -0,0 +1,186 @@
package _189
import (
"bytes"
"crypto/aes"
"crypto/hmac"
"crypto/md5"
"crypto/rand"
"crypto/rsa"
"crypto/sha1"
"crypto/x509"
"encoding/base64"
"encoding/hex"
"encoding/pem"
"fmt"
"net/url"
"regexp"
"strconv"
"strings"
myrand "github.com/alist-org/alist/v3/pkg/utils/random"
log "github.com/sirupsen/logrus"
)
func random() string {
return fmt.Sprintf("0.%17v", myrand.Rand.Int63n(100000000000000000))
}
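// RsaEncode encrypts origData with RSA PKCS#1 v1.5 using the PEM-wrapped public key j_rsakey;
// the base64 ciphertext is returned as-is, or converted to hex via b64tohex when hex is true.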
func RsaEncode(origData []byte, j_rsakey string, hex bool) string {
publicKey := []byte("-----BEGIN PUBLIC KEY-----\n" + j_rsakey + "\n-----END PUBLIC KEY-----")
block, _ := pem.Decode(publicKey)
pubInterface, _ := x509.ParsePKIXPublicKey(block.Bytes)
pub := pubInterface.(*rsa.PublicKey)
b, err := rsa.EncryptPKCS1v15(rand.Reader, pub, origData)
if err != nil {
log.Errorf("err: %s", err.Error())
}
res := base64.StdEncoding.EncodeToString(b)
if hex {
return b64tohex(res)
}
return res
}
var b64map = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
var BI_RM = "0123456789abcdefghijklmnopqrstuvwxyz"
func int2char(a int) string {
return strings.Split(BI_RM, "")[a]
}
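// b64tohex converts a base64 string to a lowercase hex string by re-emitting each decoded 6-bit group as 4-bit nibbles.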
func b64tohex(a string) string {
d := ""
e := 0
c := 0
for i := 0; i < len(a); i++ {
m := strings.Split(a, "")[i]
if m != "=" {
v := strings.Index(b64map, m)
if 0 == e {
e = 1
d += int2char(v >> 2)
c = 3 & v
} else if 1 == e {
e = 2
d += int2char(c<<2 | v>>4)
c = 15 & v
} else if 2 == e {
e = 3
d += int2char(c)
d += int2char(v >> 2)
c = 3 & v
} else {
e = 0
d += int2char(c<<2 | v>>4)
d += int2char(15 & v)
}
}
}
if e == 1 {
d += int2char(c << 2)
}
return d
}
func qs(form map[string]string) string {
f := make(url.Values)
for k, v := range form {
f.Set(k, v)
}
return EncodeParam(f)
//strList := make([]string, 0)
//for k, v := range form {
// strList = append(strList, fmt.Sprintf("%s=%s", k, url.QueryEscape(v)))
//}
//return strings.Join(strList, "&")
}
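// EncodeParam serializes url.Values as key=value pairs joined by '&' without percent-encoding the values.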
func EncodeParam(v url.Values) string {
if v == nil {
return ""
}
var buf strings.Builder
keys := make([]string, 0, len(v))
for k := range v {
keys = append(keys, k)
}
for _, k := range keys {
vs := v[k]
for _, v := range vs {
if buf.Len() > 0 {
buf.WriteByte('&')
}
buf.WriteString(k)
buf.WriteByte('=')
//if k == "fileName" {
// buf.WriteString(encode(v))
//} else {
buf.WriteString(v)
//}
}
}
return buf.String()
}
func encode(str string) string {
//str = strings.ReplaceAll(str, "%", "%25")
//str = strings.ReplaceAll(str, "&", "%26")
//str = strings.ReplaceAll(str, "+", "%2B")
//return str
return url.QueryEscape(str)
}
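// AesEncrypt encrypts data with AES in ECB mode after PKCS#7 padding (block by block, no IV).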
func AesEncrypt(data, key []byte) []byte {
block, _ := aes.NewCipher(key)
if block == nil {
return []byte{}
}
data = PKCS7Padding(data, block.BlockSize())
decrypted := make([]byte, len(data))
size := block.BlockSize()
for bs, be := 0, size; bs < len(data); bs, be = bs+size, be+size {
block.Encrypt(decrypted[bs:be], data[bs:be])
}
return decrypted
}
func PKCS7Padding(ciphertext []byte, blockSize int) []byte {
padding := blockSize - len(ciphertext)%blockSize
padtext := bytes.Repeat([]byte{byte(padding)}, padding)
return append(ciphertext, padtext...)
}
func hmacSha1(data string, secret string) string {
h := hmac.New(sha1.New, []byte(secret))
h.Write([]byte(data))
return hex.EncodeToString(h.Sum(nil))
}
func getMd5(data []byte) []byte {
h := md5.New()
h.Write(data)
return h.Sum(nil)
}
func decodeURIComponent(str string) string {
r, _ := url.PathUnescape(str)
//r = strings.ReplaceAll(r, " ", "+")
return r
}
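// Random fills a UUID-style template: every 'x' becomes a random hex digit and every 'y' one of 8, 9, a or b.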
func Random(v string) string {
reg := regexp.MustCompilePOSIX("[xy]")
data := reg.ReplaceAllFunc([]byte(v), func(msg []byte) []byte {
var i int64
t := int64(16 * myrand.Rand.Float32())
if msg[0] == 120 {
i = t
} else {
i = 3&t | 8
}
return []byte(strconv.FormatInt(i, 16))
})
return string(data)
}

126
drivers/189/login.go Normal file

@ -0,0 +1,126 @@
package _189
import (
"errors"
"strconv"
"github.com/alist-org/alist/v3/pkg/utils"
log "github.com/sirupsen/logrus"
)
type AppConf struct {
Data struct {
AccountType string `json:"accountType"`
AgreementCheck string `json:"agreementCheck"`
AppKey string `json:"appKey"`
ClientType int `json:"clientType"`
IsOauth2 bool `json:"isOauth2"`
LoginSort string `json:"loginSort"`
MailSuffix string `json:"mailSuffix"`
PageKey string `json:"pageKey"`
ParamId string `json:"paramId"`
RegReturnUrl string `json:"regReturnUrl"`
ReqId string `json:"reqId"`
ReturnUrl string `json:"returnUrl"`
ShowFeedback string `json:"showFeedback"`
ShowPwSaveName string `json:"showPwSaveName"`
ShowQrSaveName string `json:"showQrSaveName"`
ShowSmsSaveName string `json:"showSmsSaveName"`
Sso string `json:"sso"`
} `json:"data"`
Msg string `json:"msg"`
Result string `json:"result"`
}
type EncryptConf struct {
Result int `json:"result"`
Data struct {
UpSmsOn string `json:"upSmsOn"`
Pre string `json:"pre"`
PreDomain string `json:"preDomain"`
PubKey string `json:"pubKey"`
} `json:"data"`
}
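// newLogin performs the web login flow: follow the portal redirect, fetch appConf and encryptConf,
// RSA-encrypt the credentials with the returned public key, and submit them to loginSubmit.do.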
func (d *Cloud189) newLogin() error {
url := "https://cloud.189.cn/api/portal/loginUrl.action?redirectURL=https%3A%2F%2Fcloud.189.cn%2Fmain.action"
res, err := d.client.R().Get(url)
if err != nil {
return err
}
// Is logged in
redirectURL := res.RawResponse.Request.URL
if redirectURL.String() == "https://cloud.189.cn/web/main" {
return nil
}
lt := redirectURL.Query().Get("lt")
reqId := redirectURL.Query().Get("reqId")
appId := redirectURL.Query().Get("appId")
headers := map[string]string{
"lt": lt,
"reqid": reqId,
"referer": redirectURL.String(),
"origin": "https://open.e.189.cn",
}
// get app Conf
var appConf AppConf
res, err = d.client.R().SetHeaders(headers).SetFormData(map[string]string{
"version": "2.0",
"appKey": appId,
}).SetResult(&appConf).Post("https://open.e.189.cn/api/logbox/oauth2/appConf.do")
if err != nil {
return err
}
log.Debugf("189 AppConf resp body: %s", res.String())
if appConf.Result != "0" {
return errors.New(appConf.Msg)
}
// get encrypt conf
var encryptConf EncryptConf
res, err = d.client.R().SetHeaders(headers).SetFormData(map[string]string{
"appId": appId,
}).Post("https://open.e.189.cn/api/logbox/config/encryptConf.do")
if err != nil {
return err
}
err = utils.Json.Unmarshal(res.Body(), &encryptConf)
if err != nil {
return err
}
log.Debugf("189 EncryptConf resp body: %s\n%+v", res.String(), encryptConf)
if encryptConf.Result != 0 {
return errors.New("get EncryptConf error:" + res.String())
}
// TODO: getUUID? needcaptcha
// login
loginData := map[string]string{
"version": "v2.0",
"apToken": "",
"appKey": appId,
"accountType": appConf.Data.AccountType,
"userName": encryptConf.Data.Pre + RsaEncode([]byte(d.Username), encryptConf.Data.PubKey, true),
"epd": encryptConf.Data.Pre + RsaEncode([]byte(d.Password), encryptConf.Data.PubKey, true),
"captchaType": "",
"validateCode": "",
"smsValidateCode": "",
"captchaToken": "",
"returnUrl": appConf.Data.ReturnUrl,
"mailSuffix": appConf.Data.MailSuffix,
"dynamicCheck": "FALSE",
"clientType": strconv.Itoa(appConf.Data.ClientType),
"cb_SaveName": "3",
"isOauth2": strconv.FormatBool(appConf.Data.IsOauth2),
"state": "",
"paramId": appConf.Data.ParamId,
}
res, err = d.client.R().SetHeaders(headers).SetFormData(loginData).Post("https://open.e.189.cn/api/logbox/oauth2/loginSubmit.do")
if err != nil {
return err
}
log.Debugf("189 login resp body: %s", res.String())
loginResult := utils.Json.Get(res.Body(), "result").ToInt()
if loginResult != 0 {
return errors.New(utils.Json.Get(res.Body(), "msg").ToString())
}
return nil
}

26
drivers/189/meta.go Normal file

@ -0,0 +1,26 @@
package _189
import (
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/op"
)
type Addition struct {
Username string `json:"username" required:"true"`
Password string `json:"password" required:"true"`
Cookie string `json:"cookie" help:"Fill in the cookie if need captcha"`
driver.RootID
}
var config = driver.Config{
Name: "189Cloud",
LocalSort: true,
DefaultRoot: "-11",
Alert: `info|You can try to use 189PC driver if this driver does not work.`,
}
func init() {
op.RegisterDriver(func() driver.Driver {
return &Cloud189{}
})
}

68
drivers/189/types.go Normal file

@ -0,0 +1,68 @@
package _189
type LoginResp struct {
Msg string `json:"msg"`
Result int `json:"result"`
ToUrl string `json:"toUrl"`
}
type Error struct {
ErrorCode string `json:"errorCode"`
ErrorMsg string `json:"errorMsg"`
}
type File struct {
Id int64 `json:"id"`
LastOpTime string `json:"lastOpTime"`
Name string `json:"name"`
Size int64 `json:"size"`
Icon struct {
SmallUrl string `json:"smallUrl"`
//LargeUrl string `json:"largeUrl"`
} `json:"icon"`
Url string `json:"url"`
}
type Folder struct {
Id int64 `json:"id"`
LastOpTime string `json:"lastOpTime"`
Name string `json:"name"`
}
type Files struct {
ResCode int `json:"res_code"`
ResMessage string `json:"res_message"`
FileListAO struct {
Count int `json:"count"`
FileList []File `json:"fileList"`
FolderList []Folder `json:"folderList"`
} `json:"fileListAO"`
}
type UploadUrlsResp struct {
Code string `json:"code"`
UploadUrls map[string]Part `json:"uploadUrls"`
}
type Part struct {
RequestURL string `json:"requestURL"`
RequestHeader string `json:"requestHeader"`
}
type Rsa struct {
Expire int64 `json:"expire"`
PkId string `json:"pkId"`
PubKey string `json:"pubKey"`
}
type Down struct {
ResCode int `json:"res_code"`
ResMessage string `json:"res_message"`
FileDownloadUrl string `json:"fileDownloadUrl"`
}
type DownResp struct {
ResCode int `json:"res_code"`
ResMessage string `json:"res_message"`
FileDownloadUrl string `json:"downloadUrl"`
}

398
drivers/189/util.go Normal file

@ -0,0 +1,398 @@
package _189
import (
"bytes"
"context"
"crypto/md5"
"encoding/base64"
"encoding/hex"
"errors"
"fmt"
"io"
"math"
"net/http"
"strconv"
"strings"
"time"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
myrand "github.com/alist-org/alist/v3/pkg/utils/random"
"github.com/go-resty/resty/v2"
jsoniter "github.com/json-iterator/go"
log "github.com/sirupsen/logrus"
)
// do others that are not defined in the Driver interface
//func (d *Cloud189) login() error {
// url := "https://cloud.189.cn/api/portal/loginUrl.action?redirectURL=https%3A%2F%2Fcloud.189.cn%2Fmain.action"
// b := ""
// lt := ""
// ltText := regexp.MustCompile(`lt = "(.+?)"`)
// var res *resty.Response
// var err error
// for i := 0; i < 3; i++ {
// res, err = d.client.R().Get(url)
// if err != nil {
// return err
// }
// already logged in
// if res.RawResponse.Request.URL.String() == "https://cloud.189.cn/web/main" {
// return nil
// }
// b = res.String()
// ltTextArr := ltText.FindStringSubmatch(b)
// if len(ltTextArr) > 0 {
// lt = ltTextArr[1]
// break
// } else {
// <-time.After(time.Second)
// }
// }
// if lt == "" {
// return fmt.Errorf("get page: %s \nstatus: %d \nrequest url: %s\nredirect url: %s",
// b, res.StatusCode(), res.RawResponse.Request.URL.String(), res.Header().Get("location"))
// }
// captchaToken := regexp.MustCompile(`captchaToken' value='(.+?)'`).FindStringSubmatch(b)[1]
// returnUrl := regexp.MustCompile(`returnUrl = '(.+?)'`).FindStringSubmatch(b)[1]
// paramId := regexp.MustCompile(`paramId = "(.+?)"`).FindStringSubmatch(b)[1]
// //reqId := regexp.MustCompile(`reqId = "(.+?)"`).FindStringSubmatch(b)[1]
// jRsakey := regexp.MustCompile(`j_rsaKey" value="(\S+)"`).FindStringSubmatch(b)[1]
// vCodeID := regexp.MustCompile(`picCaptcha\.do\?token\=([A-Za-z0-9\&\=]+)`).FindStringSubmatch(b)[1]
// vCodeRS := ""
// if vCodeID != "" {
// // need ValidateCode
// log.Debugf("try to identify verification codes")
// timeStamp := strconv.FormatInt(time.Now().UnixNano()/1e6, 10)
// u := "https://open.e.189.cn/api/logbox/oauth2/picCaptcha.do?token=" + vCodeID + timeStamp
// imgRes, err := d.client.R().SetHeaders(map[string]string{
// "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:74.0) Gecko/20100101 Firefox/76.0",
// "Referer": "https://open.e.189.cn/api/logbox/oauth2/unifyAccountLogin.do",
// "Sec-Fetch-Dest": "image",
// "Sec-Fetch-Mode": "no-cors",
// "Sec-Fetch-Site": "same-origin",
// }).Get(u)
// if err != nil {
// return err
// }
// // Enter the verification code manually
// //err = message.GetMessenger().WaitSend(message.Message{
// // Type: "image",
// // Content: "data:image/png;base64," + base64.StdEncoding.EncodeToString(imgRes.Body()),
// //}, 10)
// //if err != nil {
// // return err
// //}
// //vCodeRS, err = message.GetMessenger().WaitReceive(30)
// // use ocr api
// vRes, err := base.RestyClient.R().SetMultipartField(
// "image", "validateCode.png", "image/png", bytes.NewReader(imgRes.Body())).
// Post(setting.GetStr(conf.OcrApi))
// if err != nil {
// return err
// }
// if jsoniter.Get(vRes.Body(), "status").ToInt() != 200 {
// return errors.New("ocr error:" + jsoniter.Get(vRes.Body(), "msg").ToString())
// }
// vCodeRS = jsoniter.Get(vRes.Body(), "result").ToString()
// log.Debugln("code: ", vCodeRS)
// }
// userRsa := RsaEncode([]byte(d.Username), jRsakey, true)
// passwordRsa := RsaEncode([]byte(d.Password), jRsakey, true)
// url = "https://open.e.189.cn/api/logbox/oauth2/loginSubmit.do"
// var loginResp LoginResp
// res, err = d.client.R().
// SetHeaders(map[string]string{
// "lt": lt,
// "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36",
// "Referer": "https://open.e.189.cn/",
// "accept": "application/json;charset=UTF-8",
// }).SetFormData(map[string]string{
// "appKey": "cloud",
// "accountType": "01",
// "userName": "{RSA}" + userRsa,
// "password": "{RSA}" + passwordRsa,
// "validateCode": vCodeRS,
// "captchaToken": captchaToken,
// "returnUrl": returnUrl,
// "mailSuffix": "@pan.cn",
// "paramId": paramId,
// "clientType": "10010",
// "dynamicCheck": "FALSE",
// "cb_SaveName": "1",
// "isOauth2": "false",
// }).Post(url)
// if err != nil {
// return err
// }
// err = utils.Json.Unmarshal(res.Body(), &loginResp)
// if err != nil {
// log.Error(err.Error())
// return err
// }
// if loginResp.Result != 0 {
// return fmt.Errorf(loginResp.Msg)
// }
// _, err = d.client.R().Get(loginResp.ToUrl)
// return err
//}
func (d *Cloud189) request(url string, method string, callback base.ReqCallback, resp interface{}) ([]byte, error) {
var e Error
req := d.client.R().SetError(&e).
SetHeader("Accept", "application/json;charset=UTF-8").
SetQueryParams(map[string]string{
"noCache": random(),
})
if callback != nil {
callback(req)
}
if resp != nil {
req.SetResult(resp)
}
res, err := req.Execute(method, url)
if err != nil {
return nil, err
}
//log.Debug(res.String())
if e.ErrorCode != "" {
if e.ErrorCode == "InvalidSessionKey" {
err = d.newLogin()
if err != nil {
return nil, err
}
return d.request(url, method, callback, resp)
}
}
if jsoniter.Get(res.Body(), "res_code").ToInt() != 0 {
err = errors.New(jsoniter.Get(res.Body(), "res_message").ToString())
}
return res.Body(), err
}
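// getFiles pages listFiles.action 60 entries at a time, converting folders and files until an empty page is returned.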
func (d *Cloud189) getFiles(fileId string) ([]model.Obj, error) {
res := make([]model.Obj, 0)
pageNum := 1
for {
var resp Files
_, err := d.request("https://cloud.189.cn/api/open/file/listFiles.action", http.MethodGet, func(req *resty.Request) {
req.SetQueryParams(map[string]string{
//"noCache": random(),
"pageSize": "60",
"pageNum": strconv.Itoa(pageNum),
"mediaType": "0",
"folderId": fileId,
"iconOption": "5",
"orderBy": "lastOpTime", //account.OrderBy
"descending": "true", //account.OrderDirection
})
}, &resp)
if err != nil {
return nil, err
}
if resp.FileListAO.Count == 0 {
break
}
for _, folder := range resp.FileListAO.FolderList {
lastOpTime := utils.MustParseCNTime(folder.LastOpTime)
res = append(res, &model.Object{
ID: strconv.FormatInt(folder.Id, 10),
Name: folder.Name,
Modified: lastOpTime,
IsFolder: true,
})
}
for _, file := range resp.FileListAO.FileList {
lastOpTime := utils.MustParseCNTime(file.LastOpTime)
res = append(res, &model.ObjThumb{
Object: model.Object{
ID: strconv.FormatInt(file.Id, 10),
Name: file.Name,
Modified: lastOpTime,
Size: file.Size,
},
Thumbnail: model.Thumbnail{Thumbnail: file.Icon.SmallUrl},
})
}
pageNum++
}
return res, nil
}
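// oldUpload sends the whole file in a single multipart request to DCIWebUploadAction; an MD5 field in the response indicates success.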
func (d *Cloud189) oldUpload(dstDir model.Obj, file model.FileStreamer) error {
res, err := d.client.R().SetMultipartFormData(map[string]string{
"parentId": dstDir.GetID(),
"sessionKey": "??",
"opertype": "1",
"fname": file.GetName(),
}).SetMultipartField("Filedata", file.GetName(), file.GetMimetype(), file).Post("https://hb02.upload.cloud.189.cn/v1/DCIWebUploadAction")
if err != nil {
return err
}
if utils.Json.Get(res.Body(), "MD5").ToString() != "" {
return nil
}
log.Debugf(res.String())
return errors.New(res.String())
}
func (d *Cloud189) getSessionKey() (string, error) {
resp, err := d.request("https://cloud.189.cn/v2/getUserBriefInfo.action", http.MethodGet, nil, nil)
if err != nil {
return "", err
}
sessionKey := utils.Json.Get(resp, "sessionKey").ToString()
return sessionKey, nil
}
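// getResKey returns the cached RSA public key while it is still valid, otherwise fetches a fresh one from generateRsaKey.action and caches it.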
func (d *Cloud189) getResKey() (string, string, error) {
now := time.Now().UnixMilli()
if d.rsa.Expire > now {
return d.rsa.PubKey, d.rsa.PkId, nil
}
resp, err := d.request("https://cloud.189.cn/api/security/generateRsaKey.action", http.MethodGet, nil, nil)
if err != nil {
return "", "", err
}
pubKey, pkId := utils.Json.Get(resp, "pubKey").ToString(), utils.Json.Get(resp, "pkId").ToString()
d.rsa.PubKey, d.rsa.PkId = pubKey, pkId
d.rsa.Expire = utils.Json.Get(resp, "expire").ToInt64()
return pubKey, pkId, nil
}
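// uploadRequest signs an upload API call: the form query string is AES-ECB encrypted with the first 16 bytes of a
// random key, the request metadata is HMAC-SHA1 signed with that key, and the key itself is RSA-encrypted into the EncryptionText header.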
func (d *Cloud189) uploadRequest(uri string, form map[string]string, resp interface{}) ([]byte, error) {
c := strconv.FormatInt(time.Now().UnixMilli(), 10)
r := Random("xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx")
l := Random("xxxxxxxxxxxx4xxxyxxxxxxxxxxxxxxx")
l = l[0 : 16+int(16*myrand.Rand.Float32())]
e := qs(form)
data := AesEncrypt([]byte(e), []byte(l[0:16]))
h := hex.EncodeToString(data)
sessionKey := d.sessionKey
signature := hmacSha1(fmt.Sprintf("SessionKey=%s&Operate=GET&RequestURI=%s&Date=%s&params=%s", sessionKey, uri, c, h), l)
pubKey, pkId, err := d.getResKey()
if err != nil {
return nil, err
}
b := RsaEncode([]byte(l), pubKey, false)
req := d.client.R().SetHeaders(map[string]string{
"accept": "application/json;charset=UTF-8",
"SessionKey": sessionKey,
"Signature": signature,
"X-Request-Date": c,
"X-Request-ID": r,
"EncryptionText": b,
"PkId": pkId,
})
if resp != nil {
req.SetResult(resp)
}
res, err := req.Get("https://upload.cloud.189.cn" + uri + "?params=" + h)
if err != nil {
return nil, err
}
data = res.Body()
if utils.Json.Get(data, "code").ToString() != "SUCCESS" {
return nil, errors.New(uri + "---" + jsoniter.Get(data, "msg").ToString())
}
return data, nil
}
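// newUpload splits the stream into 10 MiB parts, PUTs each part to the pre-signed URL returned by getMultiUploadUrls,
// and finally commits the upload with the whole-file MD5 and the slice-MD5 digest.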
func (d *Cloud189) newUpload(ctx context.Context, dstDir model.Obj, file model.FileStreamer, up driver.UpdateProgress) error {
sessionKey, err := d.getSessionKey()
if err != nil {
return err
}
d.sessionKey = sessionKey
const DEFAULT int64 = 10485760
var count = int64(math.Ceil(float64(file.GetSize()) / float64(DEFAULT)))
res, err := d.uploadRequest("/person/initMultiUpload", map[string]string{
"parentFolderId": dstDir.GetID(),
"fileName": encode(file.GetName()),
"fileSize": strconv.FormatInt(file.GetSize(), 10),
"sliceSize": strconv.FormatInt(DEFAULT, 10),
"lazyCheck": "1",
}, nil)
if err != nil {
return err
}
uploadFileId := jsoniter.Get(res, "data", "uploadFileId").ToString()
//_, err = d.uploadRequest("/person/getUploadedPartsInfo", map[string]string{
// "uploadFileId": uploadFileId,
//}, nil)
var finish int64 = 0
var i int64
var byteSize int64
md5s := make([]string, 0)
md5Sum := md5.New()
for i = 1; i <= count; i++ {
if utils.IsCanceled(ctx) {
return ctx.Err()
}
byteSize = file.GetSize() - finish
if DEFAULT < byteSize {
byteSize = DEFAULT
}
//log.Debugf("%d,%d", byteSize, finish)
byteData := make([]byte, byteSize)
n, err := io.ReadFull(file, byteData)
//log.Debug(err, n)
if err != nil {
return err
}
finish += int64(n)
md5Bytes := getMd5(byteData)
md5Hex := hex.EncodeToString(md5Bytes)
md5Base64 := base64.StdEncoding.EncodeToString(md5Bytes)
md5s = append(md5s, strings.ToUpper(md5Hex))
md5Sum.Write(byteData)
var resp UploadUrlsResp
res, err = d.uploadRequest("/person/getMultiUploadUrls", map[string]string{
"partInfo": fmt.Sprintf("%s-%s", strconv.FormatInt(i, 10), md5Base64),
"uploadFileId": uploadFileId,
}, &resp)
if err != nil {
return err
}
uploadData := resp.UploadUrls["partNumber_"+strconv.FormatInt(i, 10)]
log.Debugf("uploadData: %+v", uploadData)
requestURL := uploadData.RequestURL
uploadHeaders := strings.Split(decodeURIComponent(uploadData.RequestHeader), "&")
req, err := http.NewRequest(http.MethodPut, requestURL, bytes.NewReader(byteData))
if err != nil {
return err
}
req = req.WithContext(ctx)
for _, v := range uploadHeaders {
i := strings.Index(v, "=")
req.Header.Set(v[0:i], v[i+1:])
}
r, err := base.HttpClient.Do(req)
if err != nil {
return err
}
log.Debugf("%+v %+v", r, r.Request.Header)
r.Body.Close()
up(int(i * 100 / count))
}
fileMd5 := hex.EncodeToString(md5Sum.Sum(nil))
sliceMd5 := fileMd5
if file.GetSize() > DEFAULT {
sliceMd5 = utils.GetMD5EncodeStr(strings.Join(md5s, "\n"))
}
res, err = d.uploadRequest("/person/commitMultiUploadFile", map[string]string{
"uploadFileId": uploadFileId,
"fileMd5": fileMd5,
"sliceMd5": sliceMd5,
"lazyCheck": "1",
"opertype": "3",
}, nil)
return err
}

309
drivers/189pc/driver.go Normal file

@ -0,0 +1,309 @@
package _189pc
import (
"context"
"net/http"
"strings"
"time"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/errs"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/go-resty/resty/v2"
)
type Cloud189PC struct {
model.Storage
Addition
identity string
client *resty.Client
loginParam *LoginParam
tokenInfo *AppSessionResp
}
func (y *Cloud189PC) Config() driver.Config {
return config
}
func (y *Cloud189PC) GetAddition() driver.Additional {
return &y.Addition
}
func (y *Cloud189PC) Init(ctx context.Context) (err error) {
// handle personal-cloud vs family-cloud parameters
if y.isFamily() && y.RootFolderID == "-11" {
y.RootFolderID = ""
}
if !y.isFamily() && y.RootFolderID == "" {
y.RootFolderID = "-11"
y.FamilyID = ""
}
// initialize the request client
if y.client == nil {
y.client = base.NewRestyClient().SetHeaders(map[string]string{
"Accept": "application/json;charset=UTF-8",
"Referer": WEB_URL,
})
}
// avoid logging in repeatedly
identity := utils.GetMD5EncodeStr(y.Username + y.Password)
if !y.isLogin() || y.identity != identity {
y.identity = identity
if err = y.login(); err != nil {
return
}
}
// resolve the family cloud ID
if y.isFamily() && y.FamilyID == "" {
if y.FamilyID, err = y.getFamilyID(); err != nil {
return err
}
}
return
}
func (y *Cloud189PC) Drop(ctx context.Context) error {
return nil
}
func (y *Cloud189PC) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
return y.getFiles(ctx, dir.GetID())
}
func (y *Cloud189PC) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
var downloadUrl struct {
URL string `json:"fileDownloadUrl"`
}
fullUrl := API_URL
if y.isFamily() {
fullUrl += "/family/file"
}
fullUrl += "/getFileDownloadUrl.action"
_, err := y.get(fullUrl, func(r *resty.Request) {
r.SetContext(ctx)
r.SetQueryParam("fileId", file.GetID())
if y.isFamily() {
r.SetQueryParams(map[string]string{
"familyId": y.FamilyID,
})
} else {
r.SetQueryParams(map[string]string{
"dt": "3",
"flag": "1",
})
}
}, &downloadUrl)
if err != nil {
return nil, err
}
// follow the redirect to obtain the real download link
downloadUrl.URL = strings.Replace(strings.ReplaceAll(downloadUrl.URL, "&amp;", "&"), "http://", "https://", 1)
res, err := base.NoRedirectClient.R().SetContext(ctx).Get(downloadUrl.URL)
if err != nil {
return nil, err
}
if res.StatusCode() == 302 {
downloadUrl.URL = res.Header().Get("location")
}
like := &model.Link{
URL: downloadUrl.URL,
Header: http.Header{
"User-Agent": []string{base.UserAgent},
},
}
/*
// parse the link validity duration from the URL
strs := regexp.MustCompile(`(?i)expire[^=]*=([0-9]*)`).FindStringSubmatch(downloadUrl.URL)
if len(strs) == 2 {
timestamp, err := strconv.ParseInt(strs[1], 10, 64)
if err == nil {
expired := time.Duration(timestamp-time.Now().Unix()) * time.Second
like.Expiration = &expired
}
}
*/
return like, nil
}
func (y *Cloud189PC) MakeDir(ctx context.Context, parentDir model.Obj, dirName string) (model.Obj, error) {
fullUrl := API_URL
if y.isFamily() {
fullUrl += "/family/file"
}
fullUrl += "/createFolder.action"
var newFolder Cloud189Folder
_, err := y.post(fullUrl, func(req *resty.Request) {
req.SetContext(ctx)
req.SetQueryParams(map[string]string{
"folderName": dirName,
"relativePath": "",
})
if y.isFamily() {
req.SetQueryParams(map[string]string{
"familyId": y.FamilyID,
"parentId": parentDir.GetID(),
})
} else {
req.SetQueryParams(map[string]string{
"parentFolderId": parentDir.GetID(),
})
}
}, &newFolder)
if err != nil {
return nil, err
}
return &newFolder, nil
}
func (y *Cloud189PC) Move(ctx context.Context, srcObj, dstDir model.Obj) (model.Obj, error) {
var resp CreateBatchTaskResp
_, err := y.post(API_URL+"/batch/createBatchTask.action", func(req *resty.Request) {
req.SetContext(ctx)
req.SetFormData(map[string]string{
"type": "MOVE",
"taskInfos": MustString(utils.Json.MarshalToString(
[]BatchTaskInfo{
{
FileId: srcObj.GetID(),
FileName: srcObj.GetName(),
IsFolder: BoolToNumber(srcObj.IsDir()),
},
})),
"targetFolderId": dstDir.GetID(),
})
if y.isFamily() {
req.SetFormData(map[string]string{
"familyId": y.FamilyID,
})
}
}, &resp)
if err != nil {
return nil, err
}
if err = y.WaitBatchTask("MOVE", resp.TaskID, time.Millisecond*400); err != nil {
return nil, err
}
return srcObj, nil
}
func (y *Cloud189PC) Rename(ctx context.Context, srcObj model.Obj, newName string) (model.Obj, error) {
queryParam := make(map[string]string)
fullUrl := API_URL
method := http.MethodPost
if y.isFamily() {
fullUrl += "/family/file"
method = http.MethodGet
queryParam["familyId"] = y.FamilyID
}
var newObj model.Obj
switch f := srcObj.(type) {
case *Cloud189File:
fullUrl += "/renameFile.action"
queryParam["fileId"] = srcObj.GetID()
queryParam["destFileName"] = newName
newObj = &Cloud189File{Icon: f.Icon} // reuse the preview icon
case *Cloud189Folder:
fullUrl += "/renameFolder.action"
queryParam["folderId"] = srcObj.GetID()
queryParam["destFolderName"] = newName
newObj = &Cloud189Folder{}
default:
return nil, errs.NotSupport
}
_, err := y.request(fullUrl, method, func(req *resty.Request) {
req.SetContext(ctx).SetQueryParams(queryParam)
}, nil, newObj)
if err != nil {
return nil, err
}
return newObj, nil
}
func (y *Cloud189PC) Copy(ctx context.Context, srcObj, dstDir model.Obj) error {
var resp CreateBatchTaskResp
_, err := y.post(API_URL+"/batch/createBatchTask.action", func(req *resty.Request) {
req.SetContext(ctx)
req.SetFormData(map[string]string{
"type": "COPY",
"taskInfos": MustString(utils.Json.MarshalToString(
[]BatchTaskInfo{
{
FileId: srcObj.GetID(),
FileName: srcObj.GetName(),
IsFolder: BoolToNumber(srcObj.IsDir()),
},
})),
"targetFolderId": dstDir.GetID(),
"targetFileName": dstDir.GetName(),
})
if y.isFamily() {
req.SetFormData(map[string]string{
"familyId": y.FamilyID,
})
}
}, &resp)
if err != nil {
return err
}
return y.WaitBatchTask("COPY", resp.TaskID, time.Second)
}
func (y *Cloud189PC) Remove(ctx context.Context, obj model.Obj) error {
var resp CreateBatchTaskResp
_, err := y.post(API_URL+"/batch/createBatchTask.action", func(req *resty.Request) {
req.SetContext(ctx)
req.SetFormData(map[string]string{
"type": "DELETE",
"taskInfos": MustString(utils.Json.MarshalToString(
[]*BatchTaskInfo{
{
FileId: obj.GetID(),
FileName: obj.GetName(),
IsFolder: BoolToNumber(obj.IsDir()),
},
})),
})
if y.isFamily() {
req.SetFormData(map[string]string{
"familyId": y.FamilyID,
})
}
}, &resp)
if err != nil {
return err
}
// The number of batch tasks is limited; going too fast can cause deletes to fail
return y.WaitBatchTask("DELETE", resp.TaskID, time.Millisecond*200)
}
func (y *Cloud189PC) Put(ctx context.Context, dstDir model.Obj, stream model.FileStreamer, up driver.UpdateProgress) (model.Obj, error) {
switch y.UploadMethod {
case "old":
return y.OldUpload(ctx, dstDir, stream, up)
case "rapid":
return y.FastUpload(ctx, dstDir, stream, up)
case "stream":
if stream.GetSize() == 0 {
return y.FastUpload(ctx, dstDir, stream, up)
}
fallthrough
default:
return y.StreamUpload(ctx, dstDir, stream, up)
}
}

drivers/189pc/help.go (new file, 195 lines)
@@ -0,0 +1,195 @@
package _189pc
import (
"bytes"
"crypto/aes"
"crypto/hmac"
"crypto/rand"
"crypto/rsa"
"crypto/sha1"
"crypto/x509"
"encoding/hex"
"encoding/pem"
"encoding/xml"
"fmt"
"math"
"net/http"
"regexp"
"strings"
"time"
"github.com/alist-org/alist/v3/pkg/utils/random"
)
func clientSuffix() map[string]string {
rand := random.Rand
return map[string]string{
"clientType": PC,
"version": VERSION,
"channelId": CHANNEL_ID,
"rand": fmt.Sprintf("%d_%d", rand.Int63n(1e5), rand.Int63n(1e10)),
}
}
// HMAC (SHA1) signature, optionally covering the encrypted params
func signatureOfHmac(sessionSecret, sessionKey, operate, fullUrl, dateOfGmt, param string) string {
urlpath := regexp.MustCompile(`://[^/]+((/[^/\s?#]+)*)`).FindStringSubmatch(fullUrl)[1]
mac := hmac.New(sha1.New, []byte(sessionSecret))
data := fmt.Sprintf("SessionKey=%s&Operate=%s&RequestURI=%s&Date=%s", sessionKey, operate, urlpath, dateOfGmt)
if param != "" {
data += fmt.Sprintf("&params=%s", param)
}
mac.Write([]byte(data))
return strings.ToUpper(hex.EncodeToString(mac.Sum(nil)))
}
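// A minimal usage sketch of signatureOfHmac above (illustrative only). The
// session key/secret and date are hypothetical placeholders; the call shape
// and the signed-string layout (SessionKey, Operate, RequestURI, Date, plus
// optional params) come from the function itself.
func exampleSignatureOfHmac() string {
	return signatureOfHmac(
		"example-session-secret",                    // placeholder sessionSecret
		"example-session-key",                       // placeholder sessionKey
		"GET",                                       // HTTP method (Operate)
		"https://api.cloud.189.cn/listFiles.action", // only the URL path is signed
		"Mon, 07 Aug 2023 08:00:00 GMT",             // Date header in HTTP format
		"",                                          // no encrypted params
	)
}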
// RSA-encrypt the username and password
func RsaEncrypt(publicKey, origData string) string {
block, _ := pem.Decode([]byte(publicKey))
pubInterface, _ := x509.ParsePKIXPublicKey(block.Bytes)
data, _ := rsa.EncryptPKCS1v15(rand.Reader, pubInterface.(*rsa.PublicKey), []byte(origData))
return strings.ToUpper(hex.EncodeToString(data))
}
// aes 加密params
func AesECBEncrypt(data, key string) string {
block, _ := aes.NewCipher([]byte(key))
paddingData := PKCS7Padding([]byte(data), block.BlockSize())
decrypted := make([]byte, len(paddingData))
size := block.BlockSize()
for src, dst := paddingData, decrypted; len(src) > 0; src, dst = src[size:], dst[size:] {
block.Encrypt(dst[:size], src[:size])
}
return strings.ToUpper(hex.EncodeToString(decrypted))
}
func PKCS7Padding(ciphertext []byte, blockSize int) []byte {
padding := blockSize - len(ciphertext)%blockSize
padtext := bytes.Repeat([]byte{byte(padding)}, padding)
return append(ciphertext, padtext...)
}
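// A minimal usage sketch of AesECBEncrypt and PKCS7Padding above (illustrative
// only). The key and query string are hypothetical; in utils.go the real key
// is the first 16 bytes of the session secret (AES-128).
func exampleAesECBEncrypt() string {
	key := "0123456789abcdef" // placeholder for sessionSecret[:16]
	return AesECBEncrypt("fileId=123&flag=1", key)
}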
// Current time in the HTTP (RFC 1123 GMT) format
func getHttpDateStr() string {
return time.Now().UTC().Format(http.TimeFormat)
}
// Millisecond timestamp
func timestamp() int64 {
return time.Now().UTC().UnixNano() / 1e6
}
func MustParseTime(str string) *time.Time {
lastOpTime, _ := time.ParseInLocation("2006-01-02 15:04:05 -07", str+" +08", time.Local)
return &lastOpTime
}
type Time time.Time
func (t *Time) UnmarshalJSON(b []byte) error { return t.Unmarshal(b) }
func (t *Time) UnmarshalXML(e *xml.Decoder, ee xml.StartElement) error {
b, err := e.Token()
if err != nil {
return err
}
if b, ok := b.(xml.CharData); ok {
if err = t.Unmarshal(b); err != nil {
return err
}
}
return e.Skip()
}
func (t *Time) Unmarshal(b []byte) error {
bs := strings.Trim(string(b), "\"")
var v time.Time
var err error
for _, f := range []string{"2006-01-02 15:04:05 -07", "Jan 2, 2006 15:04:05 PM -07"} {
v, err = time.ParseInLocation(f, bs+" +08", time.Local)
if err == nil {
break
}
}
*t = Time(v)
return err
}
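// A minimal usage sketch of Time.Unmarshal above (illustrative only). The
// timestamp is hypothetical; the value is read as Beijing time (+08) using the
// first matching layout.
func exampleTimeUnmarshal() (Time, error) {
	var t Time
	err := t.Unmarshal([]byte(`"2023-08-07 18:51:54"`)) // e.g. a lastOpTime JSON value
	return t, err
}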
type String string
func (t *String) UnmarshalJSON(b []byte) error { return t.Unmarshal(b) }
func (t *String) UnmarshalXML(e *xml.Decoder, ee xml.StartElement) error {
b, err := e.Token()
if err != nil {
return err
}
if b, ok := b.(xml.CharData); ok {
if err = t.Unmarshal(b); err != nil {
return err
}
}
return e.Skip()
}
func (s *String) Unmarshal(b []byte) error {
*s = String(bytes.Trim(b, "\""))
return nil
}
func toFamilyOrderBy(o string) string {
switch o {
case "filename":
return "1"
case "filesize":
return "2"
case "lastOpTime":
return "3"
default:
return "1"
}
}
func toDesc(o string) string {
switch o {
case "desc":
return "true"
case "asc":
fallthrough
default:
return "false"
}
}
func ParseHttpHeader(str string) map[string]string {
header := make(map[string]string)
for _, value := range strings.Split(str, "&") {
i := strings.Index(value, "=")
if i > 0 {
header[strings.TrimSpace(value[0:i])] = strings.TrimSpace(value[i+1:])
}
}
return header
}
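// A minimal usage sketch of ParseHttpHeader above (illustrative only). The
// header names and values are hypothetical; in the upload code the input is
// the "&"-joined RequestHeader string returned by getMultiUploadUrls.
func exampleParseHttpHeader() map[string]string {
	return ParseHttpHeader("Expires=3600&Signature=abc123")
	// -> map[Expires:3600 Signature:abc123]
}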
func MustString(str string, err error) string {
return str
}
func BoolToNumber(b bool) int {
if b {
return 1
}
return 0
}
// Calculate the slice (part) size.
// The number of slices is limited:
//   10 MiB / 20 MiB slices: at most 999 parts
//   50 MiB / 60 MiB / 70 MiB / 80 MiB / ... slices: at most 1999 parts
func partSize(size int64) int64 {
const DEFAULT = 1024 * 1024 * 10 // 10MIB
if size > DEFAULT*2*999 {
return int64(math.Max(math.Ceil((float64(size)/1999) /*=单个切片大小*/ /float64(DEFAULT)) /*=倍率*/, 5) * DEFAULT)
}
if size > DEFAULT*999 {
return DEFAULT * 2 // 20MIB
}
return DEFAULT
}
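// A small worked example of partSize above (illustrative only), matching the
// slice-count limits described in its comment.
func examplePartSizes() [3]int64 {
	const GiB = 1024 * 1024 * 1024
	return [3]int64{
		partSize(5 * GiB),   // 10 MiB slices: 512 parts, well under the 999 limit
		partSize(15 * GiB),  // 20 MiB slices: 10 MiB would need 1536 parts (> 999)
		partSize(100 * GiB), // 60 MiB slices: ~1707 parts, under the 1999 limit
	}
}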

drivers/189pc/meta.go (new file, 31 lines)
@@ -0,0 +1,31 @@
package _189pc
import (
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/op"
)
type Addition struct {
Username string `json:"username" required:"true"`
Password string `json:"password" required:"true"`
VCode string `json:"validate_code"`
driver.RootID
OrderBy string `json:"order_by" type:"select" options:"filename,filesize,lastOpTime" default:"filename"`
OrderDirection string `json:"order_direction" type:"select" options:"asc,desc" default:"asc"`
Type string `json:"type" type:"select" options:"personal,family" default:"personal"`
FamilyID string `json:"family_id"`
UploadMethod string `json:"upload_method" type:"select" options:"stream,rapid,old" default:"stream"`
NoUseOcr bool `json:"no_use_ocr"`
}
var config = driver.Config{
Name: "189CloudPC",
DefaultRoot: "-11",
CheckStatus: true,
}
func init() {
op.RegisterDriver(func() driver.Driver {
return &Cloud189PC{}
})
}
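// A hypothetical Addition value for a personal-cloud mount (illustrative only,
// matching the struct above). All values are placeholders; FamilyID is only
// needed when Type is "family".
var exampleAddition = Addition{
	Username:       "user@189.cn",
	Password:       "secret",
	OrderBy:        "filename",
	OrderDirection: "asc",
	Type:           "personal",
	UploadMethod:   "stream",
}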

drivers/189pc/types.go (new file, 356 lines)
@@ -0,0 +1,356 @@
package _189pc
import (
"encoding/xml"
"fmt"
"sort"
"strings"
"time"
)
// The API has four different error response formats
type RespErr struct {
ResCode any `json:"res_code"` // int or string
ResMessage string `json:"res_message"`
Error_ string `json:"error"`
XMLName xml.Name `xml:"error"`
Code string `json:"code" xml:"code"`
Message string `json:"message" xml:"message"`
Msg string `json:"msg"`
ErrorCode string `json:"errorCode"`
ErrorMsg string `json:"errorMsg"`
}
func (e *RespErr) HasError() bool {
switch v := e.ResCode.(type) {
case int, int64, int32:
return v != 0
case string:
return e.ResCode != ""
}
return (e.Code != "" && e.Code != "SUCCESS") || e.ErrorCode != "" || e.Error_ != ""
}
func (e *RespErr) Error() string {
switch v := e.ResCode.(type) {
case int, int64, int32:
if v != 0 {
return fmt.Sprintf("res_code: %d ,res_msg: %s", v, e.ResMessage)
}
case string:
if e.ResCode != "" {
return fmt.Sprintf("res_code: %s ,res_msg: %s", e.ResCode, e.ResMessage)
}
}
if e.Code != "" && e.Code != "SUCCESS" {
if e.Msg != "" {
return fmt.Sprintf("code: %s ,msg: %s", e.Code, e.Msg)
}
if e.Message != "" {
return fmt.Sprintf("code: %s ,msg: %s", e.Code, e.Message)
}
return "code: " + e.Code
}
if e.ErrorCode != "" {
return fmt.Sprintf("err_code: %s ,err_msg: %s", e.ErrorCode, e.ErrorMsg)
}
if e.Error_ != "" {
return fmt.Sprintf("error: %s ,message: %s", e.ErrorCode, e.Message)
}
return ""
}
// Parameters required for login
type LoginParam struct {
// RSA-encrypted username and password
RsaUsername string
RsaPassword string
// RSA public key
jRsaKey string
// Request header parameters
Lt string
ReqId string
// Form parameter
ParamId string
// Captcha
CaptchaToken string
}
// Login encryption configuration
type EncryptConfResp struct {
Result int `json:"result"`
Data struct {
UpSmsOn string `json:"upSmsOn"`
Pre string `json:"pre"`
PreDomain string `json:"preDomain"`
PubKey string `json:"pubKey"`
} `json:"data"`
}
type LoginResp struct {
Msg string `json:"msg"`
Result int `json:"result"`
ToUrl string `json:"toUrl"`
}
// Response of a session refresh
type UserSessionResp struct {
ResCode int `json:"res_code"`
ResMessage string `json:"res_message"`
LoginName string `json:"loginName"`
KeepAlive int `json:"keepAlive"`
GetFileDiffSpan int `json:"getFileDiffSpan"`
GetUserInfoSpan int `json:"getUserInfoSpan"`
// Personal cloud
SessionKey string `json:"sessionKey"`
SessionSecret string `json:"sessionSecret"`
// Family cloud
FamilySessionKey string `json:"familySessionKey"`
FamilySessionSecret string `json:"familySessionSecret"`
}
// Login response
type AppSessionResp struct {
UserSessionResp
IsSaveName string `json:"isSaveName"`
// Token used to refresh the session
AccessToken string `json:"accessToken"`
// Refresh token
RefreshToken string `json:"refreshToken"`
}
// Family cloud account
type FamilyInfoListResp struct {
FamilyInfoResp []FamilyInfoResp `json:"familyInfoResp"`
}
type FamilyInfoResp struct {
Count int `json:"count"`
CreateTime string `json:"createTime"`
FamilyID int `json:"familyId"`
RemarkName string `json:"remarkName"`
Type int `json:"type"`
UseFlag int `json:"useFlag"`
UserRole int `json:"userRole"`
}
/* Files */
// File
type Cloud189File struct {
ID String `json:"id"`
Name string `json:"name"`
Size int64 `json:"size"`
Md5 string `json:"md5"`
LastOpTime Time `json:"lastOpTime"`
CreateDate Time `json:"createDate"`
Icon struct {
//iconOption 5
SmallUrl string `json:"smallUrl"`
LargeUrl string `json:"largeUrl"`
// iconOption 10
Max600 string `json:"max600"`
MediumURL string `json:"mediumUrl"`
} `json:"icon"`
// Orientation int64 `json:"orientation"`
// FileCata int64 `json:"fileCata"`
// MediaType int `json:"mediaType"`
// Rev string `json:"rev"`
// StarLabel int64 `json:"starLabel"`
}
func (c *Cloud189File) GetSize() int64 { return c.Size }
func (c *Cloud189File) GetName() string { return c.Name }
func (c *Cloud189File) ModTime() time.Time { return time.Time(c.LastOpTime) }
func (c *Cloud189File) IsDir() bool { return false }
func (c *Cloud189File) GetID() string { return string(c.ID) }
func (c *Cloud189File) GetPath() string { return "" }
func (c *Cloud189File) Thumb() string { return c.Icon.SmallUrl }
// Folder
type Cloud189Folder struct {
ID String `json:"id"`
ParentID int64 `json:"parentId"`
Name string `json:"name"`
LastOpTime Time `json:"lastOpTime"`
CreateDate Time `json:"createDate"`
// FileListSize int64 `json:"fileListSize"`
// FileCount int64 `json:"fileCount"`
// FileCata int64 `json:"fileCata"`
// Rev string `json:"rev"`
// StarLabel int64 `json:"starLabel"`
}
func (c *Cloud189Folder) GetSize() int64 { return 0 }
func (c *Cloud189Folder) GetName() string { return c.Name }
func (c *Cloud189Folder) ModTime() time.Time { return time.Time(c.LastOpTime) }
func (c *Cloud189Folder) IsDir() bool { return true }
func (c *Cloud189Folder) GetID() string { return string(c.ID) }
func (c *Cloud189Folder) GetPath() string { return "" }
type Cloud189FilesResp struct {
//ResCode int `json:"res_code"`
//ResMessage string `json:"res_message"`
FileListAO struct {
Count int `json:"count"`
FileList []Cloud189File `json:"fileList"`
FolderList []Cloud189Folder `json:"folderList"`
} `json:"fileListAO"`
}
// BatchTaskInfo: information about one item of a batch task
type BatchTaskInfo struct {
// FileId: file ID
FileId string `json:"fileId"`
// FileName: file name
FileName string `json:"fileName"`
// IsFolder: whether the item is a folder (0 = no, 1 = yes)
IsFolder int `json:"isFolder"`
// SrcParentId: ID of the source parent directory
//SrcParentId string `json:"srcParentId"`
}
/* Upload */
type InitMultiUploadResp struct {
//Code string `json:"code"`
Data struct {
UploadType int `json:"uploadType"`
UploadHost string `json:"uploadHost"`
UploadFileID string `json:"uploadFileId"`
FileDataExists int `json:"fileDataExists"`
} `json:"data"`
}
type UploadUrlsResp struct {
Code string `json:"code"`
UploadUrls map[string]Part `json:"uploadUrls"`
}
type Part struct {
RequestURL string `json:"requestURL"`
RequestHeader string `json:"requestHeader"`
}
/* Second (legacy) upload method */
type CreateUploadFileResp struct {
// Upload request ID
UploadFileId int64 `json:"uploadFileId"`
// URL to upload the file data to
FileUploadUrl string `json:"fileUploadUrl"`
// URL to commit the upload once finished
FileCommitUrl string `json:"fileCommitUrl"`
// Whether the file already exists in the cloud (0 = no, 1 = yes)
FileDataExists int `json:"fileDataExists"`
}
type GetUploadFileStatusResp struct {
CreateUploadFileResp
// Size already uploaded
DataSize int64 `json:"dataSize"`
Size int64 `json:"size"`
}
func (r *GetUploadFileStatusResp) GetSize() int64 {
return r.DataSize + r.Size
}
type CommitMultiUploadFileResp struct {
File struct {
UserFileID String `json:"userFileId"`
FileName string `json:"fileName"`
FileSize int64 `json:"fileSize"`
FileMd5 string `json:"fileMd5"`
CreateDate Time `json:"createDate"`
} `json:"file"`
}
func (f *CommitMultiUploadFileResp) toFile() *Cloud189File {
return &Cloud189File{
ID: f.File.UserFileID,
Name: f.File.FileName,
Size: f.File.FileSize,
Md5: f.File.FileMd5,
LastOpTime: f.File.CreateDate,
CreateDate: f.File.CreateDate,
}
}
type OldCommitUploadFileResp struct {
XMLName xml.Name `xml:"file"`
ID String `xml:"id"`
Name string `xml:"name"`
Size int64 `xml:"size"`
Md5 string `xml:"md5"`
CreateDate Time `xml:"createDate"`
}
func (f *OldCommitUploadFileResp) toFile() *Cloud189File {
return &Cloud189File{
ID: f.ID,
Name: f.Name,
Size: f.Size,
Md5: f.Md5,
CreateDate: f.CreateDate,
LastOpTime: f.CreateDate,
}
}
type CreateBatchTaskResp struct {
TaskID string `json:"taskId"`
}
type BatchTaskStateResp struct {
FailedCount int `json:"failedCount"`
Process int `json:"process"`
SkipCount int `json:"skipCount"`
SubTaskCount int `json:"subTaskCount"`
SuccessedCount int `json:"successedCount"`
SuccessedFileIDList []int64 `json:"successedFileIdList"`
TaskID string `json:"taskId"`
TaskStatus int `json:"taskStatus"` // 1 initializing, 2 conflict, 3 running, 4 done
}
/* Encrypted query params */
type Params map[string]string
func (p Params) Set(k, v string) {
p[k] = v
}
func (p Params) Encode() string {
if p == nil {
return ""
}
var buf strings.Builder
keys := make([]string, 0, len(p))
for k := range p {
keys = append(keys, k)
}
sort.Strings(keys)
for i := range keys {
if buf.Len() > 0 {
buf.WriteByte('&')
}
buf.WriteString(keys[i])
buf.WriteByte('=')
buf.WriteString(p[keys[i]])
}
return buf.String()
}
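// A minimal usage sketch of Params.Encode above (illustrative only). Keys come
// out sorted and values are not URL-escaped here; callers escape values such
// as the file name before calling Set.
func exampleParamsEncode() string {
	p := Params{"fileSize": "1024", "parentFolderId": "-11", "fileName": "a.txt"}
	return p.Encode() // "fileName=a.txt&fileSize=1024&parentFolderId=-11"
}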

drivers/189pc/utils.go (new file, 874 lines)
@@ -0,0 +1,874 @@
package _189pc
import (
"bytes"
"context"
"crypto/md5"
"encoding/base64"
"encoding/hex"
"encoding/xml"
"fmt"
"io"
"math"
"net/http"
"net/http/cookiejar"
"net/url"
"os"
"regexp"
"strings"
"time"
"github.com/alist-org/alist/v3/drivers/base"
"github.com/alist-org/alist/v3/internal/conf"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/internal/op"
"github.com/alist-org/alist/v3/internal/setting"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/avast/retry-go"
"github.com/go-resty/resty/v2"
"github.com/google/uuid"
jsoniter "github.com/json-iterator/go"
"github.com/pkg/errors"
)
const (
ACCOUNT_TYPE = "02"
APP_ID = "8025431004"
CLIENT_TYPE = "10020"
VERSION = "6.2"
WEB_URL = "https://cloud.189.cn"
AUTH_URL = "https://open.e.189.cn"
API_URL = "https://api.cloud.189.cn"
UPLOAD_URL = "https://upload.cloud.189.cn"
RETURN_URL = "https://m.cloud.189.cn/zhuanti/2020/loginErrorPc/index.html"
PC = "TELEPC"
MAC = "TELEMAC"
CHANNEL_ID = "web_cloud.189.cn"
)
func (y *Cloud189PC) SignatureHeader(url, method, params string) map[string]string {
dateOfGmt := getHttpDateStr()
sessionKey := y.tokenInfo.SessionKey
sessionSecret := y.tokenInfo.SessionSecret
if y.isFamily() {
sessionKey = y.tokenInfo.FamilySessionKey
sessionSecret = y.tokenInfo.FamilySessionSecret
}
header := map[string]string{
"Date": dateOfGmt,
"SessionKey": sessionKey,
"X-Request-ID": uuid.NewString(),
"Signature": signatureOfHmac(sessionSecret, sessionKey, method, url, dateOfGmt, params),
}
return header
}
func (y *Cloud189PC) EncryptParams(params Params) string {
sessionSecret := y.tokenInfo.SessionSecret
if y.isFamily() {
sessionSecret = y.tokenInfo.FamilySessionSecret
}
if params != nil {
return AesECBEncrypt(params.Encode(), sessionSecret[:16])
}
return ""
}
func (y *Cloud189PC) request(url, method string, callback base.ReqCallback, params Params, resp interface{}) ([]byte, error) {
req := y.client.R().SetQueryParams(clientSuffix())
// Set the encrypted params
paramsData := y.EncryptParams(params)
if paramsData != "" {
req.SetQueryParam("params", paramsData)
}
// Signature
req.SetHeaders(y.SignatureHeader(url, method, paramsData))
var erron RespErr
req.SetError(&erron)
if callback != nil {
callback(req)
}
if resp != nil {
req.SetResult(resp)
}
res, err := req.Execute(method, url)
if err != nil {
return nil, err
}
if strings.Contains(res.String(), "userSessionBO is null") {
if err = y.refreshSession(); err != nil {
return nil, err
}
return y.request(url, method, callback, params, resp)
}
// Handle errors
if erron.HasError() {
if erron.ErrorCode == "InvalidSessionKey" {
if err = y.refreshSession(); err != nil {
return nil, err
}
return y.request(url, method, callback, params, resp)
}
return nil, &erron
}
return res.Body(), nil
}
func (y *Cloud189PC) get(url string, callback base.ReqCallback, resp interface{}) ([]byte, error) {
return y.request(url, http.MethodGet, callback, nil, resp)
}
func (y *Cloud189PC) post(url string, callback base.ReqCallback, resp interface{}) ([]byte, error) {
return y.request(url, http.MethodPost, callback, nil, resp)
}
func (y *Cloud189PC) put(ctx context.Context, url string, headers map[string]string, sign bool, file io.Reader) ([]byte, error) {
req, err := http.NewRequestWithContext(ctx, http.MethodPut, url, file)
if err != nil {
return nil, err
}
query := req.URL.Query()
for key, value := range clientSuffix() {
query.Add(key, value)
}
req.URL.RawQuery = query.Encode()
for key, value := range headers {
req.Header.Add(key, value)
}
if sign {
for key, value := range y.SignatureHeader(url, http.MethodPut, "") {
req.Header.Add(key, value)
}
}
resp, err := base.HttpClient.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, err
}
var erron RespErr
jsoniter.Unmarshal(body, &erron)
xml.Unmarshal(body, &erron)
if erron.HasError() {
return nil, &erron
}
if resp.StatusCode != http.StatusOK {
return nil, errors.Errorf("put fail,err:%s", string(body))
}
return body, nil
}
func (y *Cloud189PC) getFiles(ctx context.Context, fileId string) ([]model.Obj, error) {
fullUrl := API_URL
if y.isFamily() {
fullUrl += "/family/file"
}
fullUrl += "/listFiles.action"
res := make([]model.Obj, 0, 130)
for pageNum := 1; ; pageNum++ {
var resp Cloud189FilesResp
_, err := y.get(fullUrl, func(r *resty.Request) {
r.SetContext(ctx)
r.SetQueryParams(map[string]string{
"folderId": fileId,
"fileType": "0",
"mediaAttr": "0",
"iconOption": "5",
"pageNum": fmt.Sprint(pageNum),
"pageSize": "130",
})
if y.isFamily() {
r.SetQueryParams(map[string]string{
"familyId": y.FamilyID,
"orderBy": toFamilyOrderBy(y.OrderBy),
"descending": toDesc(y.OrderDirection),
})
} else {
r.SetQueryParams(map[string]string{
"recursive": "0",
"orderBy": y.OrderBy,
"descending": toDesc(y.OrderDirection),
})
}
}, &resp)
if err != nil {
return nil, err
}
// All pages fetched; stop
if resp.FileListAO.Count == 0 {
break
}
for i := 0; i < len(resp.FileListAO.FolderList); i++ {
res = append(res, &resp.FileListAO.FolderList[i])
}
for i := 0; i < len(resp.FileListAO.FileList); i++ {
res = append(res, &resp.FileListAO.FileList[i])
}
}
return res, nil
}
func (y *Cloud189PC) login() (err error) {
// Initialize the parameters needed for login
if y.loginParam == nil {
if err = y.initLoginParam(); err != nil {
// A required captcha is also reported via the error
return err
}
}
defer func() {
// Clear the captcha
y.VCode = ""
// Clear the login parameters
y.loginParam = nil
// On error, reload the login parameters (i.e. refresh the captcha)
if err != nil && y.NoUseOcr {
if err1 := y.initLoginParam(); err1 != nil {
err = fmt.Errorf("err1: %s \nerr2: %s", err, err1)
}
}
}()
param := y.loginParam
var loginresp LoginResp
_, err = y.client.R().
ForceContentType("application/json;charset=UTF-8").SetResult(&loginresp).
SetHeaders(map[string]string{
"REQID": param.ReqId,
"lt": param.Lt,
}).
SetFormData(map[string]string{
"appKey": APP_ID,
"accountType": ACCOUNT_TYPE,
"userName": param.RsaUsername,
"password": param.RsaPassword,
"validateCode": y.VCode,
"captchaToken": param.CaptchaToken,
"returnUrl": RETURN_URL,
// "mailSuffix": "@189.cn",
"dynamicCheck": "FALSE",
"clientType": CLIENT_TYPE,
"cb_SaveName": "1",
"isOauth2": "false",
"state": "",
"paramId": param.ParamId,
}).
Post(AUTH_URL + "/api/logbox/oauth2/loginSubmit.do")
if err != nil {
return err
}
if loginresp.ToUrl == "" {
return fmt.Errorf("login failed,No toUrl obtained, msg: %s", loginresp.Msg)
}
// Obtain the session
var erron RespErr
var tokenInfo AppSessionResp
_, err = y.client.R().
SetResult(&tokenInfo).SetError(&erron).
SetQueryParams(clientSuffix()).
SetQueryParam("redirectURL", url.QueryEscape(loginresp.ToUrl)).
Post(API_URL + "/getSessionForPC.action")
if err != nil {
return
}
if erron.HasError() {
return &erron
}
if tokenInfo.ResCode != 0 {
err = fmt.Errorf(tokenInfo.ResMessage)
return
}
y.tokenInfo = &tokenInfo
return
}
/* Initialize the parameters required for login.
 * If a captcha is required, it is reported via the returned error.
 */
func (y *Cloud189PC) initLoginParam() error {
// Clear cookies
jar, _ := cookiejar.New(nil)
y.client.SetCookieJar(jar)
res, err := y.client.R().
SetQueryParams(map[string]string{
"appId": APP_ID,
"clientType": CLIENT_TYPE,
"returnURL": RETURN_URL,
"timeStamp": fmt.Sprint(timestamp()),
}).
Get(WEB_URL + "/api/portal/unifyLoginForPC.action")
if err != nil {
return err
}
param := LoginParam{
CaptchaToken: regexp.MustCompile(`'captchaToken' value='(.+?)'`).FindStringSubmatch(res.String())[1],
Lt: regexp.MustCompile(`lt = "(.+?)"`).FindStringSubmatch(res.String())[1],
ParamId: regexp.MustCompile(`paramId = "(.+?)"`).FindStringSubmatch(res.String())[1],
ReqId: regexp.MustCompile(`reqId = "(.+?)"`).FindStringSubmatch(res.String())[1],
// jRsaKey: regexp.MustCompile(`"j_rsaKey" value="(.+?)"`).FindStringSubmatch(res.String())[1],
}
// Fetch the RSA public key
var encryptConf EncryptConfResp
_, err = y.client.R().
ForceContentType("application/json;charset=UTF-8").SetResult(&encryptConf).
SetFormData(map[string]string{"appId": APP_ID}).
Post(AUTH_URL + "/api/logbox/config/encryptConf.do")
if err != nil {
return err
}
param.jRsaKey = fmt.Sprintf("-----BEGIN PUBLIC KEY-----\n%s\n-----END PUBLIC KEY-----", encryptConf.Data.PubKey)
param.RsaUsername = encryptConf.Data.Pre + RsaEncrypt(param.jRsaKey, y.Username)
param.RsaPassword = encryptConf.Data.Pre + RsaEncrypt(param.jRsaKey, y.Password)
y.loginParam = &param
// Check whether a captcha is required
resp, err := y.client.R().
SetHeader("REQID", param.ReqId).
SetFormData(map[string]string{
"appKey": APP_ID,
"accountType": ACCOUNT_TYPE,
"userName": param.RsaUsername,
}).Post(AUTH_URL + "/api/logbox/oauth2/needcaptcha.do")
if err != nil {
return err
}
if resp.String() == "0" {
return nil
}
// Fetch the captcha image
imgRes, err := y.client.R().
SetQueryParams(map[string]string{
"token": param.CaptchaToken,
"REQID": param.ReqId,
"rnd": fmt.Sprint(timestamp()),
}).
Get(AUTH_URL + "/api/logbox/oauth2/picCaptcha.do")
if err != nil {
return fmt.Errorf("failed to obtain verification code")
}
if imgRes.Size() > 20 {
if setting.GetStr(conf.OcrApi) != "" && !y.NoUseOcr {
vRes, err := base.RestyClient.R().
SetMultipartField("image", "validateCode.png", "image/png", bytes.NewReader(imgRes.Body())).
Post(setting.GetStr(conf.OcrApi))
if err != nil {
return err
}
if jsoniter.Get(vRes.Body(), "status").ToInt() == 200 {
y.VCode = jsoniter.Get(vRes.Body(), "result").ToString()
return nil
}
}
// Return the captcha image to the front end
return fmt.Errorf(`need img validate code: <img src="data:image/png;base64,%s"/>`, base64.StdEncoding.EncodeToString(imgRes.Body()))
}
return nil
}
// Refresh the session
func (y *Cloud189PC) refreshSession() (err error) {
var erron RespErr
var userSessionResp UserSessionResp
_, err = y.client.R().
SetResult(&userSessionResp).SetError(&erron).
SetQueryParams(clientSuffix()).
SetQueryParams(map[string]string{
"appId": APP_ID,
"accessToken": y.tokenInfo.AccessToken,
}).
SetHeader("X-Request-ID", uuid.NewString()).
Get(API_URL + "/getSessionForPC.action")
if err != nil {
return err
}
// An error here breaks normal access, so take this storage offline
defer func() {
if err != nil {
y.GetStorage().SetStatus(fmt.Sprintf("%+v", err.Error()))
op.MustSaveDriverStorage(y)
}
}()
if erron.HasError() {
if erron.ResCode == "UserInvalidOpenToken" {
if err = y.login(); err != nil {
return err
}
}
return &erron
}
y.tokenInfo.UserSessionResp = userSessionResp
return
}
// Stream (normal) upload
// Cannot upload files whose size is 0
func (y *Cloud189PC) StreamUpload(ctx context.Context, dstDir model.Obj, file model.FileStreamer, up driver.UpdateProgress) (model.Obj, error) {
var DEFAULT = partSize(file.GetSize())
var count = int(math.Ceil(float64(file.GetSize()) / float64(DEFAULT)))
params := Params{
"parentFolderId": dstDir.GetID(),
"fileName": url.QueryEscape(file.GetName()),
"fileSize": fmt.Sprint(file.GetSize()),
"sliceSize": fmt.Sprint(DEFAULT),
"lazyCheck": "1",
}
fullUrl := UPLOAD_URL
if y.isFamily() {
params.Set("familyId", y.FamilyID)
fullUrl += "/family"
} else {
//params.Set("extend", `{"opScene":"1","relativepath":"","rootfolderid":""}`)
fullUrl += "/person"
}
// Initialize the multipart upload
var initMultiUpload InitMultiUploadResp
_, err := y.request(fullUrl+"/initMultiUpload", http.MethodGet, func(req *resty.Request) {
req.SetContext(ctx)
}, params, &initMultiUpload)
if err != nil {
return nil, err
}
fileMd5 := md5.New()
silceMd5 := md5.New()
silceMd5Hexs := make([]string, 0, count)
byteData := bytes.NewBuffer(make([]byte, DEFAULT))
for i := 1; i <= count; i++ {
if utils.IsCanceled(ctx) {
return nil, ctx.Err()
}
// Read one chunk
byteData.Reset()
silceMd5.Reset()
_, err := io.CopyN(io.MultiWriter(fileMd5, silceMd5, byteData), file, DEFAULT)
if err != io.EOF && err != io.ErrUnexpectedEOF && err != nil {
return nil, err
}
// Compute the chunk MD5 and encode it as hex and base64
md5Bytes := silceMd5.Sum(nil)
silceMd5Hexs = append(silceMd5Hexs, strings.ToUpper(hex.EncodeToString(md5Bytes)))
silceMd5Base64 := base64.StdEncoding.EncodeToString(md5Bytes)
// Get the upload URL
var uploadUrl UploadUrlsResp
_, err = y.request(fullUrl+"/getMultiUploadUrls", http.MethodGet,
func(req *resty.Request) {
req.SetContext(ctx)
}, Params{
"partInfo": fmt.Sprintf("%d-%s", i, silceMd5Base64),
"uploadFileId": initMultiUpload.Data.UploadFileID,
}, &uploadUrl)
if err != nil {
return nil, err
}
// Start uploading the chunk
uploadData := uploadUrl.UploadUrls[fmt.Sprint("partNumber_", i)]
err = retry.Do(func() error {
_, err := y.put(ctx, uploadData.RequestURL, ParseHttpHeader(uploadData.RequestHeader), false, bytes.NewReader(byteData.Bytes()))
return err
},
retry.Context(ctx),
retry.Attempts(3),
retry.Delay(time.Second),
retry.MaxDelay(5*time.Second))
if err != nil {
return nil, err
}
up(int(i * 100 / count))
}
fileMd5Hex := strings.ToUpper(hex.EncodeToString(fileMd5.Sum(nil)))
sliceMd5Hex := fileMd5Hex
if file.GetSize() > DEFAULT {
sliceMd5Hex = strings.ToUpper(utils.GetMD5EncodeStr(strings.Join(silceMd5Hexs, "\n")))
}
// Commit the upload
var resp CommitMultiUploadFileResp
_, err = y.request(fullUrl+"/commitMultiUploadFile", http.MethodGet,
func(req *resty.Request) {
req.SetContext(ctx)
}, Params{
"uploadFileId": initMultiUpload.Data.UploadFileID,
"fileMd5": fileMd5Hex,
"sliceMd5": sliceMd5Hex,
"lazyCheck": "1",
"isLog": "0",
"opertype": "3",
}, &resp)
if err != nil {
return nil, err
}
return resp.toFile(), nil
}
// Rapid (hash-based) upload
func (y *Cloud189PC) FastUpload(ctx context.Context, dstDir model.Obj, file model.FileStreamer, up driver.UpdateProgress) (model.Obj, error) {
// The full-file MD5 is required, so the source must support io.Seek
tempFile, err := utils.CreateTempFile(file.GetReadCloser(), file.GetSize())
if err != nil {
return nil, err
}
defer func() {
_ = tempFile.Close()
_ = os.Remove(tempFile.Name())
}()
var DEFAULT = partSize(file.GetSize())
count := int(math.Ceil(float64(file.GetSize()) / float64(DEFAULT)))
// Pre-compute the required hashes first
fileMd5 := md5.New()
silceMd5 := md5.New()
silceMd5Hexs := make([]string, 0, count)
silceMd5Base64s := make([]string, 0, count)
for i := 1; i <= count; i++ {
if utils.IsCanceled(ctx) {
return nil, ctx.Err()
}
silceMd5.Reset()
if _, err := io.CopyN(io.MultiWriter(fileMd5, silceMd5), tempFile, DEFAULT); err != nil && err != io.EOF && err != io.ErrUnexpectedEOF {
return nil, err
}
md5Byte := silceMd5.Sum(nil)
silceMd5Hexs = append(silceMd5Hexs, strings.ToUpper(hex.EncodeToString(md5Byte)))
silceMd5Base64s = append(silceMd5Base64s, fmt.Sprint(i, "-", base64.StdEncoding.EncodeToString(md5Byte)))
}
if _, err = tempFile.Seek(0, io.SeekStart); err != nil {
return nil, err
}
fileMd5Hex := strings.ToUpper(hex.EncodeToString(fileMd5.Sum(nil)))
sliceMd5Hex := fileMd5Hex
if file.GetSize() > DEFAULT {
sliceMd5Hex = strings.ToUpper(utils.GetMD5EncodeStr(strings.Join(silceMd5Hexs, "\n")))
}
// Check whether rapid (hash-based) upload is possible
params := Params{
"parentFolderId": dstDir.GetID(),
"fileName": url.QueryEscape(file.GetName()),
"fileSize": fmt.Sprint(file.GetSize()),
"fileMd5": fileMd5Hex,
"sliceSize": fmt.Sprint(DEFAULT),
"sliceMd5": sliceMd5Hex,
}
fullUrl := UPLOAD_URL
if y.isFamily() {
params.Set("familyId", y.FamilyID)
fullUrl += "/family"
} else {
//params.Set("extend", `{"opScene":"1","relativepath":"","rootfolderid":""}`)
fullUrl += "/person"
}
var uploadInfo InitMultiUploadResp
_, err = y.request(fullUrl+"/initMultiUpload", http.MethodGet, func(req *resty.Request) {
req.SetContext(ctx)
}, params, &uploadInfo)
if err != nil {
return nil, err
}
// The file does not exist in the cloud yet; start uploading
if uploadInfo.Data.FileDataExists != 1 {
var uploadUrls UploadUrlsResp
_, err = y.request(fullUrl+"/getMultiUploadUrls", http.MethodGet,
func(req *resty.Request) {
req.SetContext(ctx)
}, Params{
"uploadFileId": uploadInfo.Data.UploadFileID,
"partInfo": strings.Join(silceMd5Base64s, ","),
}, &uploadUrls)
if err != nil {
return nil, err
}
buf := make([]byte, DEFAULT)
for i := 1; i <= count; i++ {
if utils.IsCanceled(ctx) {
return nil, ctx.Err()
}
n, err := io.ReadFull(tempFile, buf)
if err != nil && err != io.EOF && err != io.ErrUnexpectedEOF {
return nil, err
}
uploadData := uploadUrls.UploadUrls[fmt.Sprint("partNumber_", i)]
err = retry.Do(func() error {
_, err := y.put(ctx, uploadData.RequestURL, ParseHttpHeader(uploadData.RequestHeader), false, bytes.NewReader(buf[:n]))
return err
},
retry.Context(ctx),
retry.Attempts(3),
retry.Delay(time.Second),
retry.MaxDelay(5*time.Second))
if err != nil {
return nil, err
}
up(int(i * 100 / count))
}
}
// Commit
var resp CommitMultiUploadFileResp
_, err = y.request(fullUrl+"/commitMultiUploadFile", http.MethodGet,
func(req *resty.Request) {
req.SetContext(ctx)
}, Params{
"uploadFileId": uploadInfo.Data.UploadFileID,
"isLog": "0",
"opertype": "3",
}, &resp)
if err != nil {
return nil, err
}
return resp.toFile(), nil
}
// Legacy upload; the family cloud does not support overwriting
func (y *Cloud189PC) OldUpload(ctx context.Context, dstDir model.Obj, file model.FileStreamer, up driver.UpdateProgress) (model.Obj, error) {
// The full-file MD5 is required, so the source must support io.Seek
tempFile, err := utils.CreateTempFile(file.GetReadCloser(), file.GetSize())
if err != nil {
return nil, err
}
defer func() {
_ = tempFile.Close()
_ = os.Remove(tempFile.Name())
}()
// Compute the MD5
fileMd5 := md5.New()
if _, err := io.Copy(fileMd5, tempFile); err != nil {
return nil, err
}
if _, err = tempFile.Seek(0, io.SeekStart); err != nil {
return nil, err
}
fileMd5Hex := strings.ToUpper(hex.EncodeToString(fileMd5.Sum(nil)))
// Create the upload session
var uploadInfo CreateUploadFileResp
fullUrl := API_URL + "/createUploadFile.action"
if y.isFamily() {
fullUrl = API_URL + "/family/file/createFamilyFile.action"
}
_, err = y.post(fullUrl, func(req *resty.Request) {
req.SetContext(ctx)
if y.isFamily() {
req.SetQueryParams(map[string]string{
"familyId": y.FamilyID,
"fileMd5": fileMd5Hex,
"fileName": file.GetName(),
"fileSize": fmt.Sprint(file.GetSize()),
"parentId": dstDir.GetID(),
"resumePolicy": "1",
})
} else {
req.SetFormData(map[string]string{
"parentFolderId": dstDir.GetID(),
"fileName": file.GetName(),
"size": fmt.Sprint(file.GetSize()),
"md5": fileMd5Hex,
"opertype": "3",
"flag": "1",
"resumePolicy": "1",
"isLog": "0",
// "baseFileId": "",
// "lastWrite":"",
// "localPath": strings.ReplaceAll(param.LocalPath, "\\", "/"),
// "fileExt": "",
})
}
}, &uploadInfo)
if err != nil {
return nil, err
}
// The file does not exist in the cloud yet; start uploading
status := GetUploadFileStatusResp{CreateUploadFileResp: uploadInfo}
for status.Size < file.GetSize() && status.FileDataExists != 1 {
if utils.IsCanceled(ctx) {
return nil, ctx.Err()
}
header := map[string]string{
"ResumePolicy": "1",
"Expect": "100-continue",
}
if y.isFamily() {
header["FamilyId"] = fmt.Sprint(y.FamilyID)
header["UploadFileId"] = fmt.Sprint(status.UploadFileId)
} else {
header["Edrive-UploadFileId"] = fmt.Sprint(status.UploadFileId)
}
_, err := y.put(ctx, status.FileUploadUrl, header, true, io.NopCloser(tempFile))
if err, ok := err.(*RespErr); ok && err.Code != "InputStreamReadError" {
return nil, err
}
// Get the resume (checkpoint) status
fullUrl := API_URL + "/getUploadFileStatus.action"
if y.isFamily() {
fullUrl = API_URL + "/family/file/getFamilyFileStatus.action"
}
_, err = y.get(fullUrl, func(req *resty.Request) {
req.SetContext(ctx).SetQueryParams(map[string]string{
"uploadFileId": fmt.Sprint(status.UploadFileId),
"resumePolicy": "1",
})
if y.isFamily() {
req.SetQueryParam("familyId", fmt.Sprint(y.FamilyID))
}
}, &status)
if err != nil {
return nil, err
}
if _, err := tempFile.Seek(status.GetSize(), io.SeekStart); err != nil {
return nil, err
}
up(int(status.Size * 100 / file.GetSize())) // report progress as a percentage, matching the other upload paths
}
// Commit
var resp OldCommitUploadFileResp
_, err = y.post(status.FileCommitUrl, func(req *resty.Request) {
req.SetContext(ctx)
if y.isFamily() {
req.SetHeaders(map[string]string{
"ResumePolicy": "1",
"UploadFileId": fmt.Sprint(status.UploadFileId),
"FamilyId": fmt.Sprint(y.FamilyID),
})
} else {
req.SetFormData(map[string]string{
"opertype": "3",
"resumePolicy": "1",
"uploadFileId": fmt.Sprint(status.UploadFileId),
"isLog": "0",
})
}
}, &resp)
if err != nil {
return nil, err
}
return resp.toFile(), nil
}
func (y *Cloud189PC) isFamily() bool {
return y.Type == "family"
}
func (y *Cloud189PC) isLogin() bool {
if y.tokenInfo == nil {
return false
}
_, err := y.get(API_URL+"/getUserInfo.action", nil, nil)
return err == nil
}
// Get information for all family cloud accounts
func (y *Cloud189PC) getFamilyInfoList() ([]FamilyInfoResp, error) {
var resp FamilyInfoListResp
_, err := y.get(API_URL+"/family/manage/getFamilyList.action", nil, &resp)
if err != nil {
return nil, err
}
return resp.FamilyInfoResp, nil
}
// Pick the family cloud ID
func (y *Cloud189PC) getFamilyID() (string, error) {
infos, err := y.getFamilyInfoList()
if err != nil {
return "", err
}
if len(infos) == 0 {
return "", fmt.Errorf("cannot get automatically,please input family_id")
}
for _, info := range infos {
if strings.Contains(y.tokenInfo.LoginName, info.RemarkName) {
return fmt.Sprint(info.FamilyID), nil
}
}
return fmt.Sprint(infos[0].FamilyID), nil
}
func (y *Cloud189PC) CheckBatchTask(aType string, taskID string) (*BatchTaskStateResp, error) {
var resp BatchTaskStateResp
_, err := y.post(API_URL+"/batch/checkBatchTask.action", func(req *resty.Request) {
req.SetFormData(map[string]string{
"type": aType,
"taskId": taskID,
})
}, &resp)
if err != nil {
return nil, err
}
return &resp, nil
}
func (y *Cloud189PC) WaitBatchTask(aType string, taskID string, t time.Duration) error {
for {
state, err := y.CheckBatchTask(aType, taskID)
if err != nil {
return err
}
switch state.TaskStatus {
case 2:
return errors.New("there is a conflict with the target object")
case 4:
return nil
}
time.Sleep(t)
}
}
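// A minimal usage sketch of the batch-task helpers above (illustrative only),
// mirroring how Move/Copy/Remove in driver.go use them: create the task, then
// poll until it completes or reports a conflict. The taskID is hypothetical.
func exampleWaitMove(y *Cloud189PC, taskID string) error {
	return y.WaitBatchTask("MOVE", taskID, 400*time.Millisecond)
}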

drivers/alias/driver.go (new file, 114 lines)
@@ -0,0 +1,114 @@
package alias
import (
"context"
"errors"
"strings"
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/errs"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/pkg/utils"
)
type Alias struct {
model.Storage
Addition
pathMap map[string][]string
autoFlatten bool
oneKey string
}
func (d *Alias) Config() driver.Config {
return config
}
func (d *Alias) GetAddition() driver.Additional {
return &d.Addition
}
func (d *Alias) Init(ctx context.Context) error {
if d.Paths == "" {
return errors.New("paths is required")
}
d.pathMap = make(map[string][]string)
for _, path := range strings.Split(d.Paths, "\n") {
path = strings.TrimSpace(path)
if path == "" {
continue
}
k, v := getPair(path)
d.pathMap[k] = append(d.pathMap[k], v)
}
if len(d.pathMap) == 1 {
for k := range d.pathMap {
d.oneKey = k
}
d.autoFlatten = true
}
return nil
}
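// A small worked example of how Init above builds pathMap from the Paths
// addition, together with getPair in util.go (illustrative only; the paths are
// hypothetical). Input: Paths = "movies:/mnt/media/movies\n/opt/books".
func examplePathMap() map[string][]string {
	return map[string][]string{
		"movies": {"/mnt/media/movies"}, // explicit "name:path" pair
		"books":  {"/opt/books"},        // bare path: its base name becomes the alias
	}
}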
func (d *Alias) Drop(ctx context.Context) error {
d.pathMap = nil
return nil
}
func (d *Alias) Get(ctx context.Context, path string) (model.Obj, error) {
if utils.PathEqual(path, "/") {
return &model.Object{
Name: "Root",
IsFolder: true,
Path: "/",
}, nil
}
root, sub := d.getRootAndPath(path)
dsts, ok := d.pathMap[root]
if !ok {
return nil, errs.ObjectNotFound
}
for _, dst := range dsts {
obj, err := d.get(ctx, path, dst, sub)
if err == nil {
return obj, nil
}
}
return nil, errs.ObjectNotFound
}
func (d *Alias) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
path := dir.GetPath()
if utils.PathEqual(path, "/") && !d.autoFlatten {
return d.listRoot(), nil
}
root, sub := d.getRootAndPath(path)
dsts, ok := d.pathMap[root]
if !ok {
return nil, errs.ObjectNotFound
}
var objs []model.Obj
for _, dst := range dsts {
tmp, err := d.list(ctx, dst, sub)
if err == nil {
objs = append(objs, tmp...)
}
}
return objs, nil
}
func (d *Alias) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
root, sub := d.getRootAndPath(file.GetPath())
dsts, ok := d.pathMap[root]
if !ok {
return nil, errs.ObjectNotFound
}
for _, dst := range dsts {
link, err := d.link(ctx, dst, sub, args)
if err == nil {
return link, nil
}
}
return nil, errs.ObjectNotFound
}
var _ driver.Driver = (*Alias)(nil)

drivers/alias/meta.go (new file, 27 lines)
@@ -0,0 +1,27 @@
package alias
import (
"github.com/alist-org/alist/v3/internal/driver"
"github.com/alist-org/alist/v3/internal/op"
)
type Addition struct {
// Usually one of the two is embedded:
// driver.RootPath
// define other fields below
Paths string `json:"paths" required:"true" type:"text"`
}
var config = driver.Config{
Name: "Alias",
LocalSort: true,
NoCache: true,
NoUpload: true,
DefaultRoot: "/",
}
func init() {
op.RegisterDriver(func() driver.Driver {
return &Alias{}
})
}

drivers/alias/types.go (new file, 1 line)
@@ -0,0 +1 @@
package alias

drivers/alias/util.go (new file, 114 lines)
@@ -0,0 +1,114 @@
package alias
import (
"context"
"fmt"
stdpath "path"
"strings"
"github.com/alist-org/alist/v3/internal/fs"
"github.com/alist-org/alist/v3/internal/model"
"github.com/alist-org/alist/v3/internal/sign"
"github.com/alist-org/alist/v3/pkg/utils"
"github.com/alist-org/alist/v3/server/common"
)
func (d *Alias) listRoot() []model.Obj {
var objs []model.Obj
for k := range d.pathMap {
obj := model.Object{
Name: k,
IsFolder: true,
Modified: d.Modified,
}
objs = append(objs, &obj)
}
return objs
}
// other helpers that are not defined in the Driver interface
func getPair(path string) (string, string) {
//path = strings.TrimSpace(path)
if strings.Contains(path, ":") {
pair := strings.SplitN(path, ":", 2)
if !strings.Contains(pair[0], "/") {
return pair[0], pair[1]
}
}
return stdpath.Base(path), path
}
func (d *Alias) getRootAndPath(path string) (string, string) {
if d.autoFlatten {
return d.oneKey, path
}
path = strings.TrimPrefix(path, "/")
parts := strings.SplitN(path, "/", 2)
if len(parts) == 1 {
return parts[0], ""
}
return parts[0], parts[1]
}
func (d *Alias) get(ctx context.Context, path string, dst, sub string) (model.Obj, error) {
obj, err := fs.Get(ctx, stdpath.Join(dst, sub), &fs.GetArgs{NoLog: true})
if err != nil {
return nil, err
}
return &model.Object{
Path: path,
Name: obj.GetName(),
Size: obj.GetSize(),
Modified: obj.ModTime(),
IsFolder: obj.IsDir(),
}, nil
}
func (d *Alias) list(ctx context.Context, dst, sub string) ([]model.Obj, error) {
objs, err := fs.List(ctx, stdpath.Join(dst, sub), &fs.ListArgs{NoLog: true})
// the obj must implement the model.SetPath interface
// return objs, err
if err != nil {
return nil, err
}
return utils.SliceConvert(objs, func(obj model.Obj) (model.Obj, error) {
thumb, ok := model.GetThumb(obj)
objRes := model.Object{
Name: obj.GetName(),
Size: obj.GetSize(),
Modified: obj.ModTime(),
IsFolder: obj.IsDir(),
}
if !ok {
return &objRes, nil
}
return &model.ObjThumb{
Object: objRes,
Thumbnail: model.Thumbnail{
Thumbnail: thumb,
},
}, nil
})
}
func (d *Alias) link(ctx context.Context, dst, sub string, args model.LinkArgs) (*model.Link, error) {
reqPath := stdpath.Join(dst, sub)
storage, err := fs.GetStorage(reqPath, &fs.GetStoragesArgs{})
if err != nil {
return nil, err
}
_, err = fs.Get(ctx, reqPath, &fs.GetArgs{NoLog: true})
if err != nil {
return nil, err
}
if common.ShouldProxy(storage, stdpath.Base(sub)) {
return &model.Link{
URL: fmt.Sprintf("%s/p%s?sign=%s",
common.GetApiUrl(args.HttpReq),
utils.EncodePath(reqPath, true),
sign.Sign(reqPath)),
}, nil
}
link, _, err := fs.Link(ctx, reqPath, args)
return link, err
}

(deleted file, 243 lines)
@@ -1,243 +0,0 @@
package alidrive
import (
"errors"
"fmt"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/drivers/base"
"github.com/Xhofe/alist/model"
"github.com/Xhofe/alist/utils"
"github.com/go-resty/resty/v2"
log "github.com/sirupsen/logrus"
"path/filepath"
"strings"
"time"
)
var aliClient = resty.New()
type AliRespError struct {
Code string `json:"code"`
Message string `json:"message"`
}
type AliFiles struct {
Items []AliFile `json:"items"`
NextMarker string `json:"next_marker"`
}
type AliFile struct {
DriveId string `json:"drive_id"`
CreatedAt *time.Time `json:"created_at"`
FileExtension string `json:"file_extension"`
FileId string `json:"file_id"`
Type string `json:"type"`
Name string `json:"name"`
Category string `json:"category"`
ParentFileId string `json:"parent_file_id"`
UpdatedAt *time.Time `json:"updated_at"`
Size int64 `json:"size"`
Thumbnail string `json:"thumbnail"`
Url string `json:"url"`
}
func (driver AliDrive) FormatFile(file *AliFile) *model.File {
f := &model.File{
Id: file.FileId,
Name: file.Name,
Size: file.Size,
UpdatedAt: file.UpdatedAt,
Thumbnail: file.Thumbnail,
Driver: driver.Config().Name,
Url: file.Url,
}
if file.Type == "folder" {
f.Type = conf.FOLDER
} else {
f.Type = utils.GetFileType(file.FileExtension)
}
if file.Category == "video" {
f.Type = conf.VIDEO
}
if file.Category == "image" {
f.Type = conf.IMAGE
}
return f
}
func (driver AliDrive) GetFiles(fileId string, account *model.Account) ([]AliFile, error) {
marker := "first"
res := make([]AliFile, 0)
for marker != "" {
if marker == "first" {
marker = ""
}
var resp AliFiles
var e AliRespError
_, err := aliClient.R().
SetResult(&resp).
SetError(&e).
SetHeader("authorization", "Bearer\t"+account.AccessToken).
SetBody(base.Json{
"drive_id": account.DriveId,
"fields": "*",
"image_thumbnail_process": "image/resize,w_400/format,jpeg",
"image_url_process": "image/resize,w_1920/format,jpeg",
"limit": account.Limit,
"marker": marker,
"order_by": account.OrderBy,
"order_direction": account.OrderDirection,
"parent_file_id": fileId,
"video_thumbnail_process": "video/snapshot,t_0,f_jpg,ar_auto,w_300",
"url_expire_sec": 14400,
}).Post("https://api.aliyundrive.com/v2/file/list")
if err != nil {
return nil, err
}
if e.Code != "" {
if e.Code == "AccessTokenInvalid" {
err = driver.RefreshToken(account)
if err != nil {
return nil, err
} else {
_ = model.SaveAccount(account)
return driver.GetFiles(fileId, account)
}
}
return nil, fmt.Errorf("%s", e.Message)
}
marker = resp.NextMarker
res = append(res, resp.Items...)
}
return res, nil
}
func (driver AliDrive) GetFile(path string, account *model.Account) (*AliFile, error) {
dir, name := filepath.Split(path)
dir = utils.ParsePath(dir)
_, err := driver.Files(dir, account)
if err != nil {
return nil, err
}
parentFiles_, _ := base.GetCache(dir, account)
parentFiles, _ := parentFiles_.([]AliFile)
for _, file := range parentFiles {
if file.Name == name {
if file.Type == "file" {
return &file, err
} else {
return nil, fmt.Errorf("not file")
}
}
}
return nil, base.ErrPathNotFound
}
func (driver AliDrive) RefreshToken(account *model.Account) error {
url := "https://auth.aliyundrive.com/v2/account/token"
var resp base.TokenResp
var e AliRespError
_, err := aliClient.R().
//ForceContentType("application/json").
SetBody(base.Json{"refresh_token": account.RefreshToken, "grant_type": "refresh_token"}).
SetResult(&resp).
SetError(&e).
Post(url)
if err != nil {
account.Status = err.Error()
return err
}
log.Debugf("%+v,%+v", resp, e)
if e.Code != "" {
account.Status = e.Message
return fmt.Errorf("failed to refresh token: %s", e.Message)
} else {
account.Status = "work"
}
account.RefreshToken, account.AccessToken = resp.RefreshToken, resp.AccessToken
return nil
}
func (driver AliDrive) Rename(fileId, name string, account *model.Account) error {
var resp base.Json
var e AliRespError
_, err := aliClient.R().SetResult(&resp).SetError(&e).
SetHeader("authorization", "Bearer\t"+account.AccessToken).
SetBody(base.Json{
"check_name_mode": "refuse",
"drive_id": account.DriveId,
"file_id": fileId,
"name": name,
}).Post("https://api.aliyundrive.com/v3/file/update")
if err != nil {
return err
}
if e.Code != "" {
if e.Code == "AccessTokenInvalid" {
err = driver.RefreshToken(account)
if err != nil {
return err
} else {
_ = model.SaveAccount(account)
return driver.Rename(fileId, name, account)
}
}
return fmt.Errorf("%s", e.Message)
}
if resp["name"] == name {
return nil
}
return fmt.Errorf("%+v", resp)
}
func (driver AliDrive) Batch(srcId,dstId string, account *model.Account) error {
var e AliRespError
res, err := aliClient.R().SetError(&e).
SetHeader("authorization", "Bearer\t"+account.AccessToken).
SetBody(base.Json{
"requests": []base.Json{
{
"headers": base.Json{
"Content-Type": "application/json",
},
"method":"POST",
"id":srcId,
"body":base.Json{
"drive_id": account.DriveId,
"file_id":srcId,
"to_drive_id":account.DriveId,
"to_parent_file_id":dstId,
},
},
},
"resource": "file",
}).Post("https://api.aliyundrive.com/v3/batch")
if err != nil {
return err
}
if e.Code != "" {
if e.Code == "AccessTokenInvalid" {
err = driver.RefreshToken(account)
if err != nil {
return err
} else {
_ = model.SaveAccount(account)
return driver.Batch(srcId, dstId, account)
}
}
return fmt.Errorf("%s", e.Message)
}
if strings.Contains(res.String(), `"status":200`) {
return nil
}
return errors.New(res.String())
}
func init() {
base.RegisterDriver(&AliDrive{})
aliClient.
SetRetryCount(3).
SetHeader("user-agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36").
SetHeader("content-type", "application/json").
SetHeader("origin", "https://www.aliyundrive.com")
}

(deleted file, 480 lines)
@@ -1,480 +0,0 @@
package alidrive
import (
"bytes"
"errors"
"fmt"
"github.com/Xhofe/alist/conf"
"github.com/Xhofe/alist/drivers/base"
"github.com/Xhofe/alist/model"
"github.com/Xhofe/alist/utils"
"github.com/gin-gonic/gin"
"github.com/robfig/cron/v3"
log "github.com/sirupsen/logrus"
"io"
"math"
"net/http"
"path/filepath"
)
type AliDrive struct{}
func (driver AliDrive) Config() base.DriverConfig {
return base.DriverConfig{
Name: "AliDrive",
}
}
func (driver AliDrive) Items() []base.Item {
return []base.Item{
{
Name: "refresh_token",
Label: "refresh token",
Type: base.TypeString,
Required: true,
},
{
Name: "root_folder",
Label: "root folder file_id",
Type: base.TypeString,
Required: false,
},
{
Name: "order_by",
Label: "order_by",
Type: base.TypeSelect,
Values: "name,size,updated_at,created_at",
Required: false,
},
{
Name: "order_direction",
Label: "order_direction",
Type: base.TypeSelect,
Values: "ASC,DESC",
Required: false,
},
{
Name: "limit",
Label: "limit",
Type: base.TypeNumber,
Required: false,
Description: ">0 and <=200",
},
}
}
func (driver AliDrive) Save(account *model.Account, old *model.Account) error {
if old != nil {
conf.Cron.Remove(cron.EntryID(old.CronId))
}
if account.RootFolder == "" {
account.RootFolder = "root"
}
if account.Limit == 0 {
account.Limit = 200
}
err := driver.RefreshToken(account)
if err != nil {
return err
}
var resp base.Json
_, _ = aliClient.R().SetResult(&resp).
SetBody("{}").
SetHeader("authorization", "Bearer\t"+account.AccessToken).
Post("https://api.aliyundrive.com/v2/user/get")
log.Debugf("user info: %+v", resp)
account.DriveId = resp["default_drive_id"].(string)
cronId, err := conf.Cron.AddFunc("@every 2h", func() {
name := account.Name
log.Debugf("ali account name: %s", name)
newAccount, ok := model.GetAccount(name)
log.Debugf("ali account: %+v", newAccount)
if !ok {
return
}
err = driver.RefreshToken(&newAccount)
_ = model.SaveAccount(&newAccount)
})
if err != nil {
return err
}
account.CronId = int(cronId)
err = model.SaveAccount(account)
if err != nil {
return err
}
return nil
}
func (driver AliDrive) File(path string, account *model.Account) (*model.File, error) {
path = utils.ParsePath(path)
if path == "/" {
return &model.File{
Id: account.RootFolder,
Name: account.Name,
Size: 0,
Type: conf.FOLDER,
Driver: driver.Config().Name,
UpdatedAt: account.UpdatedAt,
}, nil
}
dir, name := filepath.Split(path)
files, err := driver.Files(dir, account)
if err != nil {
return nil, err
}
for _, file := range files {
if file.Name == name {
return &file, nil
}
}
return nil, base.ErrPathNotFound
}
func (driver AliDrive) Files(path string, account *model.Account) ([]model.File, error) {
path = utils.ParsePath(path)
var rawFiles []AliFile
cache, err := base.GetCache(path, account)
if err == nil {
rawFiles, _ = cache.([]AliFile)
} else {
file, err := driver.File(path, account)
if err != nil {
return nil, err
}
rawFiles, err = driver.GetFiles(file.Id, account)
if err != nil {
return nil, err
}
if len(rawFiles) > 0 {
_ = base.SetCache(path, rawFiles, account)
}
}
files := make([]model.File, 0)
for _, file := range rawFiles {
files = append(files, *driver.FormatFile(&file))
}
return files, nil
}
func (driver AliDrive) Link(args base.Args, account *model.Account) (*base.Link, error) {
file, err := driver.File(args.Path, account)
if err != nil {
return nil, err
}
var resp base.Json
var e AliRespError
_, err = aliClient.R().SetResult(&resp).
SetError(&e).
SetHeader("authorization", "Bearer\t"+account.AccessToken).
SetBody(base.Json{
"drive_id": account.DriveId,
"file_id": file.Id,
"expire_sec": 14400,
}).Post("https://api.aliyundrive.com/v2/file/get_download_url")
if err != nil {
return nil, err
}
if e.Code != "" {
if e.Code == "AccessTokenInvalid" {
err = driver.RefreshToken(account)
if err != nil {
return nil, err
} else {
_ = model.SaveAccount(account)
return driver.Link(args, account)
}
}
return nil, fmt.Errorf("%s", e.Message)
}
return &base.Link{
Url: resp["url"].(string),
}, nil
}
func (driver AliDrive) Path(path string, account *model.Account) (*model.File, []model.File, error) {
path = utils.ParsePath(path)
log.Debugf("ali path: %s", path)
file, err := driver.File(path, account)
if err != nil {
return nil, nil, err
}
if !file.IsDir() {
return file, nil, nil
}
files, err := driver.Files(path, account)
if err != nil {
return nil, nil, err
}
return nil, files, nil
}
func (driver AliDrive) Proxy(c *gin.Context, account *model.Account) {
c.Request.Header.Del("Origin")
c.Request.Header.Set("Referer", "https://www.aliyundrive.com/")
}
func (driver AliDrive) Preview(path string, account *model.Account) (interface{}, error) {
file, err := driver.GetFile(path, account)
if err != nil {
return nil, err
}
// office
var resp base.Json
var e AliRespError
var url string
req := base.Json{
"drive_id": account.DriveId,
"file_id": file.FileId,
}
switch file.Category {
case "doc":
{
url = "https://api.aliyundrive.com/v2/file/get_office_preview_url"
req["access_token"] = account.AccessToken
}
case "video":
{
url = "https://api.aliyundrive.com/v2/file/get_video_preview_play_info"
req["category"] = "live_transcoding"
}
default:
return nil, base.ErrNotSupport
}
_, err = aliClient.R().SetResult(&resp).SetError(&e).
SetHeader("authorization", "Bearer\t"+account.AccessToken).
SetBody(req).Post(url)
if err != nil {
return nil, err
}
if e.Code != "" {
return nil, fmt.Errorf("%s", e.Message)
}
return resp, nil
}
func (driver AliDrive) MakeDir(path string, account *model.Account) error {
dir, name := filepath.Split(path)
parentFile, err := driver.File(dir, account)
if err != nil {
return err
}
if !parentFile.IsDir() {
return base.ErrNotFolder
}
var resp base.Json
var e AliRespError
_, err = aliClient.R().SetResult(&resp).SetError(&e).
SetHeader("authorization", "Bearer\t"+account.AccessToken).
SetBody(base.Json{
"check_name_mode": "refuse",
"drive_id": account.DriveId,
"name": name,
"parent_file_id": parentFile.Id,
"type": "folder",
}).Post("https://api.aliyundrive.com/adrive/v2/file/createWithFolders")
if e.Code != "" {
if e.Code == "AccessTokenInvalid" {
err = driver.RefreshToken(account)
if err != nil {
return err
} else {
_ = model.SaveAccount(account)
return driver.MakeDir(path, account)
}
}
return fmt.Errorf("%s", e.Message)
}
if resp["file_name"] == name {
_ = base.DeleteCache(dir, account)
return nil
}
return fmt.Errorf("%+v", resp)
}
func (driver AliDrive) Move(src string, dst string, account *model.Account) error {
srcDir, _ := filepath.Split(src)
dstDir, dstName := filepath.Split(dst)
srcFile, err := driver.File(src, account)
if err != nil {
return err
}
// rename
if srcDir == dstDir {
err = driver.Rename(srcFile.Id, dstName, account)
} else {
// move
dstDirFile, err := driver.File(dstDir, account)
if err != nil {
return err
}
err = driver.Batch(srcFile.Id, dstDirFile.Id, account)
}
if err != nil {
_ = base.DeleteCache(srcDir, account)
_ = base.DeleteCache(dstDir, account)
}
return err
}
func (driver AliDrive) Copy(src string, dst string, account *model.Account) error {
	return base.ErrNotSupport
}

func (driver AliDrive) Delete(path string, account *model.Account) error {
	file, err := driver.File(path, account)
	if err != nil {
		return err
	}
	var e AliRespError
	res, err := aliClient.R().SetError(&e).
		SetHeader("authorization", "Bearer\t"+account.AccessToken).
		SetBody(base.Json{
			"drive_id": account.DriveId,
			"file_id":  file.Id,
		}).Post("https://api.aliyundrive.com/v2/recyclebin/trash")
	if err != nil {
		return err
	}
	if e.Code != "" {
		if e.Code == "AccessTokenInvalid" {
			err = driver.RefreshToken(account)
			if err != nil {
				return err
			} else {
				_ = model.SaveAccount(account)
				return driver.Delete(path, account)
			}
		}
		return fmt.Errorf("%s", e.Message)
	}
	if res.StatusCode() == 204 {
		_ = base.DeleteCache(utils.Dir(path), account)
		return nil
	}
	return errors.New(res.String())
}

type UploadResp struct {
	FileId       string `json:"file_id"`
	UploadId     string `json:"upload_id"`
	PartInfoList []struct {
		UploadUrl string `json:"upload_url"`
	} `json:"part_info_list"`
}

func (driver AliDrive) Upload(file *model.FileStream, account *model.Account) error {
	if file == nil {
		return base.ErrEmptyFile
	}
	const DEFAULT uint64 = 10485760 // 10 MiB per part
	var count = int64(math.Ceil(float64(file.GetSize()) / float64(DEFAULT)))
	var finish uint64 = 0
	parentFile, err := driver.File(file.ParentPath, account)
	if err != nil {
		return err
	}
	if !parentFile.IsDir() {
		return base.ErrNotFolder
	}
	var resp UploadResp
	var e AliRespError
	partInfoList := make([]base.Json, 0)
	var i int64
	for i = 0; i < count; i++ {
		partInfoList = append(partInfoList, base.Json{
			"part_number": i + 1,
		})
	}
	_, err = aliClient.R().SetResult(&resp).SetError(&e).
		SetHeader("authorization", "Bearer\t"+account.AccessToken).
		SetBody(base.Json{
			"check_name_mode": "auto_rename",
			// content_hash
			"content_hash_name": "none",
			"drive_id":          account.DriveId,
			"name":              file.GetFileName(),
			"parent_file_id":    parentFile.Id,
			"part_info_list":    partInfoList,
			//proof_code
			"proof_version": "v1",
			"size":          file.GetSize(),
			"type":          "file",
		}).Post("https://api.aliyundrive.com/adrive/v2/file/createWithFolders") // /v2/file/create_with_proof
	//log.Debugf("%+v\n%+v", resp, e)
	// propagate transport errors; previously only the API error body was checked
	if err != nil {
		return err
	}
	if e.Code != "" {
		if e.Code == "AccessTokenInvalid" {
			err = driver.RefreshToken(account)
			if err != nil {
				return err
			} else {
				_ = model.SaveAccount(account)
				return driver.Upload(file, account)
			}
		}
		return fmt.Errorf("%s", e.Message)
	}
	var byteSize uint64
	for i = 0; i < count; i++ {
		byteSize = file.GetSize() - finish
		if DEFAULT < byteSize {
			byteSize = DEFAULT
		}
		log.Debugf("%d,%d", byteSize, finish)
		byteData := make([]byte, byteSize)
		n, err := io.ReadFull(file, byteData)
		//n, err := file.Read(byteData)
		//byteData, err := io.ReadAll(file)
		//n := len(byteData)
		log.Debug(err, n)
		if err != nil {
			return err
		}
		finish += uint64(n)
		req, err := http.NewRequest("PUT", resp.PartInfoList[i].UploadUrl, bytes.NewBuffer(byteData))
		if err != nil {
			return err
		}
		res, err := base.HttpClient.Do(req)
		if err != nil {
			return err
		}
		log.Debugf("%+v", res)
		// release the connection; the part-upload response body is not needed
		_ = res.Body.Close()
		//res, err := base.BaseClient.R().
		//	SetHeader("Content-Type","").
		//	SetBody(byteData).Put(resp.PartInfoList[i].UploadUrl)
		//if err != nil {
		//	return err
		//}
		//log.Debugf("put to %s : %d,%s", resp.PartInfoList[i].UploadUrl, res.StatusCode(),res.String())
	}
	var resp2 base.Json
	_, err = aliClient.R().SetResult(&resp2).SetError(&e).
		SetHeader("authorization", "Bearer\t"+account.AccessToken).
		SetBody(base.Json{
			"drive_id":  account.DriveId,
			"file_id":   resp.FileId,
			"upload_id": resp.UploadId,
		}).Post("https://api.aliyundrive.com/v2/file/complete")
	if err != nil {
		return err
	}
	if e.Code != "" {
		//if e.Code == "AccessTokenInvalid" {
		//	err = driver.RefreshToken(account)
		//	if err != nil {
		//		return err
		//	} else {
		//		_ = model.SaveAccount(account)
		//		return driver.Upload(file, account)
		//	}
		//}
		return fmt.Errorf("%s", e.Message)
	}
	if resp2["file_id"] == resp.FileId {
		_ = base.DeleteCache(file.ParentPath, account)
		return nil
	}
	return fmt.Errorf("%+v", resp2)
}

var _ base.Driver = (*AliDrive)(nil)
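
The Upload method above drives the whole multipart flow: register a part_info_list with createWithFolders, PUT each part to its pre-signed upload_url, then call /v2/file/complete. As a sanity check on the arithmetic only, here is a standalone sketch of the 10 MiB part splitting; partSizes is a hypothetical helper for illustration and is not part of the driver.

	package main

	import (
		"fmt"
		"math"
	)

	const defaultPartSize uint64 = 10485760 // 10 MiB, same value as DEFAULT in Upload

	// partSizes returns the size of each part for a file of the given total size:
	// every part is 10 MiB except the last, which carries the remainder,
	// mirroring the byteSize/finish bookkeeping in Upload.
	func partSizes(total uint64) []uint64 {
		count := int64(math.Ceil(float64(total) / float64(defaultPartSize)))
		sizes := make([]uint64, 0, count)
		var finish uint64
		for i := int64(0); i < count; i++ {
			size := total - finish
			if size > defaultPartSize {
				size = defaultPartSize
			}
			sizes = append(sizes, size)
			finish += size
		}
		return sizes
	}

	func main() {
		// A 25 MiB file becomes three parts: 10 MiB, 10 MiB, 5 MiB.
		fmt.Println(partSizes(25 * 1024 * 1024))
	}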


@@ -1,40 +0,0 @@
package alist

import (
	"errors"

	"github.com/Xhofe/alist/drivers/base"
	"github.com/Xhofe/alist/model"
)

type BaseResp struct {
	Code    int    `json:"code"`
	Message string `json:"message"`
}

type PathResp struct {
	BaseResp
	Data []model.File `json:"data"`
}

type PreviewResp struct {
	BaseResp
	Data interface{} `json:"data"`
}

func (driver *Alist) Login(account *model.Account) error {
	var resp BaseResp
	_, err := base.RestyClient.R().SetResult(&resp).
		SetHeader("Authorization", account.AccessToken).
		Get(account.SiteUrl + "/api/admin/login")
	if err != nil {
		return err
	}
	if resp.Code != 200 {
		return errors.New(resp.Message)
	}
	return nil
}

func init() {
	base.RegisterDriver(&Alist{})
}


@@ -1,186 +0,0 @@
package alist

import (
	"errors"
	"fmt"
	"path/filepath"
	"strings"
	"time"

	"github.com/Xhofe/alist/conf"
	"github.com/Xhofe/alist/drivers/base"
	"github.com/Xhofe/alist/model"
	"github.com/Xhofe/alist/utils"
	"github.com/gin-gonic/gin"
)

type Alist struct{}

func (driver Alist) Config() base.DriverConfig {
	return base.DriverConfig{
		Name:          "Alist",
		NoNeedSetLink: true,
		NoCors:        true,
	}
}

func (driver Alist) Items() []base.Item {
	return []base.Item{
		{
			Name:     "site_url",
			Label:    "alist site url",
			Type:     base.TypeString,
			Required: true,
		},
		{
			Name:        "access_token",
			Label:       "token",
			Type:        base.TypeString,
			Description: "admin token",
			Required:    true,
		},
		{
			Name:     "root_folder",
			Label:    "root folder path",
			Type:     base.TypeString,
			Required: false,
		},
	}
}

func (driver Alist) Save(account *model.Account, old *model.Account) error {
	account.SiteUrl = strings.TrimRight(account.SiteUrl, "/")
	if account.RootFolder == "" {
		account.RootFolder = "/"
	}
	err := driver.Login(account)
	if err == nil {
		account.Status = "work"
	} else {
		account.Status = err.Error()
	}
	_ = model.SaveAccount(account)
	return err
}

func (driver Alist) File(path string, account *model.Account) (*model.File, error) {
	now := time.Now()
	if path == "/" {
		return &model.File{
			Id:        "root",
			Name:      "root",
			Size:      0,
			Type:      conf.FOLDER,
			Driver:    driver.Config().Name,
			UpdatedAt: &now,
		}, nil
	}
	_, files, err := driver.Path(utils.Dir(path), account)
	if err != nil {
		return nil, err
	}
	if files == nil {
		return nil, base.ErrPathNotFound
	}
	name := utils.Base(path)
	for _, file := range files {
		if file.Name == name {
			return &file, nil
		}
	}
	return nil, base.ErrPathNotFound
}

func (driver Alist) Files(path string, account *model.Account) ([]model.File, error) {
	//return nil, base.ErrNotImplement
	_, files, err := driver.Path(path, account)
	if err != nil {
		return nil, err
	}
	if files == nil {
		return nil, base.ErrNotFolder
	}
	return files, nil
}

func (driver Alist) Link(args base.Args, account *model.Account) (*base.Link, error) {
	path := args.Path
	path = utils.ParsePath(path)
	name := utils.Base(path)
	flag := "d"
	if utils.GetFileType(filepath.Ext(path)) == conf.TEXT {
		flag = "p"
	}
	link := base.Link{}
	link.Url = fmt.Sprintf("%s/%s%s?sign=%s", account.SiteUrl, flag, path, utils.SignWithToken(name, conf.Token))
	return &link, nil
}

func (driver Alist) Path(path string, account *model.Account) (*model.File, []model.File, error) {
	path = utils.ParsePath(path)
	path = filepath.Join(account.RootFolder, path)
	path = strings.ReplaceAll(path, "\\", "/")
	cache, err := base.GetCache(path, account)
	if err == nil {
		files := cache.([]model.File)
		return nil, files, nil
	}
	var resp PathResp
	_, err = base.RestyClient.R().SetResult(&resp).
		SetHeader("Authorization", account.AccessToken).
		SetBody(base.Json{
			"path": path,
		}).Post(account.SiteUrl + "/api/public/path")
	if err != nil {
		return nil, nil, err
	}
	if resp.Code != 200 {
		return nil, nil, errors.New(resp.Message)
	}
	if resp.Message == "file" {
		return &resp.Data[0], nil, nil
	}
	if len(resp.Data) > 0 {
		_ = base.SetCache(path, resp.Data, account)
	}
	return nil, resp.Data, nil
}

func (driver Alist) Proxy(c *gin.Context, account *model.Account) {}

func (driver Alist) Preview(path string, account *model.Account) (interface{}, error) {
	var resp PathResp
	_, err := base.RestyClient.R().SetResult(&resp).
		SetHeader("Authorization", account.AccessToken).
		SetBody(base.Json{
			"path": path,
		}).Post(account.SiteUrl + "/api/public/preview")
	if err != nil {
		return nil, err
	}
	if resp.Code != 200 {
		return nil, errors.New(resp.Message)
	}
	return resp.Data, nil
}

func (driver Alist) MakeDir(path string, account *model.Account) error {
	return base.ErrNotImplement
}

func (driver Alist) Move(src string, dst string, account *model.Account) error {
	return base.ErrNotImplement
}

func (driver Alist) Copy(src string, dst string, account *model.Account) error {
	return base.ErrNotImplement
}

func (driver Alist) Delete(path string, account *model.Account) error {
	return base.ErrNotImplement
}

func (driver Alist) Upload(file *model.FileStream, account *model.Account) error {
	return base.ErrNotImplement
}

var _ base.Driver = (*Alist)(nil)
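
The Link method of this removed driver builds a signed direct URL against the remote alist v1 site with fmt.Sprintf("%s/%s%s?sign=%s", ...), choosing the "d" (download) or "p" (text preview) route. A minimal, illustrative sketch of that URL assembly follows; signWithToken is a hypothetical HMAC stand-in, since the real utils.SignWithToken implementation is not shown in this diff, and the site URL and token are made up.

	package main

	import (
		"crypto/hmac"
		"crypto/sha256"
		"encoding/base64"
		"fmt"
		"path"
	)

	// signWithToken is a hypothetical stand-in for utils.SignWithToken:
	// an HMAC-SHA256 of the file name, keyed by the site token.
	func signWithToken(name, token string) string {
		mac := hmac.New(sha256.New, []byte(token))
		mac.Write([]byte(name))
		return base64.URLEncoding.EncodeToString(mac.Sum(nil))
	}

	func main() {
		siteUrl := "https://demo.example.com" // hypothetical remote alist v1 site
		filePath := "/docs/readme.md"
		flag := "d" // Link switches this to "p" for text files
		sign := signWithToken(path.Base(filePath), "admin-token")
		// Same shape as the URL Link produces: <site>/<flag><path>?sign=<sign>
		fmt.Printf("%s/%s%s?sign=%s\n", siteUrl, flag, filePath, sign)
	}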

drivers/alist_v2/driver.go Normal file

@@ -0,0 +1,118 @@
package alist_v2

import (
	"context"

	"github.com/alist-org/alist/v3/drivers/base"
	"github.com/alist-org/alist/v3/internal/driver"
	"github.com/alist-org/alist/v3/internal/errs"
	"github.com/alist-org/alist/v3/internal/model"
	"github.com/alist-org/alist/v3/server/common"
)

type AListV2 struct {
	model.Storage
	Addition
}

func (d *AListV2) Config() driver.Config {
	return config
}

func (d *AListV2) GetAddition() driver.Additional {
	return &d.Addition
}

func (d *AListV2) Init(ctx context.Context) error {
	if len(d.Addition.Address) > 0 && string(d.Addition.Address[len(d.Addition.Address)-1]) == "/" {
		d.Addition.Address = d.Addition.Address[0 : len(d.Addition.Address)-1]
	}
	// TODO login / refresh token
	//op.MustSaveDriverStorage(d)
	return nil
}

func (d *AListV2) Drop(ctx context.Context) error {
	return nil
}

func (d *AListV2) List(ctx context.Context, dir model.Obj, args model.ListArgs) ([]model.Obj, error) {
	url := d.Address + "/api/public/path"
	var resp common.Resp[PathResp]
	_, err := base.RestyClient.R().
		SetResult(&resp).
		SetHeader("Authorization", d.AccessToken).
		SetBody(PathReq{
			PageNum:  0,
			PageSize: 0,
			Path:     dir.GetPath(),
			Password: d.Password,
		}).Post(url)
	if err != nil {
		return nil, err
	}
	var files []model.Obj
	for _, f := range resp.Data.Files {
		file := model.ObjThumb{
			Object: model.Object{
				Name:     f.Name,
				Modified: *f.UpdatedAt,
				Size:     f.Size,
				IsFolder: f.Type == 1,
			},
			Thumbnail: model.Thumbnail{Thumbnail: f.Thumbnail},
		}
		files = append(files, &file)
	}
	return files, nil
}

func (d *AListV2) Link(ctx context.Context, file model.Obj, args model.LinkArgs) (*model.Link, error) {
	url := d.Address + "/api/public/path"
	var resp common.Resp[PathResp]
	_, err := base.RestyClient.R().
		SetResult(&resp).
		SetHeader("Authorization", d.AccessToken).
		SetBody(PathReq{
			PageNum:  0,
			PageSize: 0,
			Path:     file.GetPath(),
			Password: d.Password,
		}).Post(url)
	if err != nil {
		return nil, err
	}
	return &model.Link{
		URL: resp.Data.Files[0].Url,
	}, nil
}

func (d *AListV2) MakeDir(ctx context.Context, parentDir model.Obj, dirName string) error {
	return errs.NotImplement
}

func (d *AListV2) Move(ctx context.Context, srcObj, dstDir model.Obj) error {
	return errs.NotImplement
}

func (d *AListV2) Rename(ctx context.Context, srcObj model.Obj, newName string) error {
	return errs.NotImplement
}

func (d *AListV2) Copy(ctx context.Context, srcObj, dstDir model.Obj) error {
	return errs.NotImplement
}

func (d *AListV2) Remove(ctx context.Context, obj model.Obj) error {
	return errs.NotImplement
}

func (d *AListV2) Put(ctx context.Context, dstDir model.Obj, stream model.FileStreamer, up driver.UpdateProgress) error {
	return errs.NotImplement
}

//func (d *AList) Other(ctx context.Context, args model.OtherArgs) (interface{}, error) {
//	return nil, errs.NotSupport
//}

var _ driver.Driver = (*AListV2)(nil)

drivers/alist_v2/meta.go Normal file

@@ -0,0 +1,26 @@
package alist_v2

import (
	"github.com/alist-org/alist/v3/internal/driver"
	"github.com/alist-org/alist/v3/internal/op"
)

type Addition struct {
	driver.RootPath
	Address     string `json:"url" required:"true"`
	Password    string `json:"password"`
	AccessToken string `json:"access_token"`
}

var config = driver.Config{
	Name:        "AList V2",
	LocalSort:   true,
	NoUpload:    true,
	DefaultRoot: "/",
}

func init() {
	op.RegisterDriver(func() driver.Driver {
		return &AListV2{}
	})
}

drivers/alist_v2/types.go Normal file

@@ -0,0 +1,31 @@
package alist_v2

import (
	"time"
)

type File struct {
	Id        string     `json:"-"`
	Name      string     `json:"name"`
	Size      int64      `json:"size"`
	Type      int        `json:"type"`
	Driver    string     `json:"driver"`
	UpdatedAt *time.Time `json:"updated_at"`
	Thumbnail string     `json:"thumbnail"`
	Url       string     `json:"url"`
	SizeStr   string     `json:"size_str"`
	TimeStr   string     `json:"time_str"`
}

type PathResp struct {
	Type string `json:"type"`
	//Meta Meta `json:"meta"`
	Files []File `json:"files"`
}

type PathReq struct {
	PageNum  int    `json:"page_num"`
	PageSize int    `json:"page_size"`
	Password string `json:"password"`
	Path     string `json:"path"`
}
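
PathReq and PathResp above define the wire format the AList V2 driver exchanges with the remote site's /api/public/path endpoint. The sketch below, inferred from the json struct tags only, shows the request body List/Link would send and decodes a fabricated sample response; the sample values are made up for demonstration and the types are re-declared locally so the snippet runs standalone.

	package main

	import (
		"encoding/json"
		"fmt"
	)

	// Local copies of the driver's request/response shapes (trimmed to a few fields).
	type PathReq struct {
		PageNum  int    `json:"page_num"`
		PageSize int    `json:"page_size"`
		Password string `json:"password"`
		Path     string `json:"path"`
	}

	type PathResp struct {
		Type  string `json:"type"`
		Files []struct {
			Name string `json:"name"`
			Size int64  `json:"size"`
			Type int    `json:"type"`
			Url  string `json:"url"`
		} `json:"files"`
	}

	func main() {
		// Request body sent by List/Link:
		// {"page_num":0,"page_size":0,"password":"","path":"/movies"}
		body, _ := json.Marshal(PathReq{Path: "/movies"})
		fmt.Println(string(body))

		// A fabricated response in the shape the driver decodes (wrapped in common.Resp in the real code).
		sample := `{"type":"folder","files":[{"name":"a.mkv","size":1024,"type":0,"url":""}]}`
		var resp PathResp
		if err := json.Unmarshal([]byte(sample), &resp); err != nil {
			panic(err)
		}
		fmt.Printf("%+v\n", resp)
	}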

Some files were not shown because too many files have changed in this diff.