v0.3.1 release (#245)

parent 21c9e321b3
commit 15e8b4602f
@@ -21,7 +21,7 @@ jobs:
       - uses: actions/checkout@v4
       - uses: actions/setup-go@v5
         with:
-          go-version: '1.22.5'
+          go-version: '1.23.3'
       - uses: golangci/golangci-lint-action@v5
         with:
          version: v1.60
@@ -41,4 +41,11 @@ jobs:
       - uses: actions/checkout@v4
       - uses: actions/setup-node@v4
       - working-directory: frontend
-        run: npm i eslint && npm run lint
+        run: npm i && npm run lint
+  test-frontend:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-node@v4
+      - working-directory: frontend
+        run: npm i && npm run test
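Note: the lint job now installs from `package.json` via `npm i` instead of pulling in an ad-hoc `npm i eslint`, so the linter version is pinned by the project manifest. In CI, `npm ci` is the stricter variant: it installs exactly what the lockfile records and fails on drift. A possible follow-up sketch (the `node-version` pin is hypothetical; the workflow above does not set one):

```yaml
      - uses: actions/setup-node@v4
        with:
          node-version: 20  # hypothetical pin, not part of this diff
          cache: npm
          cache-dependency-path: frontend/package-lock.json
      - working-directory: frontend
        run: npm ci && npm run lint
```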
CHANGELOG.md (+18 lines)

@@ -2,6 +2,24 @@

 All notable changes to this project will be documented in this file. For commit guidelines, please refer to [Standard Version](https://github.com/conventional-changelog/standard-version).

+## v0.3.1
+
+**New Features**
+- Adds Smart Indexing by default.
+
+**Notes**:
+- Optimized API request response times via improved caching and simplified actions.
+- User information persists more reliably.
+- Added an [indexing doc](./docs/indexing.md) to explain the expectations around indexing and how it works.
+- The index should also use less RAM than it did in v0.3.0.
+
+**Bugfixes**:
+- Tweaked sorting by name; fixes case-sensitive and numeric sorting. https://github.com/gtsteffaniak/filebrowser/issues/230
+- Fixed unnecessary authentication status checks on each route change.
+- Fixed a create-file action issue.
+- Fixed some small JavaScript-related issues.
+- Fixed a fairly significant bug viewing raw content in v0.3.0 (UTF format message).
+
 ## v0.3.0

 This Release focuses on the API and making it more accessible for developers to access functions without the UI.
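Note on "Smart Indexing": the implementation lands later in this commit as a tiered scan schedule. Intervals stretch from 5 minutes toward 4 hours while the file tree stays quiet, and snap back toward shorter intervals when a scan detects changes. A minimal sketch of that back-off idea, simplified from the `newScanner` loop added below (not a drop-in copy of it):

```go
package main

import (
	"fmt"
	"time"
)

// Tiered intervals, mirroring the scanSchedule added later in this commit.
var scanSchedule = []time.Duration{
	5 * time.Minute,
	10 * time.Minute,
	20 * time.Minute, // anchor tier: busy trees reset back to here
	40 * time.Minute,
	1 * time.Hour,
	2 * time.Hour,
	3 * time.Hour,
	4 * time.Hour,
}

// nextTier moves down the schedule when changes are seen, up when quiet.
func nextTier(current, anchor int, changed bool) int {
	if changed {
		if current > anchor {
			return anchor // big jump back once changes reappear
		}
		if current > 0 {
			return current - 1 // keep tightening while changes continue
		}
		return current
	}
	if current < len(scanSchedule)-1 {
		return current + 1 // quiet: relax toward the longest interval
	}
	return current
}

func main() {
	tier := 0
	for _, changed := range []bool{false, false, false, true, false} {
		tier = nextTier(tier, 2, changed)
		fmt.Println("next scan in", scanSchedule[tier])
	}
}
```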
README.md (+23 lines)

@@ -10,19 +10,25 @@
 </p>

 > [!WARNING]
-> Starting with `v0.3.0` API routes have been slightly altered for friendly usage outside of the UI.
+> Starting with `v0.3.0` API routes have been slightly altered for friendly usage outside of the UI. The resources API now returns items in separate `files` and `folders` objects.
+
+> [!WARNING]
 > If on windows, please use docker. The windows binary is unstable and may not work.

+> [!WARNING]
+> There is no stable version yet. Always check release notes for bugfixes on functionality that may have changed. If you notice any unexpected behavior, please open an issue so it can be fixed soon.
+
 FileBrowser Quantum is a fork of the file browser opensource project with the following changes:

-1. [x] Efficiently indexed files
+1. [x] Indexes files efficiently. See the [indexing readme](./docs/indexing.md)
    - Real-time search results as you type
-   - Search Works with more type filters
-   - Enhanced interactive results page.
-   - file/folder sizes are shown in the response
+   - Search supports file/folder sizes and many file type filters.
+   - Enhanced interactive results that show file/folder sizes.
 1. [x] Revamped and simplified GUI navbar and sidebar menu.
    - Additional compact view mode as well as refreshed view mode styles.
+   - Many graphical and user experience improvements.
+   - Right-click context menu
 1. [x] Revamped and simplified configuration via `filebrowser.yml` config file.
 1. [x] Better listing browsing
    - Switching view modes is instant
@@ -33,6 +39,13 @@ FileBrowser Quantum is a fork of the file browser opensource project with the following changes:
    - Can create long-lived API Tokens.
    - Helpful Swagger page available at the `/swagger` endpoint.

+Notable features that this fork *does not* have (removed):
+
+- jobs/runners are not supported yet (planned).
+- shell commands are completely removed and will not be returning.
+- themes and branding are not fully supported yet (planned).
+- see the feature matrix below for more.
+
 ## About

 FileBrowser Quantum provides a file-managing interface within a specified directory
@@ -114,10 +114,6 @@ func StartFilebrowser() {
 		}
 	}
 	store, dbExists := getStore(configPath)
-	indexingInterval := fmt.Sprint(settings.Config.Server.IndexingInterval, " minutes")
-	if !settings.Config.Server.Indexing {
-		indexingInterval = "disabled"
-	}
 	database := fmt.Sprintf("Using existing database : %v", settings.Config.Server.Database)
 	if !dbExists {
 		database = fmt.Sprintf("Creating new database : %v", settings.Config.Server.Database)
@@ -127,14 +123,13 @@ func StartFilebrowser() {
 	log.Println("Embeded frontend :", os.Getenv("FILEBROWSER_NO_EMBEDED") != "true")
 	log.Println(database)
 	log.Println("Sources :", settings.Config.Server.Root)
-	log.Println("Indexing interval :", indexingInterval)

 	serverConfig := settings.Config.Server
 	swagInfo := docs.SwaggerInfo
 	swagInfo.BasePath = serverConfig.BaseURL
 	swag.Register(docs.SwaggerInfo.InstanceName(), swagInfo)
 	// initialize indexing and schedule indexing every n minutes (default 5)
-	go files.InitializeIndex(serverConfig.IndexingInterval, serverConfig.Indexing)
+	go files.InitializeIndex(serverConfig.Indexing)
 	if err := rootCMD(store, &serverConfig); err != nil {
 		log.Fatal("Error starting filebrowser:", err)
 	}
@@ -14,24 +14,72 @@ var AllFiletypeOptions = []string{
 	"archive",
 	"video",
 	"doc",
-	"dir",
 	"text",
 }

+// Document file extensions
 var documentTypes = []string{
-	".word",
-	".pdf",
-	".doc",
-	".docx",
-}
-var textTypes = []string{
-	".text",
-	".sh",
-	".yaml",
-	".yml",
-	".json",
-	".env",
+	// Common Document Formats
+	".doc", ".docx", // Microsoft Word
+	".pdf",  // Portable Document Format
+	".odt",  // OpenDocument Text
+	".rtf",  // Rich Text Format
+
+	// Presentation Formats
+	".ppt", ".pptx", // Microsoft PowerPoint
+	".odp",  // OpenDocument Presentation
+
+	// Spreadsheet Formats
+	".xls", ".xlsx", // Microsoft Excel
+	".ods",  // OpenDocument Spreadsheet
+
+	// Other Document Formats
+	".epub", // Electronic Publication
+	".mobi", // Amazon Kindle
+	".fb2",  // FictionBook
+}
+
+// Text-based file extensions
+var textTypes = []string{
+	// Common Text Formats
+	".txt",
+	".md", // Markdown
+
+	// Scripting and Programming Languages
+	".sh",    // Bash script
+	".py",    // Python
+	".js",    // JavaScript
+	".ts",    // TypeScript
+	".php",   // PHP
+	".rb",    // Ruby
+	".go",    // Go
+	".java",  // Java
+	".c", ".cpp", // C/C++
+	".cs",    // C#
+	".swift", // Swift
+
+	// Configuration Files
+	".yaml", ".yml", // YAML
+	".json", // JSON
+	".xml",  // XML
+	".ini",  // INI
+	".toml", // TOML
+	".cfg",  // Configuration file
+
+	// Other Text-Based Formats
+	".css",  // Cascading Style Sheets
+	".html", ".htm", // HyperText Markup Language
+	".sql",  // SQL
+	".csv",  // Comma-Separated Values
+	".tsv",  // Tab-Separated Values
+	".log",  // Log file
+	".bat",  // Batch file
+	".ps1",  // PowerShell script
+	".tex",  // LaTeX
+	".bib",  // BibTeX
 }

+// Compressed file extensions
 var compressedFile = []string{
 	".7z",
 	".rar",
@@ -39,6 +87,12 @@ var compressedFile = []string{
 	".tar",
 	".gz",
 	".xz",
+	".bz2",
+	".tgz",  // tar.gz
+	".tbz2", // tar.bz2
+	".lzma",
+	".lz4",
+	".zstd",
 }

 type SearchOptions struct {
@@ -48,8 +102,8 @@ type SearchOptions struct {
 	Terms      []string
 }

-func ParseSearch(value string) *SearchOptions {
-	opts := &SearchOptions{
+func ParseSearch(value string) SearchOptions {
+	opts := SearchOptions{
 		Conditions: map[string]bool{
 			"exact": strings.Contains(value, "case:exact"),
 		},
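Note: the extension lists above are flat slices, so any membership test over them is O(n) per lookup. If lookup volume ever matters, the idiomatic constant-time alternative is a set-style map built once at startup. A sketch (the `isTextFile` helper is hypothetical, not part of this diff):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// Abbreviated from the textTypes list above.
var textTypes = []string{".txt", ".md", ".sh", ".py", ".go", ".yaml", ".json"}

// textExt is built once so each per-file lookup is a single map probe.
var textExt = func() map[string]struct{} {
	m := make(map[string]struct{}, len(textTypes))
	for _, ext := range textTypes {
		m[ext] = struct{}{}
	}
	return m
}()

func isTextFile(name string) bool {
	_, ok := textExt[strings.ToLower(filepath.Ext(name))]
	return ok
}

func main() {
	fmt.Println(isTextFile("README.md")) // true
	fmt.Println(isTextFile("movie.mkv")) // false
}
```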
@@ -13,6 +13,8 @@ import (
 	"net/http"
 	"os"
 	"path/filepath"
+	"sort"
+	"strconv"
 	"strings"
 	"sync"
 	"time"
@@ -22,6 +24,7 @@ import (
 	"github.com/gtsteffaniak/filebrowser/fileutils"
 	"github.com/gtsteffaniak/filebrowser/settings"
 	"github.com/gtsteffaniak/filebrowser/users"
+	"github.com/gtsteffaniak/filebrowser/utils"
 )

 var (
@@ -29,34 +32,30 @@ var (
 	pathMutexesMu sync.Mutex // Mutex to protect the pathMutexes map
 )

-type ReducedItem struct {
+type ItemInfo struct {
 	Name    string      `json:"name"`
 	Size    int64       `json:"size"`
 	ModTime time.Time   `json:"modified"`
 	Type    string      `json:"type"`
-	Mode    os.FileMode `json:"-"`
-	Content string      `json:"content,omitempty"`
 }

 // FileInfo describes a file.
 // reduced item is non-recursive reduced "Items", used to pass flat items array
 type FileInfo struct {
-	Files     []ReducedItem        `json:"-"`
-	Dirs      map[string]*FileInfo `json:"-"`
-	Path      string               `json:"path"`
-	Name      string               `json:"name"`
-	Items     []ReducedItem        `json:"items"`
-	Size      int64                `json:"size"`
-	Extension string               `json:"-"`
-	ModTime   time.Time            `json:"modified"`
-	CacheTime time.Time            `json:"-"`
-	Mode      os.FileMode          `json:"-"`
-	IsSymlink bool                 `json:"isSymlink,omitempty"`
-	Type      string               `json:"type"`
-	Subtitles []string             `json:"subtitles,omitempty"`
-	Content   string               `json:"content,omitempty"`
-	Checksums map[string]string    `json:"checksums,omitempty"`
-	Token     string               `json:"token,omitempty"`
+	ItemInfo
+	Files   []ItemInfo `json:"files"`
+	Folders []ItemInfo `json:"folders"`
+	Path    string     `json:"path"`
+}
+
+// for efficiency, a response will be a pointer to the data
+// extra calculated fields can be added here
+type ExtendedFileInfo struct {
+	*FileInfo
+	Content   string            `json:"content,omitempty"`
+	Subtitles []string          `json:"subtitles,omitempty"`
+	Checksums map[string]string `json:"checksums,omitempty"`
+	Token     string            `json:"token,omitempty"`
 }

 // FileOptions are the options when getting a file info.
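Note: the restructuring above leans on Go struct embedding. `FileInfo` embeds `ItemInfo`, and `ExtendedFileInfo` embeds `*FileInfo`, so `encoding/json` promotes the embedded fields to the top level of the marshaled object rather than nesting them; that is what produces the flat `files`/`folders` response shape the README warning describes. A standalone check of the flattening behavior (field set abbreviated):

```go
package main

import (
	"encoding/json"
	"fmt"
)

type ItemInfo struct {
	Name string `json:"name"`
	Size int64  `json:"size"`
}

type FileInfo struct {
	ItemInfo            // embedded: name and size surface at the top level
	Files    []ItemInfo `json:"files"`
	Path     string     `json:"path"`
}

func main() {
	fi := FileInfo{
		ItemInfo: ItemInfo{Name: "docs", Size: 42},
		Files:    []ItemInfo{{Name: "indexing.md", Size: 7}},
		Path:     "/docs",
	}
	out, _ := json.Marshal(fi)
	fmt.Println(string(out))
	// {"name":"docs","size":42,"files":[{"name":"indexing.md","size":7}],"path":"/docs"}
}
```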
@@ -66,7 +65,6 @@ type FileOptions struct {
 	Modify     bool
 	Expand     bool
 	ReadHeader bool
-	Token      string
 	Checker    users.Checker
 	Content    bool
 }
@@ -75,206 +73,70 @@ func (f FileOptions) Components() (string, string) {
 	return filepath.Dir(f.Path), filepath.Base(f.Path)
 }

-func FileInfoFaster(opts FileOptions) (*FileInfo, error) {
+func FileInfoFaster(opts FileOptions) (ExtendedFileInfo, error) {
 	index := GetIndex(rootPath)
 	opts.Path = index.makeIndexPath(opts.Path)
+	response := ExtendedFileInfo{}
 	// Lock access for the specific path
 	pathMutex := getMutex(opts.Path)
 	pathMutex.Lock()
 	defer pathMutex.Unlock()
 	if !opts.Checker.Check(opts.Path) {
-		return nil, os.ErrPermission
+		return response, os.ErrPermission
 	}

 	_, isDir, err := GetRealPath(opts.Path)
 	if err != nil {
-		return nil, err
+		return response, err
 	}
 	opts.IsDir = isDir
+	// TODO : whats the best way to save trips to disk here?
+	// disabled using cache because its not clear if this is helping or hurting
 	// check if the file exists in the index
-	info, exists := index.GetReducedMetadata(opts.Path, opts.IsDir)
-	if exists {
-		// Let's not refresh if less than a second has passed
-		if time.Since(info.CacheTime) > time.Second {
-			RefreshFileInfo(opts) //nolint:errcheck
-		}
-		if opts.Content {
-			content := ""
-			content, err = getContent(opts.Path)
-			if err != nil {
-				return info, err
-			}
-			info.Content = content
-		}
-		return info, nil
-	}
-	err = RefreshFileInfo(opts)
+	//info, exists := index.GetReducedMetadata(opts.Path, opts.IsDir)
+	//if exists {
+	//	err := RefreshFileInfo(opts)
+	//	if err != nil {
+	//		return info, err
+	//	}
+	//	if opts.Content {
+	//		content := ""
+	//		content, err = getContent(opts.Path)
+	//		if err != nil {
+	//			return info, err
+	//		}
+	//		info.Content = content
+	//	}
+	//	return info, nil
+	//}
+	err = index.RefreshFileInfo(opts)
 	if err != nil {
-		return nil, err
+		return response, err
 	}
-	info, exists = index.GetReducedMetadata(opts.Path, opts.IsDir)
+	info, exists := index.GetReducedMetadata(opts.Path, opts.IsDir)
 	if !exists {
-		return nil, err
+		return response, err
 	}
 	if opts.Content {
 		content, err := getContent(opts.Path)
 		if err != nil {
-			return info, err
+			return response, err
 		}
-		info.Content = content
+		response.Content = content
 	}
-	return info, nil
-}
+	response.FileInfo = info
+	return response, nil

-func RefreshFileInfo(opts FileOptions) error {
-	refreshOptions := FileOptions{
-		Path:  opts.Path,
-		IsDir: opts.IsDir,
-		Token: opts.Token,
-	}
-	index := GetIndex(rootPath)
-
-	if !refreshOptions.IsDir {
-		refreshOptions.Path = index.makeIndexPath(filepath.Dir(refreshOptions.Path))
-		refreshOptions.IsDir = true
-	} else {
-		refreshOptions.Path = index.makeIndexPath(refreshOptions.Path)
-	}
-
-	current, exists := index.GetMetadataInfo(refreshOptions.Path, true)
-
-	file, err := stat(refreshOptions)
-	if err != nil {
-		return fmt.Errorf("file/folder does not exist to refresh data: %s", refreshOptions.Path)
-	}
-
-	//utils.PrintStructFields(*file)
-	result := index.UpdateMetadata(file)
-	if !result {
-		return fmt.Errorf("file/folder does not exist in metadata: %s", refreshOptions.Path)
-	}
-	if !exists {
-		return nil
-	}
-	if current.Size != file.Size {
-		index.recursiveUpdateDirSizes(filepath.Dir(refreshOptions.Path), file, current.Size)
-	}
-	return nil
-}
-
-func stat(opts FileOptions) (*FileInfo, error) {
-	realPath, _, err := GetRealPath(rootPath, opts.Path)
-	if err != nil {
-		return nil, err
-	}
-	info, err := os.Lstat(realPath)
-	if err != nil {
-		return nil, err
-	}
-	file := &FileInfo{
-		Path:      opts.Path,
-		Name:      filepath.Base(opts.Path),
-		ModTime:   info.ModTime(),
-		Mode:      info.Mode(),
-		Size:      info.Size(),
-		Extension: filepath.Ext(info.Name()),
-		Token:     opts.Token,
-	}
-	if info.IsDir() {
-		// Open and read directory contents
-		dir, err := os.Open(realPath)
-		if err != nil {
-			return nil, err
-		}
-		defer dir.Close()
-
-		dirInfo, err := dir.Stat()
-		if err != nil {
-			return nil, err
-		}
-		index := GetIndex(rootPath)
-		// Check cached metadata to decide if refresh is needed
-		cachedParentDir, exists := index.GetMetadataInfo(opts.Path, true)
-		if exists && dirInfo.ModTime().Before(cachedParentDir.CacheTime) {
-			return cachedParentDir, nil
-		}
-
-		// Read directory contents and process
-		files, err := dir.Readdir(-1)
-		if err != nil {
-			return nil, err
-		}
-
-		file.Files = []ReducedItem{}
-		file.Dirs = map[string]*FileInfo{}
-
-		var totalSize int64
-		for _, item := range files {
-			itemPath := filepath.Join(realPath, item.Name())
-
-			if item.IsDir() {
-				itemInfo := &FileInfo{
-					Name:    item.Name(),
-					ModTime: item.ModTime(),
-					Mode:    item.Mode(),
-				}
-
-				if exists {
-					// if directory size was already cached use that.
-					cachedDir, ok := cachedParentDir.Dirs[item.Name()]
-					if ok {
-						itemInfo.Size = cachedDir.Size
-					}
-				}
-				file.Dirs[item.Name()] = itemInfo
-				totalSize += itemInfo.Size
-			} else {
-				itemInfo := ReducedItem{
-					Name:    item.Name(),
-					Size:    item.Size(),
-					ModTime: item.ModTime(),
-					Mode:    item.Mode(),
-				}
-				if IsSymlink(item.Mode()) {
-					itemInfo.Type = "symlink"
-					info, err := os.Stat(itemPath)
-					if err == nil {
-						itemInfo.Name = info.Name()
-						itemInfo.ModTime = info.ModTime()
-						itemInfo.Size = info.Size()
-						itemInfo.Mode = info.Mode()
-					} else {
-						file.Type = "invalid_link"
-					}
-				}
-				if file.Type != "invalid_link" {
-					err := itemInfo.detectType(itemPath, true, opts.Content, opts.ReadHeader)
-					if err != nil {
-						fmt.Printf("failed to detect type for %v: %v \n", itemPath, err)
-					}
-					file.Files = append(file.Files, itemInfo)
-				}
-				totalSize += itemInfo.Size
-			}
-		}
-
-		file.Size = totalSize
-	}
-	return file, nil
 }

 // Checksum checksums a given File for a given User, using a specific
 // algorithm. The checksums data is saved on File object.
-func (i *FileInfo) Checksum(algo string) error {
-	if i.Checksums == nil {
-		i.Checksums = map[string]string{}
-	}
-	fullpath := filepath.Join(i.Path, i.Name)
-	reader, err := os.Open(fullpath)
+func GetChecksum(fullPath, algo string) (map[string]string, error) {
+	subs := map[string]string{}
+	reader, err := os.Open(fullPath)
 	if err != nil {
-		return err
+		return subs, err
 	}
 	defer reader.Close()
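Note: `FileInfoFaster` serializes access per path through `getMutex(opts.Path)`, backed by the `pathMutexes` map that `pathMutexesMu` guards (both appear in the `var` block above). The body of `getMutex` is not shown in this diff; a typical lazily-populated version of the pattern looks like this sketch (an assumption, not the repo's actual code):

```go
package files

import "sync"

var (
	pathMutexes   = map[string]*sync.Mutex{}
	pathMutexesMu sync.Mutex // protects the pathMutexes map itself
)

// getMutex returns the mutex dedicated to one path, creating it on first use.
// Work on different paths proceeds in parallel; work on the same path serializes.
func getMutex(path string) *sync.Mutex {
	pathMutexesMu.Lock()
	defer pathMutexesMu.Unlock()
	m, ok := pathMutexes[path]
	if !ok {
		m = &sync.Mutex{}
		pathMutexes[path] = m
	}
	return m
}
```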
@@ -287,21 +149,21 @@ func (i *FileInfo) Checksum(algo string) error {

 	h, ok := hashFuncs[algo]
 	if !ok {
-		return errors.ErrInvalidOption
+		return subs, errors.ErrInvalidOption
 	}

 	_, err = io.Copy(h, reader)
 	if err != nil {
-		return err
+		return subs, err
 	}
-
-	i.Checksums[algo] = hex.EncodeToString(h.Sum(nil))
-	return nil
+	subs[algo] = hex.EncodeToString(h.Sum(nil))
+	return subs, nil
 }

 // RealPath gets the real path for the file, resolving symlinks if supported.
 func (i *FileInfo) RealPath() string {
-	realPath, err := filepath.EvalSymlinks(i.Path)
+	realPath, _, _ := GetRealPath(rootPath, i.Path)
+	realPath, err := filepath.EvalSymlinks(realPath)
 	if err == nil {
 		return realPath
 	}
@@ -314,13 +176,24 @@ func GetRealPath(relativePath ...string) (string, bool, error) {
 		combined = append(combined, strings.TrimPrefix(path, settings.Config.Server.Root))
 	}
 	joinedPath := filepath.Join(combined...)
+
+	isDir, _ := utils.RealPathCache.Get(joinedPath + ":isdir").(bool)
+	cached, ok := utils.RealPathCache.Get(joinedPath).(string)
+	if ok && cached != "" {
+		return cached, isDir, nil
+	}
 	// Convert relative path to absolute path
 	absolutePath, err := filepath.Abs(joinedPath)
 	if err != nil {
 		return absolutePath, false, fmt.Errorf("could not get real path: %v, %s", combined, err)
 	}
 	// Resolve symlinks and get the real path
-	return resolveSymlinks(absolutePath)
+	realPath, isDir, err := resolveSymlinks(absolutePath)
+	if err == nil {
+		utils.RealPathCache.Set(joinedPath, realPath)
+		utils.RealPathCache.Set(joinedPath+":isdir", isDir)
+	}
+	return realPath, isDir, err
 }

 func DeleteFiles(absPath string, opts FileOptions) error {
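Note: `GetRealPath` now writes two cache entries per path, `joinedPath` for the resolved string and `joinedPath + ":isdir"` for the flag. The two `Set` calls are not atomic as a pair, so a concurrent reader could in principle observe a fresh path with a stale flag; caching one struct value per key avoids that. A sketch of the alternative (hypothetical, not what this diff does):

```go
package files

import "sync"

// realPathEntry bundles both facts so one cache write keeps them consistent.
type realPathEntry struct {
	Path  string
	IsDir bool
}

var (
	realPathMu    sync.RWMutex
	realPathPairs = map[string]realPathEntry{}
)

func cacheRealPath(key, resolved string, isDir bool) {
	realPathMu.Lock()
	defer realPathMu.Unlock()
	realPathPairs[key] = realPathEntry{Path: resolved, IsDir: isDir}
}

func lookupRealPath(key string) (realPathEntry, bool) {
	realPathMu.RLock()
	defer realPathMu.RUnlock()
	e, ok := realPathPairs[key]
	return e, ok
}
```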
@@ -328,7 +201,8 @@ func DeleteFiles(absPath string, opts FileOptions) error {
 	if err != nil {
 		return err
 	}
-	err = RefreshFileInfo(opts)
+	index := GetIndex(rootPath)
+	err = index.RefreshFileInfo(opts)
 	if err != nil {
 		return err
 	}
@@ -340,8 +214,9 @@ func MoveResource(realsrc, realdst string, isSrcDir bool) error {
 	if err != nil {
 		return err
 	}
+	index := GetIndex(rootPath)
 	// refresh info for source and dest
-	err = RefreshFileInfo(FileOptions{
+	err = index.RefreshFileInfo(FileOptions{
 		Path:  realsrc,
 		IsDir: isSrcDir,
 	})
@@ -352,7 +227,7 @@ func MoveResource(realsrc, realdst string, isSrcDir bool) error {
 	if !isSrcDir {
 		refreshConfig.Path = filepath.Dir(realdst)
 	}
-	err = RefreshFileInfo(refreshConfig)
+	err = index.RefreshFileInfo(refreshConfig)
 	if err != nil {
 		return errors.ErrEmptyKey
 	}
@@ -364,12 +239,12 @@ func CopyResource(realsrc, realdst string, isSrcDir bool) error {
 	if err != nil {
 		return err
 	}
+	index := GetIndex(rootPath)
 	refreshConfig := FileOptions{Path: realdst, IsDir: true}
 	if !isSrcDir {
 		refreshConfig.Path = filepath.Dir(realdst)
 	}
-	err = RefreshFileInfo(refreshConfig)
+	err = index.RefreshFileInfo(refreshConfig)
 	if err != nil {
 		return errors.ErrEmptyKey
 	}
@@ -383,7 +258,8 @@ func WriteDirectory(opts FileOptions) error {
 	if err != nil {
 		return err
 	}
-	err = RefreshFileInfo(opts)
+	index := GetIndex(rootPath)
+	err = index.RefreshFileInfo(opts)
 	if err != nil {
 		return errors.ErrEmptyKey
 	}
@@ -391,13 +267,10 @@ func WriteDirectory(opts FileOptions) error {
 }

 func WriteFile(opts FileOptions, in io.Reader) error {
-	dst := opts.Path
+	dst, _, _ := GetRealPath(rootPath, opts.Path)
 	parentDir := filepath.Dir(dst)
-	// Split the directory from the destination path
-	dir := filepath.Dir(dst)

 	// Create the directory and all necessary parents
-	err := os.MkdirAll(dir, 0775)
+	err := os.MkdirAll(parentDir, 0775)
 	if err != nil {
 		return err
 	}
@@ -415,35 +288,35 @@ func WriteFile(opts FileOptions, in io.Reader) error {
 		return err
 	}
 	opts.Path = parentDir
-	err = RefreshFileInfo(opts)
-	if err != nil {
-		return errors.ErrEmptyKey
-	}
-	return nil
+	opts.IsDir = true
+	index := GetIndex(rootPath)
+	return index.RefreshFileInfo(opts)
 }

 // resolveSymlinks resolves symlinks in the given path
 func resolveSymlinks(path string) (string, bool, error) {
 	for {
-		// Get the file info
+		// Get the file info using os.Lstat to handle symlinks
 		info, err := os.Lstat(path)
 		if err != nil {
-			return path, false, fmt.Errorf("could not stat path: %v, %s", path, err)
+			return path, false, fmt.Errorf("could not stat path: %s, %v", path, err)
 		}

-		// Check if it's a symlink
+		// Check if the path is a symlink
 		if info.Mode()&os.ModeSymlink != 0 {
 			// Read the symlink target
 			target, err := os.Readlink(path)
 			if err != nil {
-				return path, false, err
+				return path, false, fmt.Errorf("could not read symlink: %s, %v", path, err)
 			}

-			// Resolve the target relative to the symlink's directory
+			// Resolve the symlink's target relative to its directory
+			// This ensures the resolved path is absolute and correctly calculated
 			path = filepath.Join(filepath.Dir(path), target)
 		} else {
-			// Not a symlink, so return the resolved path and check if it's a directory
-			return path, info.IsDir(), nil
+			// Not a symlink, so return the resolved path and whether it's a directory
+			isDir := info.IsDir()
+			return path, isDir, nil
 		}
 	}
 }
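Note: the `for` loop in `resolveSymlinks` follows links until it reaches a non-link, so a self-referential or circular symlink chain would spin forever. A common defense is a hop limit, as in this variant sketch (not part of the diff):

```go
package files

import (
	"fmt"
	"os"
	"path/filepath"
)

// resolveSymlinksLimited behaves like resolveSymlinks but gives up after
// maxHops links, bounding the work done on circular symlink chains.
func resolveSymlinksLimited(path string, maxHops int) (string, bool, error) {
	for hop := 0; hop < maxHops; hop++ {
		info, err := os.Lstat(path)
		if err != nil {
			return path, false, fmt.Errorf("could not stat path: %s, %v", path, err)
		}
		if info.Mode()&os.ModeSymlink == 0 {
			return path, info.IsDir(), nil // reached a real file or directory
		}
		target, err := os.Readlink(path)
		if err != nil {
			return path, false, fmt.Errorf("could not read symlink: %s, %v", path, err)
		}
		path = filepath.Join(filepath.Dir(path), target)
	}
	return path, false, fmt.Errorf("too many levels of symbolic links: %s", path)
}
```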
@@ -461,7 +334,7 @@ func getContent(path string) (string, error) {
 	}
 	stringContent := string(content)
 	if !utf8.ValidString(stringContent) {
-		return "", fmt.Errorf("file is not utf8 encoded")
+		return "", nil
 	}
 	if stringContent == "" {
 		return "empty-file-x6OlSil", nil
@@ -470,21 +343,9 @@ func getContent(path string) (string, error) {
 }

 // detectType detects the file type.
-func (i *ReducedItem) detectType(path string, modify, saveContent, readHeader bool) error {
+func (i *ItemInfo) detectType(path string, modify, saveContent, readHeader bool) error {
 	name := i.Name
 	var contentErr error
-	var contentString string
-	if saveContent {
-		contentString, contentErr = getContent(path)
-		if contentErr == nil {
-			i.Content = contentString
-		}
-	}
-
-	if IsNamedPipe(i.Mode) {
-		i.Type = "blob"
-		return contentErr
-	}

 	ext := filepath.Ext(name)
 	var buffer []byte
@@ -533,7 +394,7 @@ func (i *ReducedItem) detectType(path string, modify, saveContent, readHeader bool) error {
 }

 // readFirstBytes reads the first bytes of the file.
-func (i *ReducedItem) readFirstBytes(path string) []byte {
+func (i *ItemInfo) readFirstBytes(path string) []byte {
 	file, err := os.Open(path)
 	if err != nil {
 		i.Type = "blob"
@@ -551,6 +412,7 @@ func (i *ReducedItem) readFirstBytes(path string) []byte {
 	return buffer[:n]
 }

+// TODO add subtitles back
 // detectSubtitles detects subtitles for video files.
 //func (i *FileInfo) detectSubtitles(path string) {
 //	if i.Type != "video" {
@@ -620,3 +482,26 @@ func Exists(path string) bool {
 	}
 	return false
 }
+
+func (info *FileInfo) SortItems() {
+	sort.Slice(info.Folders, func(i, j int) bool {
+		// Convert strings to integers for numeric sorting if both are numeric
+		numI, errI := strconv.Atoi(info.Folders[i].Name)
+		numJ, errJ := strconv.Atoi(info.Folders[j].Name)
+		if errI == nil && errJ == nil {
+			return numI < numJ
+		}
+		// Fallback to case-insensitive lexicographical sorting
+		return strings.ToLower(info.Folders[i].Name) < strings.ToLower(info.Folders[j].Name)
+	})
+	sort.Slice(info.Files, func(i, j int) bool {
+		// Convert strings to integers for numeric sorting if both are numeric
+		numI, errI := strconv.Atoi(info.Files[i].Name)
+		numJ, errJ := strconv.Atoi(info.Files[j].Name)
+		if errI == nil && errJ == nil {
+			return numI < numJ
+		}
+		// Fallback to case-insensitive lexicographical sorting
+		return strings.ToLower(info.Files[i].Name) < strings.ToLower(info.Files[j].Name)
+	})
+}
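Note: the comparator in `SortItems` applies numeric ordering only when an entire name parses as an integer, so "2" sorts before "10", and otherwise falls back to case-insensitive lexicographic order; this is the fix for issue #230 mentioned in the changelog. Mixed names such as `file2` vs `file10` still compare lexicographically; a full natural sort would have to split names into digit and non-digit runs. A quick check of the behavior as implemented:

```go
package main

import (
	"fmt"
	"sort"
	"strconv"
	"strings"
)

// less mirrors the comparator used by SortItems.
func less(a, b string) bool {
	numA, errA := strconv.Atoi(a)
	numB, errB := strconv.Atoi(b)
	if errA == nil && errB == nil {
		return numA < numB // both names are fully numeric: compare as integers
	}
	return strings.ToLower(a) < strings.ToLower(b) // otherwise case-insensitive
}

func main() {
	names := []string{"10", "2", "Zebra", "apple", "file10", "file2"}
	sort.Slice(names, func(i, j int) bool { return less(names[i], names[j]) })
	fmt.Println(names) // [2 10 apple file10 file2 Zebra]
}
```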
@@ -1,204 +0,0 @@
-package files
-
-import (
-	"log"
-	"os"
-	"path/filepath"
-	"strings"
-	"sync"
-	"time"
-
-	"github.com/gtsteffaniak/filebrowser/settings"
-)
-
-type Index struct {
-	Root        string
-	Directories map[string]*FileInfo
-	NumDirs     int
-	NumFiles    int
-	inProgress  bool
-	LastIndexed time.Time
-	mu          sync.RWMutex
-}
-
-var (
-	rootPath     string = "/srv"
-	indexes      []*Index
-	indexesMutex sync.RWMutex
-)
-
-func InitializeIndex(intervalMinutes uint32, schedule bool) {
-	if schedule {
-		go indexingScheduler(intervalMinutes)
-	}
-}
-
-func indexingScheduler(intervalMinutes uint32) {
-	if settings.Config.Server.Root != "" {
-		rootPath = settings.Config.Server.Root
-	}
-	si := GetIndex(rootPath)
-	for {
-		startTime := time.Now()
-		// Set the indexing flag to indicate that indexing is in progress
-		si.resetCount()
-		// Perform the indexing operation
-		err := si.indexFiles("/")
-		// Reset the indexing flag to indicate that indexing has finished
-		si.inProgress = false
-		// Update the LastIndexed time
-		si.LastIndexed = time.Now()
-		if err != nil {
-			log.Printf("Error during indexing: %v", err)
-		}
-		if si.NumFiles+si.NumDirs > 0 {
-			timeIndexedInSeconds := int(time.Since(startTime).Seconds())
-			log.Println("Successfully indexed files.")
-			log.Printf("Time spent indexing: %v seconds\n", timeIndexedInSeconds)
-			log.Printf("Files found: %v\n", si.NumFiles)
-			log.Printf("Directories found: %v\n", si.NumDirs)
-		}
-		// Sleep for the specified interval
-		time.Sleep(time.Duration(intervalMinutes) * time.Minute)
-	}
-}
-
-// Define a function to recursively index files and directories
-func (si *Index) indexFiles(adjustedPath string) error {
-	realPath := strings.TrimRight(si.Root, "/") + adjustedPath
-
-	// Open the directory
-	dir, err := os.Open(realPath)
-	if err != nil {
-		si.RemoveDirectory(adjustedPath) // Remove if it can't be opened
-		return err
-	}
-	defer dir.Close()
-
-	dirInfo, err := dir.Stat()
-	if err != nil {
-		return err
-	}
-
-	// Skip directories that haven't been modified since the last index
-	if dirInfo.ModTime().Before(si.LastIndexed) {
-		return nil
-	}
-
-	// Read directory contents
-	files, err := dir.Readdir(-1)
-	if err != nil {
-		return err
-	}
-
-	var totalSize int64
-	var numDirs, numFiles int
-	fileInfos := []ReducedItem{}
-	dirInfos := map[string]*FileInfo{}
-	combinedPath := adjustedPath + "/"
-	if adjustedPath == "/" {
-		combinedPath = "/"
-	}
-
-	// Process each file and directory in the current directory
-	for _, file := range files {
-		itemInfo := &FileInfo{
-			ModTime: file.ModTime(),
-		}
-		if file.IsDir() {
-			itemInfo.Name = file.Name()
-			itemInfo.Path = combinedPath + file.Name()
-			// Recursively index the subdirectory
-			err := si.indexFiles(itemInfo.Path)
-			if err != nil {
-				log.Printf("Failed to index directory %s: %v", itemInfo.Path, err)
-				continue
-			}
-			// Fetch the metadata for the subdirectory after indexing
-			subDirInfo, exists := si.GetMetadataInfo(itemInfo.Path, true)
-			if exists {
-				itemInfo.Size = subDirInfo.Size
-				totalSize += subDirInfo.Size // Add subdirectory size to the total
-			}
-			dirInfos[itemInfo.Name] = itemInfo
-			numDirs++
-		} else {
-			itemInfo := &ReducedItem{
-				Name:    file.Name(),
-				ModTime: file.ModTime(),
-				Size:    file.Size(),
-				Mode:    file.Mode(),
-			}
-			_ = itemInfo.detectType(combinedPath+file.Name(), true, false, false)
-			fileInfos = append(fileInfos, *itemInfo)
-			totalSize += itemInfo.Size
-			numFiles++
-		}
-	}
-
-	// Create FileInfo for the current directory
-	dirFileInfo := &FileInfo{
-		Path:    adjustedPath,
-		Files:   fileInfos,
-		Dirs:    dirInfos,
-		Size:    totalSize,
-		ModTime: dirInfo.ModTime(),
-	}
-
-	// Update the current directory metadata in the index
-	si.UpdateMetadata(dirFileInfo)
-	si.NumDirs += numDirs
-	si.NumFiles += numFiles
-
-	return nil
-}
-
-func (si *Index) makeIndexPath(subPath string) string {
-	if strings.HasPrefix(subPath, "./") {
-		subPath = strings.TrimPrefix(subPath, ".")
-	}
-	if strings.HasPrefix(subPath, ".") || si.Root == subPath {
-		return "/"
-	}
-	// clean path
-	subPath = strings.TrimSuffix(subPath, "/")
-	// remove index prefix
-	adjustedPath := strings.TrimPrefix(subPath, si.Root)
-	// remove trailing slash
-	adjustedPath = strings.TrimSuffix(adjustedPath, "/")
-	if !strings.HasPrefix(adjustedPath, "/") {
-		adjustedPath = "/" + adjustedPath
-	}
-	return adjustedPath
-}
-
-//func getParentPath(path string) string {
-//	// Trim trailing slash for consistency
-//	path = strings.TrimSuffix(path, "/")
-//	if path == "" || path == "/" {
-//		return "" // Root has no parent
-//	}
-//
-//	lastSlash := strings.LastIndex(path, "/")
-//	if lastSlash == -1 {
-//		return "/" // Parent of a top-level directory
-//	}
-//	return path[:lastSlash]
-//}
-
-func (si *Index) recursiveUpdateDirSizes(parentDir string, childInfo *FileInfo, previousSize int64) {
-	childDirName := filepath.Base(childInfo.Path)
-	if parentDir == childDirName {
-		return
-	}
-	dir, exists := si.GetMetadataInfo(parentDir, true)
-	if !exists {
-		return
-	}
-	dir.Dirs[childDirName] = childInfo
-	newSize := dir.Size - previousSize + childInfo.Size
-	dir.Size += newSize
-	si.UpdateMetadata(dir)
-	dir, _ = si.GetMetadataInfo(parentDir, true)
-	si.recursiveUpdateDirSizes(filepath.Dir(parentDir), dir, newSize)
-}
@@ -0,0 +1,229 @@
+package files
+
+import (
+	"fmt"
+	"log"
+	"os"
+	"path/filepath"
+	"strings"
+	"sync"
+	"time"
+
+	"github.com/gtsteffaniak/filebrowser/settings"
+	"github.com/gtsteffaniak/filebrowser/utils"
+)
+
+type Index struct {
+	Root                       string
+	Directories                map[string]*FileInfo
+	NumDirs                    uint64
+	NumFiles                   uint64
+	NumDeleted                 uint64
+	FilesChangedDuringIndexing bool
+	currentSchedule            int
+	assessment                 string
+	indexingTime               int
+	LastIndexed                time.Time
+	SmartModifier              time.Duration
+	mu                         sync.RWMutex
+	scannerMu                  sync.Mutex
+}
+
+var (
+	rootPath     string = "/srv"
+	indexes      []*Index
+	indexesMutex sync.RWMutex
+)
+
+func InitializeIndex(enabled bool) {
+	if enabled {
+		time.Sleep(time.Second)
+		if settings.Config.Server.Root != "" {
+			rootPath = settings.Config.Server.Root
+		}
+		si := GetIndex(rootPath)
+		log.Println("Initializing index and assessing file system complexity")
+		si.RunIndexing("/", false)
+		go si.setupIndexingScanners()
+	}
+}
+
+// Define a function to recursively index files and directories
+func (si *Index) indexDirectory(adjustedPath string, quick, recursive bool) error {
+	realPath := strings.TrimRight(si.Root, "/") + adjustedPath
+
+	// Open the directory
+	dir, err := os.Open(realPath)
+	if err != nil {
+		si.RemoveDirectory(adjustedPath) // Remove, must have been deleted
+		return err
+	}
+	defer dir.Close()
+
+	dirInfo, err := dir.Stat()
+	if err != nil {
+		return err
+	}
+	combinedPath := adjustedPath + "/"
+	if adjustedPath == "/" {
+		combinedPath = "/"
+	}
+	// get whats currently in cache
+	si.mu.RLock()
+	cacheDirItems := []ItemInfo{}
+	modChange := true // default to true
+	cachedDir, exists := si.Directories[adjustedPath]
+	if exists && quick {
+		modChange = dirInfo.ModTime() != cachedDir.ModTime
+		cacheDirItems = cachedDir.Folders
+	}
+	si.mu.RUnlock()
+
+	// If the directory has not been modified since the last index, skip expensive readdir
+	// recursively check cached dirs for mod time changes as well
+	if !modChange && recursive {
+		for _, item := range cacheDirItems {
+			err = si.indexDirectory(combinedPath+item.Name, quick, true)
+			if err != nil {
+				fmt.Printf("error indexing directory %v : %v", combinedPath+item.Name, err)
+			}
+		}
+		return nil
+	}
+
+	if quick {
+		si.mu.Lock()
+		si.FilesChangedDuringIndexing = true
+		si.mu.Unlock()
+	}
+
+	// Read directory contents
+	files, err := dir.Readdir(-1)
+	if err != nil {
+		return err
+	}
+
+	var totalSize int64
+	fileInfos := []ItemInfo{}
+	dirInfos := []ItemInfo{}
+
+	// Process each file and directory in the current directory
+	for _, file := range files {
+		itemInfo := &ItemInfo{
+			Name:    file.Name(),
+			ModTime: file.ModTime(),
+		}
+		if file.IsDir() {
+			dirPath := combinedPath + file.Name()
+			if recursive {
+				// Recursively index the subdirectory
+				err = si.indexDirectory(dirPath, quick, recursive)
+				if err != nil {
+					log.Printf("Failed to index directory %s: %v", dirPath, err)
+					continue
+				}
+			}
+			realDirInfo, exists := si.GetMetadataInfo(dirPath, true)
+			if exists {
+				itemInfo.Size = realDirInfo.Size
+			}
+			totalSize += itemInfo.Size
+			itemInfo.Type = "directory"
+			dirInfos = append(dirInfos, *itemInfo)
+			si.NumDirs++
+		} else {
+			_ = itemInfo.detectType(combinedPath+file.Name(), true, false, false)
+			itemInfo.Size = file.Size()
+			fileInfos = append(fileInfos, *itemInfo)
+			totalSize += itemInfo.Size
+			si.NumFiles++
+		}
+	}
+	// Create FileInfo for the current directory
+	dirFileInfo := &FileInfo{
+		Path:    adjustedPath,
+		Files:   fileInfos,
+		Folders: dirInfos,
+	}
+	dirFileInfo.ItemInfo = ItemInfo{
+		Name:    dirInfo.Name(),
+		Type:    "directory",
+		Size:    totalSize,
+		ModTime: dirInfo.ModTime(),
+	}
+
+	dirFileInfo.SortItems()
+
+	// Update the current directory metadata in the index
+	si.UpdateMetadata(dirFileInfo)
+
+	return nil
+}
+
+func (si *Index) makeIndexPath(subPath string) string {
+	if strings.HasPrefix(subPath, "./") {
+		subPath = strings.TrimPrefix(subPath, ".")
+	}
+	if strings.HasPrefix(subPath, ".") || si.Root == subPath {
+		return "/"
+	}
+	// clean path
+	subPath = strings.TrimSuffix(subPath, "/")
+	// remove index prefix
+	adjustedPath := strings.TrimPrefix(subPath, si.Root)
+	// remove trailing slash
+	adjustedPath = strings.TrimSuffix(adjustedPath, "/")
+	if !strings.HasPrefix(adjustedPath, "/") {
+		adjustedPath = "/" + adjustedPath
+	}
+	return adjustedPath
+}
+
+func (si *Index) recursiveUpdateDirSizes(childInfo *FileInfo, previousSize int64) {
+	parentDir := utils.GetParentDirectoryPath(childInfo.Path)
+	parentInfo, exists := si.GetMetadataInfo(parentDir, true)
+	if !exists || parentDir == "" {
+		return
+	}
+	newSize := parentInfo.Size - previousSize + childInfo.Size
+	parentInfo.Size += newSize
+	si.UpdateMetadata(parentInfo)
+	si.recursiveUpdateDirSizes(parentInfo, newSize)
+}
+
+func (si *Index) RefreshFileInfo(opts FileOptions) error {
+	refreshOptions := FileOptions{
+		Path:  opts.Path,
+		IsDir: opts.IsDir,
+	}
+
+	if !refreshOptions.IsDir {
+		refreshOptions.Path = si.makeIndexPath(filepath.Dir(refreshOptions.Path))
+		refreshOptions.IsDir = true
+	} else {
+		refreshOptions.Path = si.makeIndexPath(refreshOptions.Path)
+	}
+	err := si.indexDirectory(refreshOptions.Path, false, false)
+	if err != nil {
+		return fmt.Errorf("file/folder does not exist to refresh data: %s", refreshOptions.Path)
+	}
+	file, exists := si.GetMetadataInfo(refreshOptions.Path, true)
+	if !exists {
+		return fmt.Errorf("file/folder does not exist in metadata: %s", refreshOptions.Path)
+	}
+
+	current, firstExisted := si.GetMetadataInfo(refreshOptions.Path, true)
+	refreshParentInfo := firstExisted && current.Size != file.Size
+	//utils.PrintStructFields(*file)
+	result := si.UpdateMetadata(file)
+	if !result {
+		return fmt.Errorf("file/folder does not exist in metadata: %s", refreshOptions.Path)
+	}
+	if !exists {
+		return nil
+	}
+	if refreshParentInfo {
+		si.recursiveUpdateDirSizes(file, current.Size)
+	}
+	return nil
+}
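Note: a subtlety in `indexDirectory`'s quick mode is that it compares the directory's mtime with the cached value, and on most filesystems a directory's mtime changes only when a direct child is created, deleted, or renamed. Editing an existing file's contents does not touch the parent directory's mtime, which is part of why the scanner below still interleaves periodic full scans. A small demonstration, assuming a POSIX-style filesystem with timestamp granularity fine enough to register the writes:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir, _ := os.MkdirTemp("", "mtime-demo")
	defer os.RemoveAll(dir)

	file := filepath.Join(dir, "a.txt")
	os.WriteFile(file, []byte("v1"), 0o644)
	before, _ := os.Stat(dir)

	// Rewriting an existing file does not update the parent directory's mtime.
	os.WriteFile(file, []byte("v2 with more bytes"), 0o644)
	after, _ := os.Stat(dir)
	fmt.Println("dir mtime changed after content edit:", !after.ModTime().Equal(before.ModTime()))

	// Creating a new entry in the directory does.
	os.WriteFile(filepath.Join(dir, "b.txt"), nil, 0o644)
	after, _ = os.Stat(dir)
	fmt.Println("dir mtime changed after create:", !after.ModTime().Equal(before.ModTime()))
}
```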
@@ -0,0 +1,120 @@
+package files
+
+import (
+	"log"
+	"time"
+
+	"github.com/gtsteffaniak/filebrowser/settings"
+)
+
+// schedule in minutes
+var scanSchedule = []time.Duration{
+	5 * time.Minute, // 5 minute quick scan & 25 minutes for a full scan
+	10 * time.Minute,
+	20 * time.Minute, // [3] element is 20 minutes, reset anchor for full scan
+	40 * time.Minute,
+	1 * time.Hour,
+	2 * time.Hour,
+	3 * time.Hour,
+	4 * time.Hour, // 4 hours for quick scan & 20 hours for a full scan
+}
+
+func (si *Index) newScanner(origin string) {
+	fullScanAnchor := 3
+	fullScanCounter := 0 // every 5th scan is a full scan
+	for {
+		// Determine sleep time with modifiers
+		fullScanCounter++
+		sleepTime := scanSchedule[si.currentSchedule] + si.SmartModifier
+		if si.assessment == "simple" {
+			sleepTime = scanSchedule[si.currentSchedule] - si.SmartModifier
+		}
+		if settings.Config.Server.IndexingInterval > 0 {
+			sleepTime = time.Duration(settings.Config.Server.IndexingInterval) * time.Minute
+		}
+
+		// Log and sleep before indexing
+		log.Printf("Next scan in %v\n", sleepTime)
+		time.Sleep(sleepTime)
+
+		si.scannerMu.Lock()
+		if fullScanCounter == 5 {
+			si.RunIndexing(origin, false) // Full scan
+			fullScanCounter = 0
+		} else {
+			si.RunIndexing(origin, true) // Quick scan
+		}
+		si.scannerMu.Unlock()
+
+		// Adjust schedule based on file changes
+		if si.FilesChangedDuringIndexing {
+			// Move to at least the full-scan anchor or reduce interval
+			if si.currentSchedule > fullScanAnchor {
+				si.currentSchedule = fullScanAnchor
+			} else if si.currentSchedule > 0 {
+				si.currentSchedule--
+			}
+		} else {
+			// Increment toward the longest interval if no changes
+			if si.currentSchedule < len(scanSchedule)-1 {
+				si.currentSchedule++
+			}
+		}
+		if si.assessment == "simple" && si.currentSchedule > 3 {
+			si.currentSchedule = 3
+		}
+		// Ensure `currentSchedule` stays within bounds
+		if si.currentSchedule < 0 {
+			si.currentSchedule = 0
+		} else if si.currentSchedule >= len(scanSchedule) {
+			si.currentSchedule = len(scanSchedule) - 1
+		}
+	}
+}
+
+func (si *Index) RunIndexing(origin string, quick bool) {
+	prevNumDirs := si.NumDirs
+	prevNumFiles := si.NumFiles
+	if quick {
+		log.Println("Starting quick scan")
+	} else {
+		log.Println("Starting full scan")
+		si.NumDirs = 0
+		si.NumFiles = 0
+	}
+	startTime := time.Now()
+	si.FilesChangedDuringIndexing = false
+	// Perform the indexing operation
+	err := si.indexDirectory("/", quick, true)
+	if err != nil {
+		log.Printf("Error during indexing: %v", err)
+	}
+	// Update the LastIndexed time
+	si.LastIndexed = time.Now()
+	si.indexingTime = int(time.Since(startTime).Seconds())
+	if !quick {
+		// update smart indexing
+		if si.indexingTime < 3 || si.NumDirs < 10000 {
+			si.assessment = "simple"
+			si.SmartModifier = 4 * time.Minute
+			log.Println("Index is small and efficient, adjusting scan interval accordingly.")
+		} else if si.indexingTime > 120 || si.NumDirs > 500000 {
+			si.assessment = "complex"
+			modifier := si.indexingTime / 10 // seconds
+			si.SmartModifier = time.Duration(modifier) * time.Minute
+			log.Println("Index is large and complex, adjusting scan interval accordingly.")
+		} else {
+			si.assessment = "normal"
+			log.Println("Index is normal, quick scan set to every 5 minutes.")
+		}
+		log.Printf("Index assessment : complexity=%v directories=%v files=%v \n", si.assessment, si.NumDirs, si.NumFiles)
+		if si.NumDirs != prevNumDirs || si.NumFiles != prevNumFiles {
+			si.FilesChangedDuringIndexing = true
+		}
+	}
+	log.Printf("Time Spent Indexing : %v seconds\n", si.indexingTime)
+}
+
+func (si *Index) setupIndexingScanners() {
+	go si.newScanner("/")
+}
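Note: as the scanner loop shows, setting `Server.IndexingInterval` pins every scan to a fixed period and bypasses the adaptive schedule, while leaving it at zero lets the smart schedule drift between 5 minutes and 4 hours (shifted by `SmartModifier`). A hedged `filebrowser.yml` sketch; the exact key casing is an assumption about the config schema, since only the Go-side fields are visible in this diff:

```yaml
server:
  root: /srv
  indexing: true        # feeds InitializeIndex's enabled flag
  indexingInterval: 10  # optional: fixed 10-minute scans, disables smart scheduling
```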
@ -3,7 +3,6 @@ package files
|
||||||
import (
|
import (
|
||||||
"encoding/json"
|
"encoding/json"
|
||||||
"math/rand"
|
"math/rand"
|
||||||
"path/filepath"
|
|
||||||
"reflect"
|
"reflect"
|
||||||
"testing"
|
"testing"
|
||||||
"time"
|
"time"
|
||||||
|
@ -12,7 +11,7 @@ import (
|
||||||
)
|
)
|
||||||
|
|
||||||
func BenchmarkFillIndex(b *testing.B) {
|
func BenchmarkFillIndex(b *testing.B) {
|
||||||
InitializeIndex(5, false)
|
InitializeIndex(false)
|
||||||
si := GetIndex(settings.Config.Server.Root)
|
si := GetIndex(settings.Config.Server.Root)
|
||||||
b.ResetTimer()
|
b.ResetTimer()
|
||||||
b.ReportAllocs()
|
b.ReportAllocs()
|
||||||
|
@ -24,11 +23,11 @@ func BenchmarkFillIndex(b *testing.B) {
|
||||||
func (si *Index) createMockData(numDirs, numFilesPerDir int) {
|
func (si *Index) createMockData(numDirs, numFilesPerDir int) {
|
||||||
for i := 0; i < numDirs; i++ {
|
for i := 0; i < numDirs; i++ {
|
||||||
dirPath := generateRandomPath(rand.Intn(3) + 1)
|
dirPath := generateRandomPath(rand.Intn(3) + 1)
|
||||||
files := []ReducedItem{} // Slice of FileInfo
|
files := []ItemInfo{} // Slice of FileInfo
|
||||||
|
|
||||||
// Simulating files and directories with FileInfo
|
// Simulating files and directories with FileInfo
|
||||||
for j := 0; j < numFilesPerDir; j++ {
|
for j := 0; j < numFilesPerDir; j++ {
|
||||||
newFile := ReducedItem{
|
newFile := ItemInfo{
|
||||||
Name: "file-" + getRandomTerm() + getRandomExtension(),
|
Name: "file-" + getRandomTerm() + getRandomExtension(),
|
||||||
Size: rand.Int63n(1000), // Random size
|
Size: rand.Int63n(1000), // Random size
|
||||||
ModTime: time.Now().Add(-time.Duration(rand.Intn(100)) * time.Hour), // Random mod time
|
ModTime: time.Now().Add(-time.Duration(rand.Intn(100)) * time.Hour), // Random mod time
|
||||||
|
@@ -37,7 +36,6 @@ func (si *Index) createMockData(numDirs, numFilesPerDir int) {
 			files = append(files, newFile)
 		}
 		dirInfo := &FileInfo{
-			Name:  filepath.Base(dirPath),
 			Path:  dirPath,
 			Files: files,
 		}
@@ -112,37 +110,3 @@ func TestGetIndex(t *testing.T) {
 		})
 	}
 }
-
-func TestInitializeIndex(t *testing.T) {
-	type args struct {
-		intervalMinutes uint32
-	}
-	tests := []struct {
-		name string
-		args args
-	}{
-		// TODO: Add test cases.
-	}
-	for _, tt := range tests {
-		t.Run(tt.name, func(t *testing.T) {
-			InitializeIndex(tt.args.intervalMinutes, false)
-		})
-	}
-}
-
-func Test_indexingScheduler(t *testing.T) {
-	type args struct {
-		intervalMinutes uint32
-	}
-	tests := []struct {
-		name string
-		args args
-	}{
-		// TODO: Add test cases.
-	}
-	for _, tt := range tests {
-		t.Run(tt.name, func(t *testing.T) {
-			indexingScheduler(tt.args.intervalMinutes)
-		})
-	}
-}
@@ -28,7 +28,14 @@ func (si *Index) Search(search string, scope string, sourceSession string) []sea
 	searchOptions := ParseSearch(search)
 	results := make(map[string]searchResult, 0)
 	count := 0
-	directories := si.getDirsInScope(scope)
+	var directories []string
+	cachedDirs, ok := utils.SearchResultsCache.Get(si.Root + scope).([]string)
+	if ok {
+		directories = cachedDirs
+	} else {
+		directories = si.getDirsInScope(scope)
+		utils.SearchResultsCache.Set(si.Root+scope, directories)
+	}
 	for _, searchTerm := range searchOptions.Terms {
 		if searchTerm == "" {
 			continue
@@ -38,6 +45,7 @@ func (si *Index) Search(search string, scope string, sourceSession string) []sea
 	}
 	si.mu.Lock()
 	for _, dirName := range directories {
+		scopedPath := strings.TrimPrefix(strings.TrimPrefix(dirName, scope), "/") + "/"
 		si.mu.Unlock()
 		dir, found := si.GetReducedMetadata(dirName, true)
 		si.mu.Lock()
@@ -47,25 +55,22 @@ func (si *Index) Search(search string, scope string, sourceSession string) []sea
 		if count > maxSearchResults {
 			break
 		}
-		reducedDir := ReducedItem{
+		reducedDir := ItemInfo{
 			Name: filepath.Base(dirName),
 			Type: "directory",
 			Size: dir.Size,
 		}

 		matches := reducedDir.containsSearchTerm(searchTerm, searchOptions)
 		if matches {
-			scopedPath := strings.TrimPrefix(strings.TrimPrefix(dirName, scope), "/") + "/"
 			results[scopedPath] = searchResult{Path: scopedPath, Type: "directory", Size: dir.Size}
 			count++
 		}

 		// search files first
-		for _, item := range dir.Items {
+		for _, item := range dir.Files {

 			fullPath := dirName + "/" + item.Name
+			scopedPath := strings.TrimPrefix(strings.TrimPrefix(fullPath, scope), "/")
 			if item.Type == "directory" {
-				fullPath += "/"
+				scopedPath += "/"
 			}
 			value, found := sessionInProgress.Load(sourceSession)
 			if !found || value != runningHash {
@@ -77,7 +82,6 @@ func (si *Index) Search(search string, scope string, sourceSession string) []sea
 			}
 			matches := item.containsSearchTerm(searchTerm, searchOptions)
 			if matches {
-				scopedPath := strings.TrimPrefix(strings.TrimPrefix(fullPath, scope), "/")
 				results[scopedPath] = searchResult{Path: scopedPath, Type: item.Type, Size: item.Size}
 				count++
 			}
@@ -103,7 +107,7 @@ func (si *Index) Search(search string, scope string, sourceSession string) []sea
 // returns true if the file name contains the search term
 // returns file type if the file name contains the search term
 // returns size of file/dir if the file name contains the search term
-func (fi ReducedItem) containsSearchTerm(searchTerm string, options *SearchOptions) bool {
+func (fi ItemInfo) containsSearchTerm(searchTerm string, options SearchOptions) bool {

 	fileTypes := map[string]bool{}
 	largerThan := int64(options.LargerThan) * 1024 * 1024
@@ -8,7 +8,7 @@ import (
 )

 func BenchmarkSearchAllIndexes(b *testing.B) {
-	InitializeIndex(5, false)
+	InitializeIndex(false)
 	si := GetIndex(rootPath)

 	si.createMockData(50, 3) // 50 dirs, 3 files per dir
@@ -29,25 +29,25 @@ func BenchmarkSearchAllIndexes(b *testing.B) {
 func TestParseSearch(t *testing.T) {
 	tests := []struct {
 		input string
-		want  *SearchOptions
+		want  SearchOptions
 	}{
 		{
 			input: "my test search",
-			want: &SearchOptions{
+			want: SearchOptions{
 				Conditions: map[string]bool{"exact": false},
 				Terms:      []string{"my test search"},
 			},
 		},
 		{
 			input: "case:exact my|test|search",
-			want: &SearchOptions{
+			want: SearchOptions{
 				Conditions: map[string]bool{"exact": true},
 				Terms:      []string{"my", "test", "search"},
 			},
 		},
 		{
 			input: "type:largerThan=100 type:smallerThan=1000 test",
-			want: &SearchOptions{
+			want: SearchOptions{
 				Conditions: map[string]bool{"exact": false, "larger": true, "smaller": true},
 				Terms:      []string{"test"},
 				LargerThan: 100,
@@ -56,7 +56,7 @@ func TestParseSearch(t *testing.T) {
 		},
 		{
 			input: "type:audio thisfile",
-			want: &SearchOptions{
+			want: SearchOptions{
 				Conditions: map[string]bool{"exact": false, "audio": true},
 				Terms:      []string{"thisfile"},
 			},
@@ -74,7 +74,7 @@ func TestParseSearch(t *testing.T) {
 }

 func TestSearchWhileIndexing(t *testing.T) {
-	InitializeIndex(5, false)
+	InitializeIndex(false)
 	si := GetIndex(rootPath)

 	searchTerms := generateRandomSearchTerms(10)
@@ -89,27 +89,29 @@ func TestSearchWhileIndexing(t *testing.T) {
 func TestSearchIndexes(t *testing.T) {
 	index := Index{
 		Directories: map[string]*FileInfo{
-			"/test":      {Files: []ReducedItem{{Name: "audio1.wav", Type: "audio"}}},
-			"/test/path": {Files: []ReducedItem{{Name: "file.txt", Type: "text"}}},
-			"/new/test": {Files: []ReducedItem{
+			"/test":      {Files: []ItemInfo{{Name: "audio1.wav", Type: "audio"}}},
+			"/test/path": {Files: []ItemInfo{{Name: "file.txt", Type: "text"}}},
+			"/new/test": {Files: []ItemInfo{
 				{Name: "audio.wav", Type: "audio"},
 				{Name: "video.mp4", Type: "video"},
 				{Name: "video.MP4", Type: "video"},
 			}},
-			"/new/test/path": {Files: []ReducedItem{{Name: "archive.zip", Type: "archive"}}},
+			"/new/test/path": {Files: []ItemInfo{{Name: "archive.zip", Type: "archive"}}},
 			"/firstDir": {
-				Files: []ReducedItem{
+				Files: []ItemInfo{
 					{Name: "archive.zip", Size: 100, Type: "archive"},
 				},
-				Dirs: map[string]*FileInfo{
-					"thisIsDir": {Name: "thisIsDir", Size: 2 * 1024 * 1024},
+				Folders: []ItemInfo{
+					{Name: "thisIsDir", Type: "directory", Size: 2 * 1024 * 1024},
 				},
 			},
 			"/firstDir/thisIsDir": {
-				Files: []ReducedItem{
+				Files: []ItemInfo{
 					{Name: "hi.txt", Type: "text"},
 				},
-				Size: 2 * 1024 * 1024,
+				ItemInfo: ItemInfo{
+					Size: 2 * 1024 * 1024,
+				},
 			},
 		},
 	}
@@ -1,10 +1,7 @@
 package files

 import (
-	"log"
 	"path/filepath"
-	"sort"
-	"time"

 	"github.com/gtsteffaniak/filebrowser/settings"
 )
@@ -13,15 +10,14 @@ import (
 func (si *Index) UpdateMetadata(info *FileInfo) bool {
 	si.mu.Lock()
 	defer si.mu.Unlock()
-	info.CacheTime = time.Now()
 	si.Directories[info.Path] = info
 	return true
 }

 // GetMetadataInfo retrieves the FileInfo from the specified directory in the index.
 func (si *Index) GetReducedMetadata(target string, isDir bool) (*FileInfo, bool) {
-	si.mu.RLock()
-	defer si.mu.RUnlock()
+	si.mu.Lock()
+	defer si.mu.Unlock()
 	checkDir := si.makeIndexPath(target)
 	if !isDir {
 		checkDir = si.makeIndexPath(filepath.Dir(target))
@@ -30,50 +26,25 @@ func (si *Index) GetReducedMetadata(target string, isDir bool) (*FileInfo, bool)
 	if !exists {
 		return nil, false
 	}
-	if !isDir {
-		if checkDir == "/" {
-			checkDir = ""
-		}
-
-		baseName := filepath.Base(target)
-		for _, item := range dir.Files {
-			if item.Name == baseName {
-				return &FileInfo{
-					Name:    item.Name,
-					Size:    item.Size,
-					ModTime: item.ModTime,
-					Type:    item.Type,
-					Path:    checkDir + "/" + item.Name,
-				}, true
-			}
+	if isDir {
+		return dir, true
+	}
+	// handle file
+	if checkDir == "/" {
+		checkDir = ""
+	}
+	baseName := filepath.Base(target)
+	for _, item := range dir.Files {
+		if item.Name == baseName {
+			return &FileInfo{
+				Path:     checkDir + "/" + item.Name,
+				ItemInfo: item,
+			}, true
 		}
-		return nil, false
 	}
-	cleanedItems := []ReducedItem{}
-	for name, item := range dir.Dirs {
-		cleanedItems = append(cleanedItems, ReducedItem{
-			Name:    name,
-			Size:    item.Size,
-			ModTime: item.ModTime,
-			Type:    "directory",
-		})
-	}
-	cleanedItems = append(cleanedItems, dir.Files...)
-	sort.Slice(cleanedItems, func(i, j int) bool {
-		return cleanedItems[i].Name < cleanedItems[j].Name
-	})
-	dirname := filepath.Base(dir.Path)
-	if dirname == "." {
-		dirname = "/"
-	}
-	// construct file info
-	return &FileInfo{
-		Name:    dirname,
-		Type:    "directory",
-		Items:   cleanedItems,
-		ModTime: dir.ModTime,
-		Size:    dir.Size,
-	}, true
+	return nil, false
 }

 // GetMetadataInfo retrieves the FileInfo from the specified directory in the index.
@@ -91,29 +62,10 @@ func (si *Index) GetMetadataInfo(target string, isDir bool) (*FileInfo, bool) {
 func (si *Index) RemoveDirectory(path string) {
 	si.mu.Lock()
 	defer si.mu.Unlock()
+	si.NumDeleted++
 	delete(si.Directories, path)
 }

-func (si *Index) UpdateCount(given string) {
-	si.mu.Lock()
-	defer si.mu.Unlock()
-	if given == "files" {
-		si.NumFiles++
-	} else if given == "dirs" {
-		si.NumDirs++
-	} else {
-		log.Println("could not update unknown type: ", given)
-	}
-}
-
-func (si *Index) resetCount() {
-	si.mu.Lock()
-	defer si.mu.Unlock()
-	si.NumDirs = 0
-	si.NumFiles = 0
-	si.inProgress = true
-}
-
 func GetIndex(root string) *Index {
 	for _, index := range indexes {
 		if index.Root == root {
@@ -128,7 +80,6 @@ func GetIndex(root string) *Index {
 		Directories: map[string]*FileInfo{},
 		NumDirs:     0,
 		NumFiles:    0,
-		inProgress:  false,
 	}
 	newIndex.Directories["/"] = &FileInfo{}
 	indexesMutex.Lock()
@@ -34,7 +34,7 @@ func TestGetFileMetadataSize(t *testing.T) {
 		t.Run(tt.name, func(t *testing.T) {
 			fileInfo, _ := testIndex.GetReducedMetadata(tt.adjustedPath, true)
 			// Iterate over fileInfo.Items to look for expectedName
-			for _, item := range fileInfo.Items {
+			for _, item := range fileInfo.Files {
 				// Assert the existence and the name
 				if item.Name == tt.expectedName {
 					assert.Equal(t, tt.expectedSize, item.Size)
@@ -89,8 +89,8 @@ func TestGetFileMetadata(t *testing.T) {
 	}
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
-			fileInfo, _ := testIndex.GetReducedMetadata(tt.adjustedPath, tt.isDir)
-			if fileInfo == nil {
+			fileInfo, exists := testIndex.GetReducedMetadata(tt.adjustedPath, tt.isDir)
+			if !exists {
 				found := false
 				assert.Equal(t, tt.expectedExists, found)
 				return
@@ -98,7 +98,7 @@ func TestGetFileMetadata(t *testing.T) {
 			found := false
 			if tt.isDir {
 				// Iterate over fileInfo.Items to look for expectedName
-				for _, item := range fileInfo.Items {
+				for _, item := range fileInfo.Files {
 					// Assert the existence and the name
 					if item.Name == tt.expectedName {
 						found = true
@@ -120,9 +120,7 @@ func TestGetFileMetadata(t *testing.T) {
 func TestUpdateFileMetadata(t *testing.T) {
 	info := &FileInfo{
 		Path: "/testpath",
-		Name: "testpath",
-		Type: "directory",
-		Files: []ReducedItem{
+		Files: []ItemInfo{
 			{Name: "testfile.txt"},
 			{Name: "anotherfile.txt"},
 		},
@@ -165,9 +163,11 @@ func TestSetDirectoryInfo(t *testing.T) {
 		Directories: map[string]*FileInfo{
 			"/testpath": {
 				Path: "/testpath",
-				Name: "testpath",
-				Type: "directory",
-				Items: []ReducedItem{
+				ItemInfo: ItemInfo{
+					Name: "testpath",
+					Type: "directory",
+				},
+				Files: []ItemInfo{
 					{Name: "testfile.txt"},
 					{Name: "anotherfile.txt"},
 				},
@@ -176,15 +176,17 @@ func TestSetDirectoryInfo(t *testing.T) {
 	}
 	dir := &FileInfo{
 		Path: "/newPath",
-		Name: "newPath",
-		Type: "directory",
-		Items: []ReducedItem{
+		ItemInfo: ItemInfo{
+			Name: "newPath",
+			Type: "directory",
+		},
+		Files: []ItemInfo{
 			{Name: "testfile.txt"},
 		},
 	}
 	index.UpdateMetadata(dir)
 	storedDir, exists := index.Directories["/newPath"]
-	if !exists || storedDir.Items[0].Name != "testfile.txt" {
+	if !exists || storedDir.Files[0].Name != "testfile.txt" {
 		t.Fatalf("expected SetDirectoryInfo to store directory info correctly")
 	}
 }
@@ -203,56 +205,34 @@ func TestRemoveDirectory(t *testing.T) {
 	}
 }

-// Test for UpdateCount
-func TestUpdateCount(t *testing.T) {
-	index := &Index{}
-	index.UpdateCount("files")
-	if index.NumFiles != 1 {
-		t.Fatalf("expected NumFiles to be 1 after UpdateCount('files')")
-	}
-	if index.NumFiles != 1 {
-		t.Fatalf("expected NumFiles to be 1 after UpdateCount('files')")
-	}
-	index.UpdateCount("dirs")
-	if index.NumDirs != 1 {
-		t.Fatalf("expected NumDirs to be 1 after UpdateCount('dirs')")
-	}
-	index.UpdateCount("unknown")
-	// Just ensure it does not panic or update any counters
-	if index.NumFiles != 1 || index.NumDirs != 1 {
-		t.Fatalf("expected counts to remain unchanged for unknown type")
-	}
-	index.resetCount()
-	if index.NumFiles != 0 || index.NumDirs != 0 || !index.inProgress {
-		t.Fatalf("expected resetCount to reset counts and set inProgress to true")
-	}
-}
-
 func init() {
 	testIndex = Index{
 		Root:     "/",
 		NumFiles: 10,
 		NumDirs:  5,
-		inProgress: false,
 		Directories: map[string]*FileInfo{
 			"/testpath": {
 				Path: "/testpath",
-				Name: "testpath",
-				Type: "directory",
-				Files: []ReducedItem{
+				ItemInfo: ItemInfo{
+					Name: "testpath",
+					Type: "directory",
+				},
+				Files: []ItemInfo{
 					{Name: "testfile.txt", Size: 100},
 					{Name: "anotherfile.txt", Size: 100},
 				},
 			},
 			"/anotherpath": {
 				Path: "/anotherpath",
-				Name: "anotherpath",
-				Type: "directory",
-				Files: []ReducedItem{
+				ItemInfo: ItemInfo{
+					Name: "anotherpath",
+					Type: "directory",
+				},
+				Files: []ItemInfo{
 					{Name: "afile.txt", Size: 100},
 				},
-				Dirs: map[string]*FileInfo{
-					"directory": {Name: "directory", Type: "directory", Size: 100},
+				Folders: []ItemInfo{
+					{Name: "directory", Type: "directory", Size: 100},
 				},
 			},
 		},
@@ -2,9 +2,11 @@ package http

 import (
 	"encoding/json"
+	libError "errors"
 	"fmt"
 	"log"
 	"net/http"
+	"net/url"
 	"os"
 	"strings"
 	"sync"
@@ -12,9 +14,11 @@ import (

 	"github.com/golang-jwt/jwt/v4"
 	"github.com/golang-jwt/jwt/v4/request"
+	"golang.org/x/crypto/bcrypt"

 	"github.com/gtsteffaniak/filebrowser/errors"
 	"github.com/gtsteffaniak/filebrowser/settings"
+	"github.com/gtsteffaniak/filebrowser/share"
 	"github.com/gtsteffaniak/filebrowser/users"
 	"github.com/gtsteffaniak/filebrowser/utils"
 )
@@ -207,3 +211,29 @@ func makeSignedTokenAPI(user *users.User, name string, duration time.Duration, p
 	}
 	return claim, err
 }
+
+func authenticateShareRequest(r *http.Request, l *share.Link) (int, error) {
+	if l.PasswordHash == "" {
+		return 200, nil
+	}
+
+	if r.URL.Query().Get("token") == l.Token {
+		return 200, nil
+	}
+
+	password := r.Header.Get("X-SHARE-PASSWORD")
+	password, err := url.QueryUnescape(password)
+	if err != nil {
+		return http.StatusUnauthorized, err
+	}
+	if password == "" {
+		return http.StatusUnauthorized, nil
+	}
+	if err := bcrypt.CompareHashAndPassword([]byte(l.PasswordHash), []byte(password)); err != nil {
+		if libError.Is(err, bcrypt.ErrMismatchedHashAndPassword) {
+			return http.StatusUnauthorized, nil
+		}
+		return 401, err
+	}
+	return 200, nil
+}
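The relocated `authenticateShareRequest` accepts a share when it has no password, when the request carries the share's token as a `token` query parameter, or when the `X-SHARE-PASSWORD` header matches the stored bcrypt hash. A client-side sketch under those rules; the URL and hash are placeholders, since the actual share route depends on the server's configured base URL:

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
)

func main() {
	// Placeholder URL and hash, for illustration only.
	req, err := http.NewRequest(http.MethodGet, "http://localhost:8080/share/SOME_HASH", nil)
	if err != nil {
		panic(err)
	}
	// The server runs url.QueryUnescape on this header, so escape it here.
	req.Header.Set("X-SHARE-PASSWORD", url.QueryEscape("p@ss w0rd"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status) // 401 Unauthorized unless the password (or ?token=...) matches
}
```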
@@ -26,6 +26,8 @@ type HttpResponse struct {
 	Token   string `json:"token,omitempty"`
 }

+var FileInfoFasterFunc = files.FileInfoFaster
+
 // Updated handleFunc to match the new signature
 type handleFunc func(w http.ResponseWriter, r *http.Request, data *requestContext) (int, error)
@@ -39,30 +41,30 @@ func withHashFileHelper(fn handleFunc) handleFunc {
 		// Get the file link by hash
 		link, err := store.Share.GetByHash(hash)
 		if err != nil {
-			return http.StatusNotFound, err
+			return http.StatusNotFound, fmt.Errorf("share not found")
 		}
 		// Authenticate the share request if needed
 		var status int
 		if link.Hash != "" {
 			status, err = authenticateShareRequest(r, link)
 			if err != nil || status != http.StatusOK {
-				return status, err
+				return status, fmt.Errorf("could not authenticate share request")
 			}
 		}
 		// Retrieve the user (using the public user by default)
 		user := &users.PublicUser

 		// Get file information with options
-		file, err := files.FileInfoFaster(files.FileOptions{
+		file, err := FileInfoFasterFunc(files.FileOptions{
 			Path:       filepath.Join(user.Scope, link.Path+"/"+path),
 			Modify:     user.Perm.Modify,
 			Expand:     true,
 			ReadHeader: config.Server.TypeDetectionByHeader,
 			Checker:    user, // Call your checker function here
-			Token:      link.Token,
 		})
+		file.Token = link.Token
 		if err != nil {
-			return errToStatus(err), err
+			return errToStatus(err), fmt.Errorf("error fetching share from server")
 		}

 		// Set the file info in the `data` object
@@ -89,6 +91,7 @@ func withAdminHelper(fn handleFunc) handleFunc {
 // Middleware to retrieve and authenticate user
 func withUserHelper(fn handleFunc) handleFunc {
 	return func(w http.ResponseWriter, r *http.Request, data *requestContext) (int, error) {
+
 		keyFunc := func(token *jwt.Token) (interface{}, error) {
 			return config.Auth.Key, nil
 		}
@@ -243,6 +246,7 @@ func (w *ResponseWriterWrapper) Write(b []byte) (int, error) {
 // LoggingMiddleware logs each request and its status code
 func LoggingMiddleware(next http.Handler) http.Handler {
 	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+
 		start := time.Now()

 		// Wrap the ResponseWriter to capture the status code
@@ -9,6 +9,7 @@ import (

 	"github.com/asdine/storm/v3"
 	"github.com/gtsteffaniak/filebrowser/diskcache"
+	"github.com/gtsteffaniak/filebrowser/files"
 	"github.com/gtsteffaniak/filebrowser/img"
 	"github.com/gtsteffaniak/filebrowser/settings"
 	"github.com/gtsteffaniak/filebrowser/share"
@@ -37,6 +38,27 @@ func setupTestEnv(t *testing.T) {
 	fileCache = diskcache.NewNoOp() // mocked
 	imgSvc = img.New(1)             // mocked
 	config = &settings.Config       // mocked
+	mockFileInfoFaster(t)           // Mock FileInfoFasterFunc for this test
+}
+
+func mockFileInfoFaster(t *testing.T) {
+	// Backup the original function
+	originalFileInfoFaster := FileInfoFasterFunc
+	// Defer restoration of the original function
+	t.Cleanup(func() { FileInfoFasterFunc = originalFileInfoFaster })
+
+	// Mock the function to skip execution
+	FileInfoFasterFunc = func(opts files.FileOptions) (files.ExtendedFileInfo, error) {
+		return files.ExtendedFileInfo{
+			FileInfo: &files.FileInfo{
+				Path: opts.Path,
+				ItemInfo: files.ItemInfo{
+					Name: "mocked_file",
+					Size: 12345,
+				},
+			},
+		}, nil
+	}
 }

 func TestWithAdminHelper(t *testing.T) {
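The `FileInfoFasterFunc` variable together with `mockFileInfoFaster` is a standard Go test seam: production code calls through a package-level function variable, and tests swap it for a fake and restore the original with `t.Cleanup`. The same pattern in miniature; the names below are illustrative, not from this commit:

```go
package mypkg

import "testing"

// Production code calls through this variable instead of calling a
// concrete function directly, so tests can substitute a fake.
var fetchFunc = func(key string) (string, error) { return "real:" + key, nil }

// swapFetch installs a fake and restores the original when the test ends.
func swapFetch(t *testing.T, fake func(string) (string, error)) {
	t.Helper()
	orig := fetchFunc
	t.Cleanup(func() { fetchFunc = orig })
	fetchFunc = fake
}
```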
@@ -197,10 +219,7 @@ func TestPublicShareHandlerAuthentication(t *testing.T) {
 			req := newTestRequest(t, tc.share.Hash, tc.token, tc.password, tc.extraHeaders)

 			// Serve the request
-			status, err := handler(recorder, req, &requestContext{})
-			if err != nil {
-				t.Fatalf("unexpected error: %v", err)
-			}
+			status, _ := handler(recorder, req, &requestContext{})

 			// Check if the response matches the expected status code
 			if status != tc.expectedStatusCode {
@@ -49,27 +49,23 @@ func previewHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (
 	if path == "" {
 		return http.StatusBadRequest, fmt.Errorf("invalid request path")
 	}
-	file, err := files.FileInfoFaster(files.FileOptions{
+	response, err := files.FileInfoFaster(files.FileOptions{
 		Path:       filepath.Join(d.user.Scope, path),
 		Modify:     d.user.Perm.Modify,
 		Expand:     true,
 		ReadHeader: config.Server.TypeDetectionByHeader,
 		Checker:    d.user,
 	})
+	fileInfo := response.FileInfo
 	if err != nil {
 		return errToStatus(err), err
 	}
-	realPath, _, err := files.GetRealPath(file.Path)
-	if err != nil {
-		return http.StatusInternalServerError, err
-	}
-	file.Path = realPath
-	if file.Type == "directory" {
+	if fileInfo.Type == "directory" {
 		return http.StatusBadRequest, fmt.Errorf("can't create preview for directory")
 	}
-	setContentDisposition(w, r, file)
-	if file.Type != "image" {
-		return http.StatusNotImplemented, fmt.Errorf("can't create preview for %s type", file.Type)
+	setContentDisposition(w, r, fileInfo)
+	if fileInfo.Type != "image" {
+		return http.StatusNotImplemented, fmt.Errorf("can't create preview for %s type", fileInfo.Type)
 	}

 	if (previewSize == "large" && !config.Server.ResizePreview) ||
@@ -77,40 +73,40 @@ func previewHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (
 		if !d.user.Perm.Download {
 			return http.StatusAccepted, nil
 		}
-		return rawFileHandler(w, r, file)
+		return rawFileHandler(w, r, fileInfo)
 	}

-	format, err := imgSvc.FormatFromExtension(filepath.Ext(file.Name))
+	format, err := imgSvc.FormatFromExtension(filepath.Ext(fileInfo.Name))
 	// Unsupported extensions directly return the raw data
 	if err == img.ErrUnsupportedFormat || format == img.FormatGif {
 		if !d.user.Perm.Download {
 			return http.StatusAccepted, nil
 		}
-		return rawFileHandler(w, r, file)
+		return rawFileHandler(w, r, fileInfo)
 	}
 	if err != nil {
 		return errToStatus(err), err
 	}
-	cacheKey := previewCacheKey(file, previewSize)
+	cacheKey := previewCacheKey(fileInfo, previewSize)
 	resizedImage, ok, err := fileCache.Load(r.Context(), cacheKey)
 	if err != nil {
 		return errToStatus(err), err
 	}

 	if !ok {
-		resizedImage, err = createPreview(imgSvc, fileCache, file, previewSize)
+		resizedImage, err = createPreview(imgSvc, fileCache, fileInfo, previewSize)
 		if err != nil {
 			return errToStatus(err), err
 		}
 	}
 	w.Header().Set("Cache-Control", "private")
-	http.ServeContent(w, r, file.Path, file.ModTime, bytes.NewReader(resizedImage))
+	http.ServeContent(w, r, fileInfo.RealPath(), fileInfo.ModTime, bytes.NewReader(resizedImage))

 	return 0, nil
 }

 func createPreview(imgSvc ImgService, fileCache FileCache, file *files.FileInfo, previewSize string) ([]byte, error) {
-	fd, err := os.Open(file.Path)
+	fd, err := os.Open(file.RealPath())
 	if err != nil {
 		return nil, err
 	}
@@ -2,24 +2,19 @@ package http

 import (
 	"encoding/json"
-	"errors"
 	"fmt"
 	"net/http"
-	"net/url"
 	"strings"

-	"golang.org/x/crypto/bcrypt"
-
 	"github.com/gtsteffaniak/filebrowser/files"
 	"github.com/gtsteffaniak/filebrowser/settings"
-	"github.com/gtsteffaniak/filebrowser/share"
 	"github.com/gtsteffaniak/filebrowser/users"

 	_ "github.com/gtsteffaniak/filebrowser/swagger/docs"
 )

 func publicShareHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (int, error) {
-	file, ok := d.raw.(*files.FileInfo)
+	file, ok := d.raw.(files.ExtendedFileInfo)
 	if !ok {
 		return http.StatusInternalServerError, fmt.Errorf("failed to assert type *files.FileInfo")
 	}
@@ -38,8 +33,8 @@ func publicUserGetHandler(w http.ResponseWriter, r *http.Request) {
 }

 func publicDlHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (int, error) {
-	file, _ := d.raw.(*files.FileInfo)
-	if file == nil {
+	file, ok := d.raw.(files.ExtendedFileInfo)
+	if !ok {
 		return http.StatusInternalServerError, fmt.Errorf("failed to assert type files.FileInfo")
 	}
 	if d.user == nil {
@@ -47,36 +42,10 @@ func publicDlHandler(w http.ResponseWriter, r *http.Request, d *requestContext)
 	}

 	if file.Type == "directory" {
-		return rawDirHandler(w, r, d, file)
+		return rawDirHandler(w, r, d, file.FileInfo)
 	}

-	return rawFileHandler(w, r, file)
-}
-
-func authenticateShareRequest(r *http.Request, l *share.Link) (int, error) {
-	if l.PasswordHash == "" {
-		return 200, nil
-	}
-
-	if r.URL.Query().Get("token") == l.Token {
-		return 200, nil
-	}
-
-	password := r.Header.Get("X-SHARE-PASSWORD")
-	password, err := url.QueryUnescape(password)
-	if err != nil {
-		return http.StatusUnauthorized, err
-	}
-	if password == "" {
-		return http.StatusUnauthorized, nil
-	}
-	if err := bcrypt.CompareHashAndPassword([]byte(l.PasswordHash), []byte(password)); err != nil {
-		if errors.Is(err, bcrypt.ErrMismatchedHashAndPassword) {
-			return http.StatusUnauthorized, nil
-		}
-		return 401, err
-	}
-	return 200, nil
+	return rawFileHandler(w, r, file.FileInfo)
 }

 // health godoc
@@ -99,7 +99,7 @@ func rawHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (int,
 		return http.StatusAccepted, nil
 	}
 	path := r.URL.Query().Get("path")
-	file, err := files.FileInfoFaster(files.FileOptions{
+	fileInfo, err := files.FileInfoFaster(files.FileOptions{
 		Path:   filepath.Join(d.user.Scope, path),
 		Modify: d.user.Perm.Modify,
 		Expand: false,
@@ -109,15 +109,19 @@ func rawHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (int,
 	if err != nil {
 		return errToStatus(err), err
 	}
-	if files.IsNamedPipe(file.Mode) {
-		setContentDisposition(w, r, file)
-		return 0, nil
-	}
-	if file.Type == "directory" {
-		return rawDirHandler(w, r, d, file)
+	// TODO, how to handle? we removed mode, is it needed?
+	// maybe instead of mode we use bool only two conditions are checked
+	//if files.IsNamedPipe(fileInfo.Mode) {
+	//	setContentDisposition(w, r, file)
+	//	return 0, nil
+	//}
+
+	if fileInfo.Type == "directory" {
+		return rawDirHandler(w, r, d, fileInfo.FileInfo)
 	}

-	return rawFileHandler(w, r, file)
+	return rawFileHandler(w, r, fileInfo.FileInfo)
 }

 func addFile(ar archiver.Writer, d *requestContext, path, commonPath string) error {
@@ -14,6 +14,7 @@ import (

 	"github.com/gtsteffaniak/filebrowser/errors"
 	"github.com/gtsteffaniak/filebrowser/files"
+	"github.com/gtsteffaniak/filebrowser/utils"
 )

 // resourceGetHandler retrieves information about a resource.
@@ -31,9 +32,10 @@ import (
 // @Failure 500 {object} map[string]string "Internal server error"
 // @Router /api/resources [get]
 func resourceGetHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (int, error) {
 	// TODO source := r.URL.Query().Get("source")
 	path := r.URL.Query().Get("path")
-	file, err := files.FileInfoFaster(files.FileOptions{
+	fileInfo, err := files.FileInfoFaster(files.FileOptions{
 		Path:       filepath.Join(d.user.Scope, path),
 		Modify:     d.user.Perm.Modify,
 		Expand:     true,
@@ -44,18 +46,19 @@ func resourceGetHandler(w http.ResponseWriter, r *http.Request, d *requestContex
 	if err != nil {
 		return errToStatus(err), err
 	}
-	if file.Type == "directory" {
-		return renderJSON(w, r, file)
+	if fileInfo.Type == "directory" {
+		return renderJSON(w, r, fileInfo)
 	}
-	if checksum := r.URL.Query().Get("checksum"); checksum != "" {
-		err := file.Checksum(checksum)
+	if algo := r.URL.Query().Get("checksum"); algo != "" {
+		checksums, err := files.GetChecksum(fileInfo.Path, algo)
 		if err == errors.ErrInvalidOption {
 			return http.StatusBadRequest, nil
 		} else if err != nil {
 			return http.StatusInternalServerError, err
 		}
+		fileInfo.Checksums = checksums
 	}
-	return renderJSON(w, r, file)
+	return renderJSON(w, r, fileInfo)

 }
@@ -90,13 +93,13 @@ func resourceDeleteHandler(w http.ResponseWriter, r *http.Request, d *requestCon
 		ReadHeader: config.Server.TypeDetectionByHeader,
 		Checker:    d.user,
 	}
-	file, err := files.FileInfoFaster(fileOpts)
+	fileInfo, err := files.FileInfoFaster(fileOpts)
 	if err != nil {
 		return errToStatus(err), err
 	}

 	// delete thumbnails
-	err = delThumbs(r.Context(), fileCache, file)
+	err = delThumbs(r.Context(), fileCache, fileInfo.FileInfo)
 	if err != nil {
 		return errToStatus(err), err
 	}
@@ -131,11 +134,10 @@ func resourcePostHandler(w http.ResponseWriter, r *http.Request, d *requestConte
 		return http.StatusForbidden, nil
 	}
 	fileOpts := files.FileOptions{
 		Path:   filepath.Join(d.user.Scope, path),
 		Modify: d.user.Perm.Modify,
 		Expand: false,
-		ReadHeader: config.Server.TypeDetectionByHeader,
-		Checker:    d.user,
+		Checker: d.user,
 	}
 	// Directories creation on POST.
 	if strings.HasSuffix(path, "/") {
|
||||||
}
|
}
|
||||||
return http.StatusOK, nil
|
return http.StatusOK, nil
|
||||||
}
|
}
|
||||||
file, err := files.FileInfoFaster(fileOpts)
|
fileInfo, err := files.FileInfoFaster(fileOpts)
|
||||||
if err == nil {
|
if err == nil {
|
||||||
if r.URL.Query().Get("override") != "true" {
|
if r.URL.Query().Get("override") != "true" {
|
||||||
return http.StatusConflict, nil
|
return http.StatusConflict, nil
|
||||||
|
@ -156,13 +158,17 @@ func resourcePostHandler(w http.ResponseWriter, r *http.Request, d *requestConte
|
||||||
return http.StatusForbidden, nil
|
return http.StatusForbidden, nil
|
||||||
}
|
}
|
||||||
|
|
||||||
err = delThumbs(r.Context(), fileCache, file)
|
err = delThumbs(r.Context(), fileCache, fileInfo.FileInfo)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
return errToStatus(err), err
|
return errToStatus(err), err
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
err = files.WriteFile(fileOpts, r.Body)
|
err = files.WriteFile(fileOpts, r.Body)
|
||||||
return errToStatus(err), err
|
if err != nil {
|
||||||
|
return errToStatus(err), err
|
||||||
|
|
||||||
|
}
|
||||||
|
return http.StatusOK, nil
|
||||||
}
|
}
|
||||||
|
|
||||||
// resourcePutHandler updates an existing file resource.
|
// resourcePutHandler updates an existing file resource.
|
||||||
|
@@ -301,7 +307,7 @@ func patchAction(ctx context.Context, action, src, dst string, d *requestContext
 	if !d.user.Perm.Rename {
 		return errors.ErrPermissionDenied
 	}
-	file, err := files.FileInfoFaster(files.FileOptions{
+	fileInfo, err := files.FileInfoFaster(files.FileOptions{
 		Path:   src,
 		IsDir:  isSrcDir,
 		Modify: d.user.Perm.Modify,
@@ -314,7 +320,7 @@ func patchAction(ctx context.Context, action, src, dst string, d *requestContext
 	}

 	// delete thumbnails
-	err = delThumbs(ctx, fileCache, file)
+	err = delThumbs(ctx, fileCache, fileInfo.FileInfo)
 	if err != nil {
 		return err
 	}
@@ -345,25 +351,29 @@ func diskUsage(w http.ResponseWriter, r *http.Request, d *requestContext) (int,
 	if source == "" {
 		source = "/"
 	}
-	file, err := files.FileInfoFaster(files.FileOptions{
-		Path:    source,
-		Checker: d.user,
-	})
+	value, ok := utils.DiskUsageCache.Get(source).(DiskUsageResponse)
+	if ok {
+		return renderJSON(w, r, &value)
+	}
+
+	fPath, isDir, err := files.GetRealPath(d.user.Scope, source)
 	if err != nil {
 		return errToStatus(err), err
 	}
-	fPath := file.RealPath()
-	if file.Type != "directory" {
-		return http.StatusBadRequest, fmt.Errorf("path is not a directory")
+	if !isDir {
+		return http.StatusNotFound, fmt.Errorf("not a directory: %s", source)
 	}
 	usage, err := disk.UsageWithContext(r.Context(), fPath)
 	if err != nil {
 		return errToStatus(err), err
 	}
-	return renderJSON(w, r, &DiskUsageResponse{
+	latestUsage := DiskUsageResponse{
 		Total: usage.Total,
 		Used:  usage.Used,
-	})
+	}
+	utils.DiskUsageCache.Set(source, latestUsage)
+	return renderJSON(w, r, &latestUsage)
 }

 func inspectIndex(w http.ResponseWriter, r *http.Request) {
@@ -122,7 +122,7 @@ func StartHttp(Service ImgService, storage *storage.Storage, cache FileCache) {
 	router.HandleFunc(config.Server.BaseURL, indexHandler)

 	// health
-	router.HandleFunc(fmt.Sprintf("GET %vhealth/", config.Server.BaseURL), healthHandler)
+	router.HandleFunc(fmt.Sprintf("GET %vhealth", config.Server.BaseURL), healthHandler)

 	// Swagger
 	router.Handle(fmt.Sprintf("%vswagger/", config.Server.BaseURL),
@@ -172,7 +172,7 @@ func StartHttp(Service ImgService, storage *storage.Storage, cache FileCache) {
 	} else {
 		// Set HTTP scheme and the default port for HTTP
 		scheme = "http"
-		if config.Server.Port != 443 {
+		if config.Server.Port != 80 {
 			port = fmt.Sprintf(":%d", config.Server.Port)
 		}
 		// Build the full URL with host and port
@@ -69,7 +69,7 @@ func shareGetsHandler(w http.ResponseWriter, r *http.Request, d *requestContext)
 		return renderJSON(w, r, []*share.Link{})
 	}
 	if err != nil {
-		return http.StatusInternalServerError, err
+		return http.StatusInternalServerError, fmt.Errorf("error getting share info from server")
 	}
 	return renderJSON(w, r, s)
 }
@@ -188,7 +188,7 @@ func getSharePasswordHash(body share.CreateBody) (data []byte, statuscode int, e

 	hash, err := bcrypt.GenerateFromPassword([]byte(body.Password), bcrypt.DefaultCost)
 	if err != nil {
-		return nil, http.StatusInternalServerError, fmt.Errorf("failed to hash password: %w", err)
+		return nil, http.StatusInternalServerError, fmt.Errorf("failed to hash password")
 	}

 	return hash, 0, nil
Binary file not shown. (removed image; before: 72 KiB)
@@ -66,7 +66,6 @@ func setDefaults() Settings {
 			EnableThumbnails:   true,
 			ResizePreview:      false,
 			EnableExec:         false,
-			IndexingInterval:   5,
 			Port:               80,
 			NumImageProcessors: 4,
 			BaseURL:            "",
@@ -1155,22 +1155,16 @@ const docTemplate = `{
        "files.FileInfo": {
            "type": "object",
            "properties": {
-               "checksums": {
-                   "type": "object",
-                   "additionalProperties": {
-                       "type": "string"
-                   }
-               },
-               "content": {
-                   "type": "string"
-               },
-               "isSymlink": {
-                   "type": "boolean"
-               },
-               "items": {
+               "files": {
                    "type": "array",
                    "items": {
-                       "$ref": "#/definitions/files.ReducedItem"
+                       "$ref": "#/definitions/files.ItemInfo"
+                   }
+               },
+               "folders": {
+                   "type": "array",
+                   "items": {
+                       "$ref": "#/definitions/files.ItemInfo"
                    }
                },
                "modified": {
|
||||||
"size": {
|
"size": {
|
||||||
"type": "integer"
|
"type": "integer"
|
||||||
},
|
},
|
||||||
"subtitles": {
|
|
||||||
"type": "array",
|
|
||||||
"items": {
|
|
||||||
"type": "string"
|
|
||||||
}
|
|
||||||
},
|
|
||||||
"token": {
|
|
||||||
"type": "string"
|
|
||||||
},
|
|
||||||
"type": {
|
"type": {
|
||||||
"type": "string"
|
"type": "string"
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"files.ReducedItem": {
|
"files.ItemInfo": {
|
||||||
"type": "object",
|
"type": "object",
|
||||||
"properties": {
|
"properties": {
|
||||||
"content": {
|
|
||||||
"type": "string"
|
|
||||||
},
|
|
||||||
"modified": {
|
"modified": {
|
||||||
"type": "string"
|
"type": "string"
|
||||||
},
|
},
|
||||||
|
|
|
@@ -1144,22 +1144,16 @@
        "files.FileInfo": {
            "type": "object",
            "properties": {
-               "checksums": {
-                   "type": "object",
-                   "additionalProperties": {
-                       "type": "string"
-                   }
-               },
-               "content": {
-                   "type": "string"
-               },
-               "isSymlink": {
-                   "type": "boolean"
-               },
-               "items": {
+               "files": {
                    "type": "array",
                    "items": {
-                       "$ref": "#/definitions/files.ReducedItem"
+                       "$ref": "#/definitions/files.ItemInfo"
+                   }
+               },
+               "folders": {
+                   "type": "array",
+                   "items": {
+                       "$ref": "#/definitions/files.ItemInfo"
                    }
                },
                "modified": {
|
||||||
"size": {
|
"size": {
|
||||||
"type": "integer"
|
"type": "integer"
|
||||||
},
|
},
|
||||||
"subtitles": {
|
|
||||||
"type": "array",
|
|
||||||
"items": {
|
|
||||||
"type": "string"
|
|
||||||
}
|
|
||||||
},
|
|
||||||
"token": {
|
|
||||||
"type": "string"
|
|
||||||
},
|
|
||||||
"type": {
|
"type": {
|
||||||
"type": "string"
|
"type": "string"
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"files.ReducedItem": {
|
"files.ItemInfo": {
|
||||||
"type": "object",
|
"type": "object",
|
||||||
"properties": {
|
"properties": {
|
||||||
"content": {
|
|
||||||
"type": "string"
|
|
||||||
},
|
|
||||||
"modified": {
|
"modified": {
|
||||||
"type": "string"
|
"type": "string"
|
||||||
},
|
},
|
||||||
|
|
|
@@ -1,17 +1,13 @@
 definitions:
   files.FileInfo:
     properties:
-      checksums:
-        additionalProperties:
-          type: string
-        type: object
-      content:
-        type: string
-      isSymlink:
-        type: boolean
-      items:
+      files:
         items:
-          $ref: '#/definitions/files.ReducedItem'
+          $ref: '#/definitions/files.ItemInfo'
+        type: array
+      folders:
+        items:
+          $ref: '#/definitions/files.ItemInfo'
         type: array
       modified:
         type: string
|
||||||
type: string
|
type: string
|
||||||
size:
|
size:
|
||||||
type: integer
|
type: integer
|
||||||
subtitles:
|
|
||||||
items:
|
|
||||||
type: string
|
|
||||||
type: array
|
|
||||||
token:
|
|
||||||
type: string
|
|
||||||
type:
|
type:
|
||||||
type: string
|
type: string
|
||||||
type: object
|
type: object
|
||||||
files.ReducedItem:
|
files.ItemInfo:
|
||||||
properties:
|
properties:
|
||||||
content:
|
|
||||||
type: string
|
|
||||||
modified:
|
modified:
|
||||||
type: string
|
type: string
|
||||||
name:
|
name:
|
||||||
|
|
|
@@ -0,0 +1,80 @@
package utils

import (
	"sync"
	"time"
)

var (
	DiskUsageCache     = newCache(30*time.Second, 24*time.Hour)
	RealPathCache      = newCache(48*time.Hour, 72*time.Hour)
	SearchResultsCache = newCache(15*time.Second, time.Hour)
)

func newCache(expires time.Duration, cleanup time.Duration) *KeyCache {
	newCache := KeyCache{
		data:         make(map[string]cachedValue),
		expiresAfter: expires, // default
	}
	go newCache.cleanupExpiredJob(cleanup)
	return &newCache
}

type KeyCache struct {
	data         map[string]cachedValue
	mu           sync.RWMutex
	expiresAfter time.Duration
}

type cachedValue struct {
	value     interface{}
	expiresAt time.Time
}

func (c *KeyCache) Set(key string, value interface{}) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = cachedValue{
		value:     value,
		expiresAt: time.Now().Add(c.expiresAfter),
	}
}

func (c *KeyCache) SetWithExp(key string, value interface{}, exp time.Duration) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = cachedValue{
		value:     value,
		expiresAt: time.Now().Add(exp),
	}
}

func (c *KeyCache) Get(key string) interface{} {
	c.mu.RLock()
	defer c.mu.RUnlock()
	cached, ok := c.data[key]
	if !ok || time.Now().After(cached.expiresAt) {
		return nil
	}
	return cached.value
}

func (c *KeyCache) cleanupExpired() {
	c.mu.Lock()
	defer c.mu.Unlock()
	now := time.Now()
	for key, cached := range c.data {
		if now.After(cached.expiresAt) {
			delete(c.data, key)
		}
	}
}

// should automatically run for all cache types as part of init.
func (c *KeyCache) cleanupExpiredJob(frequency time.Duration) {
	ticker := time.NewTicker(frequency)
	defer ticker.Stop()
	for range ticker.C {
		c.cleanupExpired()
	}
}
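For orientation, here is a minimal usage sketch of the `KeyCache` above. The keys, the values, and the `github.com/gtsteffaniak/filebrowser/utils` import path are assumptions made for the example, not actual call sites from this commit:

```go
package main

import (
	"fmt"
	"time"

	"github.com/gtsteffaniak/filebrowser/utils" // assumed import path for the package above
)

func main() {
	// Set stores a value under the cache's default TTL (30s for DiskUsageCache).
	utils.DiskUsageCache.Set("/mnt/data", int64(1234567))

	// SetWithExp overrides the TTL for a single entry.
	utils.SearchResultsCache.SetWithExp("query:readme", []string{"README.md"}, 10*time.Second)

	// Get returns nil once an entry expires, so callers nil-check and type-assert.
	if v := utils.DiskUsageCache.Get("/mnt/data"); v != nil {
		fmt.Println("disk usage:", v.(int64))
	}
}
```

Because `Get` simply returns `nil` after expiry, expired entries behave like cache misses even before the background `cleanupExpiredJob` ticker physically removes them.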
@@ -69,3 +69,18 @@ func PrintStructFields(v interface{}) {
 		fmt.Printf("Field: %s, %s\n", fieldType.Name, fieldValue)
 	}
 }
+
+func GetParentDirectoryPath(path string) string {
+	if path == "/" || path == "" {
+		return ""
+	}
+	path = strings.TrimSuffix(path, "/") // Remove trailing slash if any
+	lastSlash := strings.LastIndex(path, "/")
+	if lastSlash == -1 {
+		return "" // No parent directory for a relative path without slashes
+	}
+	if lastSlash == 0 {
+		return "/" // If the last slash is the first character, return root
+	}
+	return path[:lastSlash]
+}
@@ -0,0 +1,59 @@
package utils

import (
	"testing"
)

func TestGetParentDirectoryPath(t *testing.T) {
	tests := []struct {
		input          string
		expectedOutput string
	}{
		{input: "/", expectedOutput: ""},                  // Root directory
		{input: "/subfolder", expectedOutput: "/"},        // Single subfolder
		{input: "/sub/sub/", expectedOutput: "/sub"},      // Nested subfolder with trailing slash
		{input: "/subfolder/", expectedOutput: "/"},       // Single subfolder with trailing slash
		{input: "", expectedOutput: ""},                   // Empty string treated as root
		{input: "/sub/subfolder", expectedOutput: "/sub"}, // Nested subfolder
		{input: "/sub/subfolder/deep/nested/", expectedOutput: "/sub/subfolder/deep"}, // Deeply nested path with trailing slash
	}

	for _, test := range tests {
		t.Run(test.input, func(t *testing.T) {
			actualOutput := GetParentDirectoryPath(test.input)
			if actualOutput != test.expectedOutput {
				t.Errorf("\n\tinput %q\n\texpected %q\n\tgot %q",
					test.input, test.expectedOutput, actualOutput)
			}
		})
	}
}

func TestCapitalizeFirst(t *testing.T) {
	tests := []struct {
		input          string
		expectedOutput string
	}{
		{input: "", expectedOutput: ""},                               // Empty string
		{input: "a", expectedOutput: "A"},                             // Single lowercase letter
		{input: "A", expectedOutput: "A"},                             // Single uppercase letter
		{input: "hello", expectedOutput: "Hello"},                     // All lowercase
		{input: "Hello", expectedOutput: "Hello"},                     // Already capitalized
		{input: "123hello", expectedOutput: "123hello"},               // Non-alphabetic first character
		{input: "hELLO", expectedOutput: "HELLO"},                     // Mixed case
		{input: " hello", expectedOutput: " hello"},                   // Leading space, no capitalization
		{input: "hello world", expectedOutput: "Hello world"},         // Phrase with spaces
		{input: " hello world", expectedOutput: " hello world"},       // Phrase with leading space
		{input: "123 hello world", expectedOutput: "123 hello world"}, // Numbers before text
	}

	for _, test := range tests {
		t.Run(test.input, func(t *testing.T) {
			actualOutput := CapitalizeFirst(test.input)
			if actualOutput != test.expectedOutput {
				t.Errorf("\n\tinput %q\n\texpected %q\n\tgot %q",
					test.input, test.expectedOutput, actualOutput)
			}
		})
	}
}
@@ -10,7 +10,6 @@ Here is an expanded config file which includes all possible configurations:
 server:
   CreateUserDir: false
   UserHomeBasePath: ""
-  indexingInterval: 5
   indexing: true
   numImageProcessors: 4
   socket: ""
@@ -71,7 +70,6 @@ Here are the defaults if nothing is set:
 server:
   enableThumbnails: true
   enableExec: false
-  indexingInterval: 5
   port: 80
   numImageProcessors: 4
   baseURL: ""
@@ -109,7 +107,7 @@ userDefaults:

 ### Server configuration settings

-- `indexingInterval`: This is the time in minutes the system waits before checking for filesystem changes. Default: `5`
+- `indexingInterval`: This optional parameter disables smart indexing and specifies a time in minutes the system waits before checking for filesystem changes. See the [indexing readme](indexing.md) for more information.

 - `indexing`: This enables or disables indexing. (Note: search will not work without indexing.) Default: `true`
@@ -1,2 +1,13 @@
 # Contributing Guide
+
+If you would like to contribute, please open a pull request against main or the latest `dev_` branch that's currently in progress.
+
+A PR is required to have:
+
+1. A clear description of why it was opened.
+2. A short title that best describes the issue.
+3. Test evidence for anything that is not self-evident or covered by unit tests.
+
+Unit tests should be updated to pass before merging, so the best way to handle this is to create a fork, test your changes there, and then merge to this repo. You can also create a draft pull request if it is not fully ready.
+
+Please don't hesitate to open an issue for any ideas you have but cannot contribute directly for whatever reason.
@@ -0,0 +1,189 @@
# About Indexing on FileBrowser Quantum

The most significant feature is the index; this document intends to demystify how it works, so you can understand how to ensure your index closely matches the current state of your filesystem.

## How does the index work?

The approach used by this repo includes filesystem watchers that periodically scan the directory tree for changes. By default, this uses a smart scan strategy, but you can also configure a set interval in your config file.

The `scan interval` is the break time between scans and does not include the time a scan takes. A typical scan can vary dramatically, but here are some expectations for SSD-based disks:

| # folders | # files | time to index | memory usage (RAM) |
|---|---|---|---|
| 10,000 | 10,000 | ~ 0-5 seconds | 15 MB |
| 2,000 | 250,000 | ~ 0-5 seconds | 300 MB |
| 50,000 | 50,000 | ~ 5-30 seconds | 150 MB |
| 250,000 | 10,000 | ~ 2-5 minutes | 300 MB |
| 500,000 | 500,000 | ~ 5+ minutes | 500+ MB |
### Smart Scanning

1. There is a floating `smart scan interval` that ranges from **1 minute - 4 hours**, depending on the complexity of your filesystem.
2. The smart interval changes based on how often it discovers changed files:
   - ```
     // Schedule in minutes
     var scanSchedule = []time.Duration{
         5 * time.Minute, // 5 minute quick scan & 25 minutes for a full scan
         10 * time.Minute,
         20 * time.Minute, // [3] element is 20 minutes, reset anchor for full scan
         40 * time.Minute,
         1 * time.Hour,
         2 * time.Hour,
         3 * time.Hour,
         4 * time.Hour, // 4 hours for quick scan & 20 hours for a full scan
     }
     ```
3. The `smart scan interval` performs a `quick scan` 4 times in a row, followed by a 5th `full scan` which completely rebuilds the index.
   - A `quick scan` is limited to detecting directory changes, but is 10x faster than a full scan. Here is what a quick scan can see:
     1. New files or folders created.
     2. Files or folders deleted.
     3. Renaming of files or folders.
   - A quick scan **cannot** detect when a file has been updated, for example when you save a file and its size increases.
   - A `full scan` is a complete re-indexing. This is always more disk- and computationally intense, but will capture individual file changes.
4. The `smart scan interval` changes based on several things. A `simple` complexity enables scans every 1 minute if changes happen frequently, with a maximum full scan interval of every 100 minutes. A `high` complexity indicates a minimum scanning interval of 10 minutes. A sketch of how the interval moves through this schedule follows this list.
   - **Under 10,000 folders** or **under 3 seconds** to index is always considered `simple` complexity.
   - **More than 500,000 folders** or **over 2 minutes** to index is always considered `high` complexity.
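To make the back-off idea concrete, here is a minimal Go sketch of how a scanner could walk the `scanSchedule` above: advance toward the 4-hour ceiling while scans find nothing, and reset to the aggressive end when changes are detected. This is an illustration only -- `nextScheduleIndex` is a hypothetical helper, and the real scanner also weighs filesystem complexity, which this version does not:

```go
package main

import (
	"fmt"
	"time"
)

// The schedule from the section above, shortest to longest interval.
var scanSchedule = []time.Duration{
	5 * time.Minute, 10 * time.Minute, 20 * time.Minute, 40 * time.Minute,
	1 * time.Hour, 2 * time.Hour, 3 * time.Hour, 4 * time.Hour,
}

// nextScheduleIndex backs off toward the longest interval while scans find
// nothing, and resets to the aggressive end as soon as changes are detected.
func nextScheduleIndex(current int, changesFound bool) int {
	if changesFound {
		return 0
	}
	if current < len(scanSchedule)-1 {
		return current + 1
	}
	return current
}

func main() {
	idx := 0
	for _, changed := range []bool{false, false, true, false} {
		idx = nextScheduleIndex(idx, changed)
		fmt.Println("next scan in", scanSchedule[idx])
	}
}
```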
### Manual Scanning Interval

If you don't like the behavior of smart scanning, you can configure set intervals instead by setting `indexingInterval` to a number greater than 0. This will make FileBrowser Quantum always scan at the given interval in minutes.

The scan behavior is still 4 quick scans at the given interval, followed by a 5th full scan.

### System requirements

You can expect FileBrowser Quantum to use 100 MB of RAM for a typical installation. If you have many files and folders, the requirement could climb to multiple gigabytes. Please monitor your system on the first run to learn your specific requirements.

### Why does FileBrowser Quantum index the way it does?

The smart indexing method uses filesystem scanners because it allows a low-footprint design that can cater to individual filesystem complexity. There are a few options for monitoring a filesystem for changes:

1. **Option 1**: Recursive traversal with ReadDir
   - This is quite computationally intensive but creates an accurate record of the filesystem.
   - Requires periodic scanning to remain updated.
   - Low overhead and straightforward implementation.
2. **Option 2**: Use file system monitoring (real-time or periodic check) such as `fsnotify`
   - This allows for event-based reactions to filesystem changes.
   - Requires extra overhead.
   - Relies on OS-level features, and behavior differs between OSes.
   - Requires OS-level configuration of ulimits in order to properly watch a large filesystem.
3. **Option 3**: Directory metadata heuristics (sketched after this list)
   - Uses ModTime to determine when directory structures change.
   - Has minimal insight into actual file changes.
   - Much faster to scan for changes than recursive traversal.
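As an illustration of Option 3 (a sketch under assumed details, not the project's actual code), a scan can compare each directory's ModTime against the value recorded at the previous scan. A directory's ModTime updates when entries are created, deleted, or renamed directly inside it -- but not when an existing file's contents change, which is exactly why quick scans cannot see content-only updates:

```go
package main

import (
	"fmt"
	"os"
	"time"
)

// lastScan records the ModTime observed for each directory at the previous
// scan; it is illustrative state for this sketch.
var lastScan = map[string]time.Time{}

// dirChanged reports whether a directory's entries changed since the last scan.
func dirChanged(path string) (bool, error) {
	info, err := os.Stat(path)
	if err != nil {
		return false, err
	}
	changed := !info.ModTime().Equal(lastScan[path])
	lastScan[path] = info.ModTime()
	return changed, nil
}

func main() {
	changed, err := dirChanged("/tmp")
	if err != nil {
		fmt.Println("stat error:", err)
		return
	}
	fmt.Println("changed since last scan:", changed)
}
```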
Ultimately, FileBrowser Quantum uses a combination of options 1 and 3 to perform index updates. Using something like fsnotify is a non-starter for large filesystems, where it would require manual host OS tuning to work at all. Besides, I can essentially offer the same behavior by creating "watchers" for top-level folders (a feature to come in the future). However, right now there is a single root-level watcher that works over the entire index.

The main disadvantage of the approach is the delay caused by the scanning interval.

### How to manually refresh the index?

There is currently no way to manually trigger a new full indexing. This will come in a future release when the "jobs" functionality is added back.

However, if you want to force-refresh a certain directory, this happens every time you **view it** in the UI or via the resources API.

This also means the resources API is always up to date with the current status of the filesystem. When you "look" at a specific folder, you are causing the index to be refreshed at that location.

### What information does the index have?

You can see what the index looks like by using the resources API via the GET method, which returns individual directory information -- all of this information is stored in the index.

Here is an example:

```
{
  "name": "filebrowser",
  "size": 274467914,
  "modified": "2024-11-23T19:18:57.68013727-06:00",
  "type": "directory",
  "files": [
    {
      "name": ".dockerignore",
      "size": 73,
      "modified": "2024-11-20T18:14:44.91135413-06:00",
      "type": "blob"
    },
    {
      "name": ".DS_Store",
      "size": 6148,
      "modified": "2024-11-22T14:45:15.901211088-06:00",
      "type": "blob"
    },
    {
      "name": ".gitignore",
      "size": 455,
      "modified": "2024-11-23T19:18:57.616132373-06:00",
      "type": "blob"
    },
    {
      "name": "CHANGELOG.md",
      "size": 9325,
      "modified": "2024-11-23T19:18:57.616646332-06:00",
      "type": "text"
    },
    {
      "name": "Dockerfile",
      "size": 769,
      "modified": "2024-11-23T19:18:57.616941333-06:00",
      "type": "blob"
    },
    {
      "name": "Dockerfile.playwright",
      "size": 542,
      "modified": "2024-11-23T19:18:57.617151875-06:00",
      "type": "blob"
    },
    {
      "name": "makefile",
      "size": 1311,
      "modified": "2024-11-23T19:18:57.68017352-06:00",
      "type": "blob"
    },
    {
      "name": "README.md",
      "size": 10625,
      "modified": "2024-11-23T19:18:57.617464334-06:00",
      "type": "text"
    }
  ],
  "folders": [
    {
      "name": ".git",
      "size": 60075460,
      "modified": "2024-11-24T14:44:42.52180215-06:00",
      "type": "directory"
    },
    {
      "name": ".github",
      "size": 11584,
      "modified": "2024-11-20T18:14:44.911805335-06:00",
      "type": "directory"
    },
    {
      "name": "backend",
      "size": 29247172,
      "modified": "2024-11-23T19:18:57.667109624-06:00",
      "type": "directory"
    },
    {
      "name": "docs",
      "size": 14272,
      "modified": "2024-11-24T13:46:12.082024018-06:00",
      "type": "directory"
    },
    {
      "name": "frontend",
      "size": 185090178,
      "modified": "2024-11-24T14:44:39.880678934-06:00",
      "type": "directory"
    }
  ],
  "path": "/filebrowser"
}
```

### Can I disable the index and still use FileBrowser Quantum?

You can disable the index by setting `indexing: false` in your config file. You will still be able to browse your files, but search will not work, and you may run into issues, as the program is not intended to be used without indexing.

I'm not sure why you would run it like this; if you have a good reason, please open an issue describing how you would like it to work -- and why you would run it without the index.
@@ -1,18 +1,18 @@
 # Planned Roadmap

-upcoming 0.3.x releases:
+upcoming 0.3.x releases, ordered by priority:

+- More filetype icons and refreshed icons.
+- more filetype previews - eg. office, photoshop, vector, 3d files.
+- Enable mobile search with same features as desktop
+- Enable mobile search with same features as desktop
 - Theme configuration from settings
-- File synchronization improvements
-- more filetype previews
 - introduce jobs as replacement to runners.
 - Add Job status to the sidebar
 - index status.
 - Job status from users
 - upload status
 - opentelemetry metrics
-- simple search/filter for current listings.
-- Enable mobile search with same features as desktop

 Unplanned Future releases:
 - multiple sources https://github.com/filebrowser/filebrowser/issues/2514
@@ -13,10 +13,11 @@
     "build-docker": "vite build",
     "watch": "vite build --watch",
     "typecheck": "vue-tsc -p ./tsconfig.json --noEmit",
-    "lint": "npm run typecheck && eslint src/",
+    "lint": "eslint --ext .js,.vue,ts src",
     "lint:fix": "eslint --fix src/",
     "format": "prettier --write .",
-    "test": "npx playwright test"
+    "integration-test": "npx playwright test",
+    "test": "vitest run "
   },
   "dependencies": {
     "ace-builds": "^1.24.2",
@@ -32,15 +33,17 @@
     "vue-router": "^4.3.0"
   },
   "devDependencies": {
-    "@playwright/test": "^1.42.1",
     "@intlify/unplugin-vue-i18n": "^4.0.0",
+    "@playwright/test": "^1.42.1",
     "@vitejs/plugin-vue": "^5.0.4",
     "@vue/eslint-config-typescript": "^13.0.0",
     "eslint": "^8.57.0",
-    "eslint-plugin-prettier": "^5.1.3",
+    "eslint-config-prettier": "^9.1.0",
     "eslint-plugin-vue": "^9.24.0",
+    "jsdom": "^25.0.1",
     "vite": "^5.2.7",
     "vite-plugin-compression2": "^1.0.0",
+    "vitest": "^2.1.5",
     "vue-tsc": "^2.0.7"
   }
 }
@@ -1,5 +1,4 @@
-import { createURL, fetchURL, adjustedData} from "./utils";
-import { baseURL } from "@/utils/constants";
+import { createURL, fetchURL, adjustedData } from "./utils";
 import { removePrefix, getApiPath } from "@/utils/url.js";
 import { state } from "@/store";
 import { notify } from "@/notify";
@@ -7,11 +6,12 @@ import { notify } from "@/notify";
 // Notify if errors occur
 export async function fetchFiles(url, content = false) {
   try {
-    url = removePrefix(url,"files");
-    const apiPath = getApiPath("api/resources",{path: url, content: content});
+    let path = removePrefix(url, "files");
+    const apiPath = getApiPath("api/resources",{path: path, content: content});
     const res = await fetchURL(apiPath);
     const data = await res.json();
-    return adjustedData(data,url);
+    const adjusted = adjustedData(data, url);
+    return adjusted;
   } catch (err) {
     notify.showError(err.message || "Error fetching data");
     throw err;
@@ -64,7 +64,7 @@ export function download(format, ...files) {
       fileargs = fileargs.substring(0, fileargs.length - 1);
     }
     const apiPath = getApiPath("api/raw",{path: path, files: fileargs, algo: format});
-    let url = `${baseURL}${apiPath}`;
+    const url = createURL(`${apiPath}`);
     window.open(url);
   } catch (err) {
     notify.showError(err.message || "Error downloading files");
@@ -155,10 +155,11 @@ export async function checksum(url, algo) {
 export function getDownloadURL(path, inline) {
   try {
     const params = {
-      path: path,
+      path: removePrefix(path,"files"),
       ...(inline && { inline: "true" }),
     };
-    return createURL("api/raw", params);
+    const apiPath = getApiPath("api/raw", params);
+    return createURL(apiPath);
   } catch (err) {
     notify.showError(err.message || "Error getting download URL");
     throw err;
@@ -173,8 +174,8 @@ export function getPreviewURL(path, size, modified) {
       key: Date.parse(modified),
       inline: "true",
     };
-    return createURL("api/preview", params);
+    const apiPath = getApiPath("api/preview", params);
+    return createURL(apiPath);
   } catch (err) {
     notify.showError(err.message || "Error getting preview URL");
     throw err;
@@ -183,13 +184,14 @@ export function getPreviewURL(path, size, modified) {

 export function getSubtitlesURL(file) {
   try {
-    const params = {
-      inline: "true",
-    };
-
     const subtitles = [];
     for (const sub of file.subtitles) {
-      subtitles.push(createURL("api/raw" + sub, params));
+      const params = {
+        inline: "true",
+        path: sub
+      };
+      const apiPath = getApiPath("api/raw", params);
+      subtitles.push(createURL(apiPath));
     }

     return subtitles;
@@ -1,53 +1,48 @@
 import { createURL, adjustedData } from "./utils";
-import { getApiPath } from "@/utils/url.js";
+import { getApiPath, removePrefix } from "@/utils/url.js";
 import { notify } from "@/notify";

 // Fetch public share data
 export async function fetchPub(path, hash, password = "") {
-  try {
-    const params = { path, hash }
-    const apiPath = getApiPath("api/public/share", params);
-    const response = await fetch(apiPath, {
-      headers: {
-        "X-SHARE-PASSWORD": password ? encodeURIComponent(password) : "",
-      },
-    });
-
-    if (!response.ok) {
-      const error = new Error("Failed to connect to the server.");
-      error.status = response.status;
-      throw error;
-    }
-    let data = await response.json()
-    return adjustedData(data, `${hash}${path}`);
-  } catch (err) {
-    notify.showError(err.message || "Error fetching public share data");
-    throw err;
+  const params = { path, hash }
+  const apiPath = getApiPath("api/public/share", params);
+  const response = await fetch(apiPath, {
+    headers: {
+      "X-SHARE-PASSWORD": password ? encodeURIComponent(password) : "",
+    },
+  });
+
+  if (!response.ok) {
+    const error = new Error("Failed to connect to the server.");
+    error.status = response.status;
+    throw error;
   }
+  let data = await response.json()
+  const adjusted = adjustedData(data, getApiPath(`share/${hash}${path}`));
+  return adjusted
 }

 // Download files with given parameters
-export function download(path, hash, token, format, ...files) {
+export function download(share, ...files) {
   try {
     let fileInfo = files[0]
     if (files.length > 1) {
       fileInfo = files.map(encodeURIComponent).join(",");
     }
     const params = {
-      path,
-      hash,
-      ...(format && { format}),
-      ...(token && { token }),
-      fileInfo
+      "path": removePrefix(share.path, "share"),
+      "hash": share.hash,
+      "token": share.token,
+      "inline": share.inline,
+      "files": fileInfo,
     };
-    const url = createURL(`api/public/dl`, params, false);
+    const apiPath = getApiPath("api/public/dl", params);
+    const url = createURL(apiPath);
     window.open(url);
   } catch (err) {
     notify.showError(err.message || "Error downloading files");
     throw err;
   }
 }

 // Get the public user data
@@ -64,11 +59,7 @@ export async function getPublicUser() {

 // Generate a download URL
 export function getDownloadURL(share) {
-  const params = {
-    "path": share.path,
-    "hash": share.hash,
-    "token": share.token,
-    ...(share.inline && { inline: "true" }),
-  };
-  return createURL(`api/public/dl`, params, false);
+  const apiPath = getApiPath("api/public/dl", share);
+  const url = createURL(apiPath)
+  return url
 }
@@ -60,36 +60,47 @@ export async function fetchJSON(url, opts) {
   }
 }

-export function createURL(endpoint, params = {}) {
+export function createURL(endpoint) {
   let prefix = baseURL;
+
+  // Ensure prefix ends with a single slash
   if (!prefix.endsWith("/")) {
-    prefix = prefix + "/";
+    prefix += "/";
   }
-  const url = new URL(prefix + endpoint, origin);

-  const searchParams = {
-    ...params,
-  };
-
-  for (const key in searchParams) {
-    url.searchParams.set(key, searchParams[key]);
+  // Remove leading slash from endpoint to avoid duplicate slashes
+  if (endpoint.startsWith("/")) {
+    endpoint = endpoint.substring(1);
   }

+  const url = new URL(prefix + endpoint, window.location.origin);
+
   return url.toString();
 }

 export function adjustedData(data, url) {
   data.url = url;
-  if (data.type == "directory") {
+
+  if (data.type === "directory") {
     if (!data.url.endsWith("/")) data.url += "/";
+
+    // Combine folders and files into items
+    data.items = [...(data.folders || []), ...(data.files || [])];
+
     data.items = data.items.map((item, index) => {
       item.index = index;
       item.url = `${data.url}${item.name}`;
-      if (item.type == "directory") {
+      if (item.type === "directory") {
         item.url += "/";
       }
       return item;
     });
   }
-  return data
-}
+  if (data.files) {
+    data.files = []
+  }
+  if (data.folders) {
+    data.folders = []
+  }
+  return data;
+}
@@ -0,0 +1,114 @@
import { describe, it, expect, vi } from 'vitest';
import { adjustedData, createURL } from './utils.js';

describe('adjustedData', () => {
  it('should append the URL and process directory data correctly', () => {
    const input = {
      type: "directory",
      folders: [
        { name: "folder1", type: "directory" },
        { name: "folder2", type: "directory" },
      ],
      files: [
        { name: "file1.txt", type: "file" },
        { name: "file2.txt", type: "file" },
      ],
    };

    const url = "http://example.com/unit-testing/files/path/to/directory";

    const expected = {
      type: "directory",
      url: "http://example.com/unit-testing/files/path/to/directory/",
      folders: [],
      files: [],
      items: [
        { name: "folder1", type: "directory", index: 0, url: "http://example.com/unit-testing/files/path/to/directory/folder1/" },
        { name: "folder2", type: "directory", index: 1, url: "http://example.com/unit-testing/files/path/to/directory/folder2/" },
        { name: "file1.txt", type: "file", index: 2, url: "http://example.com/unit-testing/files/path/to/directory/file1.txt" },
        { name: "file2.txt", type: "file", index: 3, url: "http://example.com/unit-testing/files/path/to/directory/file2.txt" },
      ],
    };

    expect(adjustedData(input, url)).toEqual(expected);
  });

  it('should add a trailing slash to the URL if missing for a directory', () => {
    const input = { type: "directory", folders: [], files: [] };
    const url = "http://example.com/base";

    const expected = {
      type: "directory",
      url: "http://example.com/base/",
      folders: [],
      files: [],
      items: [],
    };

    expect(adjustedData(input, url)).toEqual(expected);
  });

  it('should handle non-directory types without modification to items', () => {
    const input = { type: "file", name: "file1.txt" };
    const url = "http://example.com/base";

    const expected = {
      type: "file",
      name: "file1.txt",
      url: "http://example.com/base",
    };

    expect(adjustedData(input, url)).toEqual(expected);
  });

  it('should handle missing folders and files gracefully', () => {
    const input = { type: "directory" };
    const url = "http://example.com/base";

    const expected = {
      type: "directory",
      url: "http://example.com/base/",
      items: [],
    };

    expect(adjustedData(input, url)).toEqual(expected);
  });

  it('should handle empty input object correctly', () => {
    const input = {};
    const url = "http://example.com/base";

    const expected = {
      url: "http://example.com/base",
    };

    expect(adjustedData(input, url)).toEqual(expected);
  });

});

describe('createURL', () => {
  it('createURL', () => {
    const url = "base";
    const expected = "http://localhost:3000/unit-testing/base"
    expect(createURL(url)).toEqual(expected);
  });
  it('createURL with slash', () => {
    const url = "/base";
    const expected = "http://localhost:3000/unit-testing/base"
    expect(createURL(url)).toEqual(expected);
  });
  it('createURL with slash', () => {
    const url = "/base";
    const expected = "http://localhost:3000/unit-testing/base"
    expect(createURL(url)).toEqual(expected);
  });
})

vi.mock('@/utils/constants', () => {
  return {
    baseURL: "unit-testing",
  };
});
@@ -21,7 +21,6 @@
       type="range"
       id="gallery-size"
       name="gallery-size"
-      :value="gallerySize"
       min="0"
       max="10"
       @input="updateGallerySize"
@@ -62,6 +61,9 @@ export default {
       if (parts[0] === "") {
         parts.shift();
       }
+      if (getters.currentView() == "share") {
+        parts.shift();
+      }

       if (parts[parts.length - 1] === "") {
         parts.pop();
@@ -76,6 +76,7 @@ import { filesApi } from "@/api";
 import * as upload from "@/utils/upload";
 import { state, getters, mutations } from "@/store"; // Import your custom store
 import { baseURL } from "@/utils/constants";
+import { router } from "@/router";

 export default {
   name: "item",
@@ -323,7 +324,7 @@ export default {
       mutations.addSelected(this.index);
     },
     open() {
-      this.$router.push({ path: this.url });
+      router.push({ path: this.url });
     },
   },
 };
@@ -59,7 +59,7 @@ export default {
       if (!this.isListing) {
         await filesApi.remove(state.route.path);
         buttons.success("delete");
-        showSuccess("Deleted item successfully");
+        notify.showSuccess("Deleted item successfully");

         this.currentPrompt?.confirm();
         this.closeHovers();
@@ -79,7 +79,7 @@ export default {

       await Promise.all(promises);
       buttons.success("delete");
-      showSuccess("Deleted item successfully");
+      notify.showSuccess("Deleted item successfully");
       mutations.setReload(true); // Handle reload as needed
     } catch (e) {
       buttons.done("delete");
@@ -8,50 +8,52 @@
 <template v-if="listing">
   <div class="card-content">
     <table>
+      <tbody>
       <tr>
         <th>#</th>
         <th>{{ $t("settings.shareDuration") }}</th>
         <th></th>
         <th></th>
       </tr>

       <tr v-for="link in links" :key="link.hash">
         <td>{{ link.hash }}</td>
         <td>
           <template v-if="link.expire !== 0">{{ humanTime(link.expire) }}</template>
           <template v-else>{{ $t("permanent") }}</template>
         </td>
         <td class="small">
           <button
             class="action copy-clipboard"
             :data-clipboard-text="buildLink(link)"
             :aria-label="$t('buttons.copyToClipboard')"
             :title="$t('buttons.copyToClipboard')"
           >
             <i class="material-icons">content_paste</i>
           </button>
         </td>
         <td class="small" v-if="hasDownloadLink()">
           <button
             class="action copy-clipboard"
             :data-clipboard-text="buildDownloadLink(link)"
             :aria-label="$t('buttons.copyDownloadLinkToClipboard')"
             :title="$t('buttons.copyDownloadLinkToClipboard')"
           >
             <i class="material-icons">content_paste_go</i>
           </button>
         </td>
         <td class="small">
           <button
             class="action"
             @click="deleteLink($event, link)"
             :aria-label="$t('buttons.delete')"
             :title="$t('buttons.delete')"
           >
             <i class="material-icons">delete</i>
           </button>
         </td>
       </tr>
+      </tbody>
     </table>
   </div>
@@ -7,8 +7,6 @@ import Settings from "@/views/Settings.vue";
 import Errors from "@/views/Errors.vue";
 import { baseURL, name } from "@/utils/constants";
 import { getters, state } from "@/store";
-import { recaptcha, loginPage } from "@/utils/constants";
-import { validateLogin } from "@/utils/auth";
 import { mutations } from "@/store";
 import i18n from "@/i18n";

@@ -106,8 +104,12 @@ const routes = [
   },
   {
     path: "/:catchAll(.*)*",
-    redirect: (to: RouteLocation) =>
-      `/files/${[...to.params.catchAll].join("/")}`,
+    redirect: (to: RouteLocation) => {
+      const path = Array.isArray(to.params.catchAll)
+        ? to.params.catchAll.join("/")
+        : to.params.catchAll || "";
+      return `/files/${path}`;
+    },
   },
 ];

@@ -116,45 +118,28 @@ const router = createRouter({
   routes,
 });

-async function initAuth() {
-  if (loginPage && !getters.isShare()) {
-    await validateLogin();
-  }
-  if (recaptcha) {
-    await new Promise<void>((resolve) => {
-      const check = () => {
-        if (typeof window.grecaptcha === "undefined") {
-          setTimeout(check, 100);
-        } else {
-          resolve();
-        }
-      };
-      check();
-    });
-  }
+// Helper function to check if a route resolves to itself
+function isSameRoute(to: RouteLocation, from: RouteLocation) {
+  return to.path === from.path && JSON.stringify(to.params) === JSON.stringify(from.params);
 }

 router.beforeResolve(async (to, from, next) => {
-  mutations.closeHovers()
+  console.log("Navigating to", to.path,from.path);
+  if (isSameRoute(to, from)) {
+    console.warn("Avoiding recursive navigation to the same route.");
+    return next(false);
+  }
+
+  // Set the page title using i18n
   const title = i18n.global.t(titles[to.name as keyof typeof titles]);
   document.title = title + " - " + name;
-  mutations.setRoute(to)
-  if (to.path.endsWith("/login") && getters.isLoggedIn()) {
-    next({ path: "/files/" });
-    return;
-  }
-  // this will only be null on first route
-  if (to.name != "Login") {
-    try {
-      await initAuth();
-    } catch (error) {
-      console.error(error);
-    }
-  }
+
+  // Update store with the current route
+  mutations.setRoute(to);
+
+  // Handle auth requirements
   if (to.matched.some((record) => record.meta.requiresAuth)) {
     if (!getters.isLoggedIn()) {
-      console.log("not logged in");
       next({
         path: "/login",
         query: { redirect: to.fullPath },
@@ -162,15 +147,22 @@ router.beforeResolve(async (to, from, next) => {
       return;
     }

+    // Handle admin-only routes
     if (to.matched.some((record) => record.meta.requiresAdmin)) {
-      if (state.user === null || !getters.isAdmin()) {
+      if (!state.user || !getters.isAdmin()) {
         next({ path: "/403" });
         return;
       }
     }
   }

+  // Redirect logged-in users from login page
+  if (to.path.endsWith("/login") && getters.isLoggedIn()) {
+    next({ path: "/files/" });
+    return;
+  }
+
   next();
 });

 export { router, router as default };
@@ -1,5 +1,6 @@
 import { removePrefix } from "@/utils/url.js";
 import { state } from "./state.js";
+import { mutations } from "./mutations.js";

 export const getters = {
   isCardView: () => (state.user.viewMode == "gallery" || state.user.viewMode == "normal" ) && getters.currentView() == "listingView" ,
@@ -15,14 +16,30 @@ export const getters = {
     return state.user.darkMode === true;
   },
   isLoggedIn: () => {
-    return state.user !== null && state.user?.username != undefined && state.user?.username != "publicUser";
+    if (state.user !== null && state.user?.username != undefined && state.user?.username != "publicUser") {
+      return true;
+    }
+    const userData = localStorage.getItem("userData");
+    if (userData == undefined) {
+      return false;
+    }
+    try {
+      const userInfo = JSON.parse(userData);
+      if (userInfo.username != "publicUser") {
+        mutations.setCurrentUser(userInfo);
+        return true;
+      }
+    } catch (error) {
+      return false;
+    }
+    return false
   },
   isAdmin: () => state.user.perm?.admin == true,
   isFiles: () => state.route.name === "Files",
   isListing: () => getters.isFiles() && state.req.type === "directory",
   selectedCount: () => Array.isArray(state.selected) ? state.selected.length : 0,
   getFirstSelected: () => state.req.items[state.selected[0]],
-  isSingleFileSelected: () => getters.selectedCount() === 1 && !state.req.items[state.selected[0]]?.type == "directory",
+  isSingleFileSelected: () => getters.selectedCount() === 1 && getters.getFirstSelected()?.type != "directory",
   selectedDownloadUrl() {
     let selectedItem = state.selected[0]
     return state.req.items[selectedItem].url;
@@ -87,6 +104,9 @@ export const getters = {
     if (typeof getters.currentPromptName() === "string" && !getters.isStickySidebar()) {
       visible = false;
     }
+    if (getters.currentView() == "editor" || getters.currentView() == "preview") {
+      visible = false;
+    }
     return visible
   },
   isStickySidebar: () => {
@@ -114,7 +134,7 @@ export const getters = {
     return removePrefix(state.route.path,trimModifier)
   },
   currentView: () => {
-    const pathname = state.route.path.toLowerCase()
+    const pathname = getters.routePath()
     if (pathname.startsWith(`/settings`)) {
       return "settings"
     } else if (pathname.startsWith(`/share`)) {
@@ -9,7 +9,7 @@ export const mutations = {
   setGallerySize: (value) => {
     state.user.gallerySize = value
     emitStateChanged();
-    usersApi.update(state.user,['gallerySize']);
+    usersApi.update(state.user, ['gallerySize']);
   },
   setActiveSettingsView: (value) => {
     state.activeSettingsView = value;
@@ -102,12 +102,17 @@ export const mutations = {
     emitStateChanged();
   },
   setCurrentUser: (value) => {
+    localStorage.setItem("userData", undefined);
     // If value is null or undefined, emit state change and exit early
     if (!value) {
       state.user = value;
       emitStateChanged();
       return;
     }
+
+    if (value.username != "publicUser") {
+      localStorage.setItem("userData", JSON.stringify(value));
+    }
     // Ensure locale exists and is valid
     if (!value.locale) {
       value.locale = i18n.detectLocale(); // Default to detected locale if missing
@@ -153,6 +158,7 @@ export const mutations = {
     emitStateChanged();
   },
   updateCurrentUser: (value) => {
+    localStorage.setItem("userData", undefined);
     // Ensure the input is a valid object
     if (typeof value !== "object" || value === null) return;

@@ -180,9 +186,12 @@ export const mutations = {
     }
     // Update users if there's any change in state.user
     if (JSON.stringify(state.user) !== JSON.stringify(previousUser)) {
-      usersApi.update(state.user,Object.keys(value));
+      usersApi.update(state.user, Object.keys(value));
     }
+
+    if (state.user.username != "publicUser") {
+      localStorage.setItem("userData", JSON.stringify(state.user));
+    }
     // Emit state change event
     emitStateChanged();
   },
@@ -232,5 +241,9 @@ export const mutations = {
     state.clipboard.items = [];
     emitStateChanged();
   },
+  setSharePassword: (value) => {
+    state.sharePassword = value;
+    emitStateChanged();
+  }
 };
@@ -43,6 +43,7 @@ export const state = reactive({
     items: [],
   },
   jwt: "",
+  sharePassword: "",
   loading: [],
   reload: false,
   selected: [],
@@ -2,6 +2,7 @@ import { mutations, getters } from "@/store";
 import router from "@/router";
 import { usersApi } from "@/api";
 import { getApiPath } from "@/utils/url.js";
+import { recaptcha, loginPage } from "@/utils/constants";


 export async function setNewToken(token) {
@@ -99,4 +100,23 @@ export function logout() {
 //    .split('; ')
 //    .find(row => row.startsWith(name + '='))
 //    ?.split('=')[1];
 //}
+
+export async function initAuth() {
+  if (loginPage && !getters.isShare()) {
+    console.log("validating login");
+    await validateLogin();
+  }
+  if (recaptcha) {
+    await new Promise((resolve) => {
+      const check = () => {
+        if (typeof window.grecaptcha === "undefined") {
+          setTimeout(check, 100);
+        } else {
+          resolve();
+        }
+      };
+      check();
+    });
+  }
+}
@@ -3,28 +3,28 @@ import { filesApi } from "@/api";
 import { notify } from "@/notify"

 export default function download() {
   if (getters.isSingleFileSelected()) {
     filesApi.download(null, getters.selectedDownloadUrl());
     return;
   }
   mutations.showHover({
     name: "download",
     confirm: (format) => {
       mutations.closeHovers();
       let files = [];
       if (state.selected.length > 0) {
         for (let i of state.selected) {
           files.push(state.req.items[i].url);
-      }
-    } else {
-      files.push(state.route.path);
-    }
-    try {
-      filesApi.download(format, ...files);
-      notify.showSuccess("download started");
-    } catch (e) {
-      notify.showError("error downloading", e);
-    }
-    },
-  });
+        }
+      } else {
+        files.push(state.route.path);
+      }
+      try {
+        filesApi.download(format, ...files);
+        notify.showSuccess("download started");
+      } catch (e) {
+        notify.showError("error downloading", e);
+      }
+    },
+  });
 }
@@ -61,9 +61,7 @@ export default {
     $route: "fetchData",
     reload(value) {
       if (value) {
-        console.log("reloading fetch data");
         this.fetchData();
-        console.log("reloading fetch data done", state.req);
       }
     },
   },
@@ -94,7 +92,12 @@ export default {
       let res = await filesApi.fetchFiles(getters.routePath());
       // If not a directory, fetch content
       if (res.type != "directory") {
-        res = await filesApi.fetchFiles(getters.routePath(), true);
+        let content = false;
+        // only check content for blob or text files
+        if (res.type == "blob" || res.type == "text") {
+          content = true;
+        }
+        res = await filesApi.fetchFiles(getters.routePath(), content);
       }
       data = res;
       // Verify if the fetched path matches the current route
@@ -44,7 +44,7 @@
 <script>
 import router from "@/router";
 import { state } from "@/store";
-import { signupLogin, login } from "@/utils/auth";
+import { signupLogin, login, initAuth } from "@/utils/auth";
 import {
   name,
   logoURL,
@@ -114,6 +114,7 @@ export default {
         await signupLogin(this.username, this.password);
       }
       await login(this.username, this.password, captcha);
+      await initAuth();
       router.push({ path: redirect });
     } catch (e) {
       console.error(e);
@@ -166,26 +166,15 @@ export default {
       hash: null,
       subPath: "",
       clip: null,
+      token: "",
     };
   },
   watch: {
     $route() {
-      let urlPath = getters.routePath();
-      // Step 1: Split the path by '/'
-      let parts = urlPath.split("/");
-      // Step 2: Assign hash to the second part (index 2) and join the rest for subPath
-      this.hash = parts[1];
-      this.subPath = "/" + parts.slice(2).join("/");
       this.fetchData();
     },
   },
   created() {
-    let urlPath = getters.routePath();
-    // Step 1: Split the path by '/'
-    let parts = urlPath.split("/");
-    // Step 2: Assign hash to the second part (index 2) and join the rest for subPath
-    this.hash = parts[1];
-    this.subPath = "/" + parts.slice(2).join("/");
     this.fetchData();
   },
   mounted() {
@@ -251,6 +240,7 @@ export default {
       return publicApi.getDownloadURL({
         path: this.subPath,
         hash: this.hash,
+        token: this.token,
         inline: inline,
       });
     },
@@ -258,9 +248,20 @@ export default {
       return window.btoa(unescape(encodeURIComponent(name)));
     },
     async fetchData() {
+      let urlPath = getters.routePath("share");
+      // Step 1: Split the path by '/'
+      let parts = urlPath.split("/");
+      // Step 2: Assign hash to the second part (index 1) and join the rest for subPath
+      this.hash = parts[1];
+      this.subPath = "/" + parts.slice(2).join("/");
       // Set loading to true and reset the error.
       mutations.setLoading("share", true);
       this.error = null;
+      if (this.password == "" || this.password == null) {
+        this.password = localStorage.getItem("sharepass:" + this.hash);
+      } else {
+        localStorage.setItem("sharepass:" + this.hash, this.password);
+      }
       // Reset view information.
       if (!getters.isLoggedIn()) {
         let userData = await publicApi.getPublicUser();
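A worked example of the parsing added above, assuming the share view is mounted under `/share` and using an illustrative hash `abc123`:

```js
// routePath("share") for /share/abc123/docs/readme.md yields:
let urlPath = "/abc123/docs/readme.md";
let parts = urlPath.split("/");               // ["", "abc123", "docs", "readme.md"]
let hash = parts[1];                          // "abc123" (second element, index 1)
let subPath = "/" + parts.slice(2).join("/"); // "/docs/readme.md"
```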
@@ -273,11 +274,11 @@ export default {
       try {
         let file = await publicApi.fetchPub(this.subPath, this.hash, this.password);
         file.hash = this.hash;
+        this.token = file.token;
         mutations.updateRequest(file);
         document.title = `${file.name} - ${document.title}`;
       } catch (error) {
         this.error = error;
-        notify.showError(error);
       }
 
       mutations.setLoading("share", false);
@@ -296,7 +297,13 @@ export default {
     },
     download() {
       if (getters.isSingleFileSelected()) {
-        public_api.download(this.subPath, this.hash, null, getters.selectedDownloadUrl());
+        const share = {
+          path: this.subPath,
+          hash: this.hash,
+          token: this.token,
+          format: null,
+        };
+        publicApi.download(share, getters.selectedDownloadUrl());
         return;
       }
       mutations.showHover({
@@ -309,8 +316,13 @@ export default {
           for (let i of this.selected) {
             files.push(state.req.items[i].path);
           }
-          public_api.download(this.subPath, this.hash, format, ...files);
+          const share = {
+            path: this.subPath,
+            hash: this.hash,
+            token: this.token,
+            format: format,
+          };
+          publicApi.download(share, ...files);
         },
       });
     },
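Both branches now pass a single share descriptor instead of positional arguments. A minimal sketch of the call shape — all field values are illustrative, and `token` is the value captured from the `fetchPub` response earlier:

```js
const share = {
  path: "/docs",   // sub-path within the share
  hash: "abc123",  // share hash parsed from the URL
  token: "…",      // per-share token returned by fetchPub
  format: "zip",   // archive format, or null for a single file
};
publicApi.download(share, "/docs/a.txt", "/docs/b.txt");
```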
@@ -1,6 +1,6 @@
 <template>
   <header>
-    <action icon="close" :label="$t('buttons.close')" @action="close()" />
+    <action v-if="notShare" icon="close" :label="$t('buttons.close')" @action="close()" />
     <title v-if="isSettings" class="topTitle">Settings</title>
     <title v-else class="topTitle">{{ req.name }}</title>
   </header>
@@ -37,6 +37,9 @@ export default {
   },
 
   computed: {
+    notShare() {
+      return getters.currentView() != "share";
+    },
     isSettings() {
       return getters.isSettings();
     },
@@ -80,7 +80,7 @@ export default {
   },
   methods: {
     handleEditorValueRequest() {
-      filesApi.put(state.route.path, this.editor.getValue());
+      filesApi.put(getters.routePath("files"), this.editor.getValue());
     },
     back() {
       let uri = url.removeLastDir(state.route.path) + "/";
@@ -40,7 +40,6 @@
         type="range"
         id="gallary-size"
         name="gallary-size"
-        :value="gallerySize"
         min="0"
         max="10"
       />
@@ -0,0 +1,81 @@
+import { vi } from 'vitest';
+
+vi.mock('@/store', () => {
+  return {
+    state: {
+      activeSettingsView: "",
+      isMobile: false,
+      showSidebar: false,
+      usage: {
+        used: "0 B",
+        total: "0 B",
+        usedPercentage: 0,
+      },
+      editor: null,
+      user: {
+        gallarySize: 0,
+        stickySidebar: false,
+        locale: "en",
+        viewMode: "normal",
+        hideDotfiles: false,
+        perm: {},
+        rules: [],
+        permissions: {},
+        darkMode: false,
+        profile: {
+          username: '',
+          email: '',
+          avatarUrl: '',
+        },
+      },
+      req: {
+        sorting: {
+          by: 'name',
+          asc: true,
+        },
+        items: [],
+        numDirs: 0,
+        numFiles: 0,
+      },
+      previewRaw: "",
+      oldReq: {},
+      clipboard: {
+        key: "",
+        items: [],
+      },
+      jwt: "",
+      loading: [],
+      reload: false,
+      selected: [],
+      multiple: false,
+      upload: {
+        uploads: {},
+        queue: [],
+        progress: [],
+        sizes: [],
+      },
+      prompts: [],
+      show: null,
+      showConfirm: null,
+      route: {},
+      settings: {
+        signup: false,
+        createUserDir: false,
+        userHomeBasePath: "",
+        rules: [],
+        frontend: {
+          disableExternal: false,
+          disableUsedPercentage: false,
+          name: "",
+          files: "",
+        },
+      },
+    },
+  };
+});
+
+vi.mock('@/utils/constants', () => {
+  return {
+    baseURL: "http://example.com",
+  };
+});
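With `@/store` and `@/utils/constants` mocked globally, unit tests can import store-dependent modules without bootstrapping the app. A sketch of a test that would run under this setup — the file name and assertion targets are hypothetical:

```js
// src/utils/example.test.js — matches the "src/**/*.test.js" include
// pattern configured in vite.config.ts below.
import { describe, it, expect } from "vitest";
import { state } from "@/store"; // resolved to the mock defined in tests/mocks/setup.js

describe("mocked store", () => {
  it("exposes the defaults defined in tests/mocks/setup.js", () => {
    expect(state.req.sorting.by).toBe("name");
    expect(state.user.locale).toBe("en");
  });
});
```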
@@ -23,51 +23,44 @@ const resolve = {
 
 // https://vitejs.dev/config/
 export default defineConfig(({ command }) => {
-  if (command === "serve") {
-    return {
-      plugins,
-      resolve,
-      server: {
-        proxy: {
-          "/api/command": {
-            target: "ws://127.0.0.1:8080",
-            ws: true,
-          },
-          "/api": "http://127.0.0.1:8080",
-        },
-      },
-    };
-  } else {
-    // command === 'build'
-    return {
-      plugins,
-      resolve,
-      base: "",
-      build: {
-        rollupOptions: {
-          input: {
-            index: path.resolve(__dirname, "./public/index.html"),
-          },
-          output: {
-            manualChunks: (id) => {
-              if (id.includes("i18n/")) {
-                return "i18n";
-              }
-            },
-          },
-        },
-      },
-      experimental: {
-        renderBuiltUrl(filename, { hostType }) {
-          if (hostType === "js") {
-            return { runtime: `window.__prependStaticUrl("${filename}")` };
-          } else if (hostType === "html") {
-            return `{{ .StaticURL }}/${filename}`;
-          } else {
-            return { relative: true };
-          }
-        },
-      },
-    };
-  }
+  // command === 'build'
+  return {
+    plugins,
+    resolve,
+    base: "",
+    build: {
+      rollupOptions: {
+        input: {
+          index: path.resolve(__dirname, "./public/index.html"),
+        },
+        output: {
+          manualChunks: (id) => {
+            if (id.includes("i18n/")) {
+              return "i18n";
+            }
+          },
+        },
+      },
+    },
+    experimental: {
+      renderBuiltUrl(filename, { hostType }) {
+        if (hostType === "js") {
+          return { runtime: `window.__prependStaticUrl("${filename}")` };
+        } else if (hostType === "html") {
+          return `{{ .StaticURL }}/${filename}`;
+        } else {
+          return { relative: true };
+        }
+      },
+    },
+    test: {
+      globals: true,
+      include: ["src/**/*.test.js"], // Explicitly include test files only
+      exclude: ["src/**/*.vue"], // Exclude Vue files unless tested directly
+      environment: "jsdom", // jsdom environment
+      setupFiles: "tests/mocks/setup.js", // Setup file for tests
+    },
+  };
 });
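In the committed config the dev-server branch (with its `/api` proxy to `127.0.0.1:8080`) is gone, so `defineConfig` now returns the build-time shape unconditionally even though it still receives `command`. The new `test` block is what vitest reads: `npm run test` in `frontend/` runs every file matching `src/**/*.test.js` in a jsdom environment, loading `tests/mocks/setup.js` first. The file added below appears to be a generated bundle of the previous config (note the absolute `file:///Users/...` import paths), and it still contains the old serve/build branching.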
@@ -0,0 +1,74 @@
+// vite.config.ts
+import path from "node:path";
+import { defineConfig } from "file:///Users/steffag/git/personal/filebrowser/frontend/node_modules/vite/dist/node/index.js";
+import vue from "file:///Users/steffag/git/personal/filebrowser/frontend/node_modules/@vitejs/plugin-vue/dist/index.mjs";
+import VueI18nPlugin from "file:///Users/steffag/git/personal/filebrowser/frontend/node_modules/@intlify/unplugin-vue-i18n/lib/vite.mjs";
+import { compression } from "file:///Users/steffag/git/personal/filebrowser/frontend/node_modules/vite-plugin-compression2/dist/index.mjs";
+var __vite_injected_original_dirname = "/Users/steffag/git/personal/filebrowser/frontend";
+var plugins = [
+  vue(),
+  VueI18nPlugin({
+    include: [path.resolve(__vite_injected_original_dirname, "./src/i18n/**/*.json")]
+  }),
+  compression({
+    include: /\.(js|woff2|woff)(\?.*)?$/i,
+    deleteOriginalAssets: true
+  })
+];
+var resolve = {
+  alias: {
+    "@": path.resolve(__vite_injected_original_dirname, "src")
+  }
+};
+var vite_config_default = defineConfig(({ command }) => {
+  if (command === "serve") {
+    return {
+      plugins,
+      resolve,
+      server: {
+        proxy: {
+          "/api/command": {
+            target: "ws://127.0.0.1:8080",
+            ws: true
+          },
+          "/api": "http://127.0.0.1:8080"
+        }
+      }
+    };
+  } else {
+    return {
+      plugins,
+      resolve,
+      base: "",
+      build: {
+        rollupOptions: {
+          input: {
+            index: path.resolve(__vite_injected_original_dirname, "./public/index.html")
+          },
+          output: {
+            manualChunks: (id) => {
+              if (id.includes("i18n/")) {
+                return "i18n";
+              }
+            }
+          }
+        }
+      },
+      experimental: {
+        renderBuiltUrl(filename, { hostType }) {
+          if (hostType === "js") {
+            return { runtime: `window.__prependStaticUrl("${filename}")` };
+          } else if (hostType === "html") {
+            return `{{ .StaticURL }}/${filename}`;
+          } else {
+            return { relative: true };
+          }
+        }
+      }
+    };
+  }
+});
+export {
+  vite_config_default as default
+};
+//# sourceMappingURL=data:application/json;base64,ewogICJ2ZXJzaW9uIjogMywK… (long inline sourcemap truncated)
makefile
@@ -35,8 +35,13 @@ lint: lint-backend lint-frontend
 
 test: test-backend test-frontend
 
+check-all: lint test
+
 test-backend:
 	cd backend && go test -race -timeout=10s ./...
 
 test-frontend:
+	cd frontend && npm run test
+
+test-frontend-playwright:
 	docker build -t gtstef/filebrowser-tests -f Dockerfile.playwright .
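With this split, `make test-frontend` runs the fast vitest suite, the Docker-based Playwright run moves to `make test-frontend-playwright`, and `make check-all` chains `lint` and `test` in one invocation.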