Closed
Opened Mar 18, 2019 by Matija Obreza (@mobreza), Owner

Massive subsets/datasets

Creating subsets (or datasets) with hundreds of thousands of accession refs is a performance issue for the backend and a UI issue for the frontend: the frontend times out before the backend responds with results, prompting the user to re-upload the 100,000+ refs and consuming even more backend resources.

Update the frontend to send accessionRef data to the backend in batches of 10,000 (or 20,000): it should wait until the server responds before sending the next batch, as sketched below.
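A minimal sketch of the sequential batching loop. The endpoint path, payload shape, and `AccessionRef` type are assumptions for illustration, not the actual Genesys API:

```typescript
// Hypothetical sketch: endpoint URL and payload shape are assumptions.
const BATCH_SIZE = 10000;

interface AccessionRef {
  instCode: string;
  acceNumb: string;
  genus: string;
}

async function uploadAccessionRefs(subsetId: number, refs: AccessionRef[]): Promise<void> {
  for (let start = 0; start < refs.length; start += BATCH_SIZE) {
    const batch = refs.slice(start, start + BATCH_SIZE);
    // Wait for the server to acknowledge this batch before sending the next,
    // so the backend never processes more than one chunk per client at a time.
    const response = await fetch(`/api/v1/subsets/${subsetId}/accessions`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(batch),
    });
    if (!response.ok) {
      throw new Error(`Batch starting at ${start} failed: ${response.status}`);
    }
  }
}
```

Because each `await` blocks the loop until the server responds, this also gives a natural backpressure point: a failed batch stops the upload instead of re-sending the whole set.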

Test data can be pulled from the database with:

```sql
select instCode, acceNumb, genus from accession limit 200000;
```

Also test the performance of "Create new version".

Edited Mar 25, 2019 by Matija Obreza
Milestone: 1.0
Reference: genesys-pgr/genesys-ui#256