Genesys Backend · Issues · #429

Closed

Opened Mar 18, 2019 by Matija Obreza (@mobreza), Owner

Massive subsets/datasets

Creating subsets (or datasets) with hundreds of thousands of accession refs is a performance problem for the backend: the UI request times out before the server finishes processing. The user also has no way of knowing that the server is still processing the request and may re-upload the 100,000 refs, consuming even more server resources.

Update the processing of AccessionRef uploads so that the client must supply the correct current version of the dataset (or subset) before refs are added to it.
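A minimal sketch of the version check described above, in the style of optimistic locking. The class and method names (`Subset`, `SubsetService.addAccessionRefs`) are hypothetical illustrations, not the actual Genesys API; in a JPA-based backend the same effect is usually achieved with a `@Version` field.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ConcurrentModificationException;

// Hypothetical subset entity with a monotonically increasing version.
class Subset {
    long version = 0;
    final List<String> accessionRefs = new ArrayList<>();
}

class SubsetService {
    /**
     * Adds refs only if the client-supplied version matches the
     * current server-side version; bumps the version on success.
     */
    static Subset addAccessionRefs(Subset subset, long expectedVersion, List<String> refs) {
        if (subset.version != expectedVersion) {
            // A stale version means another upload (or a retry of this one)
            // already modified the subset; reject instead of re-processing.
            throw new ConcurrentModificationException(
                "Expected version " + expectedVersion + " but found " + subset.version);
        }
        subset.accessionRefs.addAll(refs);
        subset.version++;
        return subset;
    }
}
```

With this check, a user who re-submits the same 100,000 refs while the first upload is still being processed gets an immediate version-mismatch error rather than silently doubling the server load.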

The UI ticket is genesys-ui#256

Assignee: none
Milestone: 2.4 (Past due)
Time tracking: None
Due date: None
Reference: genesys-pgr/genesys-server#429