IBM Power Ideas Portal


Use this portal to open public enhancement requests against IBM Power Systems products, including IBM i. To view all of your ideas submitted to IBM, create and manage groups of ideas, or create an idea explicitly set to be either visible to all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).



Status: Not under consideration
Workspace: IBM i
Created by: Guest
Created on: Jun 9, 2017

JOBQ that spans multiple systems

Consider the possibility of creating a JOBQ that spans multiple systems.


Use Case:

Think of a client who has only one tape drive (FC attached) but multiple IBM i LPARs. In this case only one LPAR at a time can save to the tape drive. Most saves are done by submitting a job to a JOBQ (which in this case would allow only one job to run at a time).
In my example I would then have a JOBQ that spans all client LPARs and again allows only one job to run at a time. Submitting the backup/restore jobs from each LPAR into this "global" JOBQ would let the tape drive be used more efficiently than is currently possible with job schedule entries.
Or think of batch processing jobs that run on different systems and depend on other jobs that need to finish first.
I guess there are a lot of use cases ...
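
For context, the single-system mechanism this idea would extend already exists: a job queue attached to a subsystem with MAXACT(1) serializes work so only one save runs at a time. A minimal CL sketch, with illustrative object names (TAPSAVQ, PRODLIB, SAVPROD, TAP01) and the queue assumed to be attached to QBATCH:

    CRTJOBQ JOBQ(QGPL/TAPSAVQ) TEXT('Serialize saves to shared tape drive')
    ADDJOBQE SBSD(QBATCH) JOBQ(QGPL/TAPSAVQ) MAXACT(1)  /* one active job at a time */
    SBMJOB CMD(SAVLIB LIB(PRODLIB) DEV(TAP01)) JOB(SAVPROD) JOBQ(QGPL/TAPSAVQ)

The idea as submitted would make a queue like TAPSAVQ visible to every LPAR, so a SBMJOB on any partition would simply wait behind the single active slot.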


Idea priority: Medium
  • Guest | Oct 10, 2017

    There are many challenges and obstacles involved in delivering this request, but perhaps the most obvious one is that when a job is placed on a JOBQ, the system does not merely record the SBMJOB command; it actually allocates a job structure, which defines it as a "job" even though it is not active or running yet. And jobs are not objects that can be moved from one system to another. It would take a tremendous amount of work to re-architect the concept of a job and its internal structures.

    The other thing that would make this very difficult is that the two (or more) systems would need to be nearly identical in terms of all the supporting objects that can be defined on the submit, such as the job description, user profile, output queue, print device, routing entries, library list, and ASP group, not to mention any application-level objects, such as programs and files, needed for the submitted command or program to run successfully.
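
    For illustration, a hedged sketch of what gets resolved at submit time: every object named, explicitly or through defaults, on the SBMJOB would have to resolve identically on whichever system eventually ran the job. All names here (NIGHTSAVE, SAVEJOBD, SAVEOUTQ, TAPSAVQ) are hypothetical:

        SBMJOB CMD(CALL PGM(PRODLIB/NIGHTSAVE)) JOB(NIGHTSAVE) +
               JOBD(PRODLIB/SAVEJOBD) USER(*JOBD) JOBQ(QGPL/TAPSAVQ) +
               OUTQ(PRODLIB/SAVEOUTQ) PRTDEV(*JOBD) +
               INLLIBL(*JOBD) INLASPGRP(*CURRENT) RTGDTA(QCMDB)

    The job description, user profile, output queue, routing data, library list, and ASP group named above, plus the program itself, would all need matching definitions on every participating partition.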

  • Guest | Jul 24, 2017

    The thought of trying to create objects that span partitions blows the top of my head off, and likely that of any IBM developer as well. It would be Herculean to change the architecture to support this. Managing tape drives is easy enough as long as they are owned by a host partition. If you have multiple partitions then you certainly have an HMC, and it is easy to create a host partition. Simply vary the tape drive off on one partition and vary it on on another partition. The vary commands can be embedded in CL and possibly executed remotely if needed.

    Tom Duncan
    CAAC member
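
    A minimal CL sketch of the workaround described in this comment, assuming the drive is named TAP01 on both partitions and that the receiving LPAR (here called LPAR2, with a hypothetical SAVEOPR user) runs the REXEC server so RUNRMTCMD can reach it:

        /* On the LPAR that currently owns the drive */
        VRYCFG CFGOBJ(TAP01) CFGTYPE(*DEV) STATUS(*OFF)

        /* Ask the other LPAR to bring the drive online */
        RUNRMTCMD CMD('VRYCFG CFGOBJ(TAP01) CFGTYPE(*DEV) STATUS(*ON)') +
                  RMTLOCNAME(LPAR2 *IP) RMTUSER(SAVEOPR)

    Names and transport are illustrative; SBMRMTCMD over a DDM file is another existing way to drive the remote VRYCFG.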