Wednesday, 26 November 2025

How AI Turned Your Best-Kept Secret Into Your Competitive Advantage



The Hidden Gem Nobody Knew They Had

Let's talk about JD Edwards Orchestrator for a minute.

It's genuinely brilliant technology. Released in 2014, it gave you the ability to compose complex business processes using JDE's building blocks - Business Functions, Form Services, Data Services - all through visual configuration. No custom code. No modifications. Just pure, governed business logic.

You could automate virtually anything: price updates, order processing, journal entries, inventory movements, complex multi-step workflows. And it all ran natively in JDE, respecting security, maintaining audit trails, following your business rules.

The catch? Actually getting people to use the orchestrations.

Sure, tools existed to trigger orchestrations from Excel. They worked. Sort of. For $50K+ and a consultant who had to configure each integration point manually. And once that consultant left, good luck making changes. The result? Most organizations built a handful of orchestrations and then... stopped.

Not because orchestrations weren't powerful. But because the interface between "I need this done" and "the orchestration runs" was too complicated for most users to bridge.

That just changed.


The Modern Take: OAuth 2.0 Security Meets Conversational AI

Here's what just became possible, and you can stand this up in hours:

You: "Process these 500 invoices from the email I just forwarded you."

Copilot: "Authenticated as shannon.moir@fusion5.com.au. Processing AP batch orchestration... Validating vendor codes... Checking GL accounts... Complete. 487 invoices posted under your authority, 13 flagged for review - 8 missing PO numbers, 5 over tolerance."

That sentence just:

  • Authenticated you via OAuth 2.0 (you're already signed into Teams/M365)
  • Extracted the spreadsheet from your email
  • Called your AP processing orchestration
  • Executed it under YOUR user credentials (respecting your JDE security profile)
  • Processed hundreds of transactions
  • Returned results in plain English
  • Created a complete audit trail showing YOU initiated this

Zero clicks. Zero forms. Zero credentials shared. Zero security compromises.

Your orchestration didn't change. The business logic didn't change. JDE didn't change.

You just started having a conversation with your business processes.


Why This Time It's Different

The Old Tools: Excel Plugins and Expensive Middleware

If you've been around JDE long enough, you've seen the attempts:

The Excel Add-in Approach (Circa 2016-2020):

  • Install a plugin on every user's machine
  • Map columns to orchestration parameters manually
  • Hope nobody changes the spreadsheet format
  • Pay annual licensing per user
  • Call consultants when it breaks
  • Cost: $50K-$200K for setup, $20K+/year maintenance

It worked for exactly one use case, set up by exactly one consultant, used by exactly one person who was terrified to change anything.

The Custom Integration Approach:

  • Build REST API wrapper around orchestrations
  • Write documentation nobody reads
  • Create training materials nobody watches
  • Maintain custom code forever
  • Cost: $100K+ and 6 months of developer time

Both approaches shared the same fundamental flaw: They required users to understand the technology, not just describe what they needed.

The Modern Approach: Conversational + Secure + Fast

This is different because:

1. OAuth 2.0 Authentication (Real Security)

  • Users authenticate once (they're already signed into Microsoft 365)
  • Every action runs under their actual JDE credentials
  • JDE security profiles apply automatically
  • Complete audit trails show who did what
  • No shared passwords, no service accounts, no security theater

2. Natural Language Interface (Real Usability)

  • Users describe what they need in their own words
  • AI maps that to the right orchestration
  • Parameters get filled from context (email attachments, spreadsheets, previous answers)
  • Results come back in language they understand

3. Hours to Deploy (Real Speed)

  • MCP server deployment: 2-3 hours
  • Orchestration enablement: Automatic (if they exist, they're available)
  • User training: "Just ask for what you need"
  • Total time to first conversation: Same afternoon

4. Zero Modification to JDE (Real Governance)

  • Your orchestrations run exactly as designed
  • Business logic stays in JDE where it belongs
  • No custom code, no modifications, no upgrade blockers
  • IT stays in control of what gets built

This isn't replacing the old tools. This is a completely different paradigm.


Real Scenarios: From "It Takes All Day" to "Ask and It's Done"

Scenario 1: The Monday Morning Vendor Price Update

The Old Reality:

Sarah from Purchasing receives a spreadsheet: 2,847 price updates from a major supplier.

With the Excel plugin tool, she still has to:

  1. Open the special Excel file with the plugin
  2. Make sure her columns match exactly
  3. Click "Validate" and wait
  4. Fix the 47 rows that error out
  5. Click "Submit" and pray
  6. Watch the progress bar for 45 minutes

Usually finishes by lunch. If nothing breaks.

The New Reality:

Sarah types in Teams: "Process the price update spreadsheet from Acme Corp."

Copilot: "Found spreadsheet with 2,847 items. Running vendor_price_update_v2 orchestration under your credentials... Complete in 4 minutes. 2,830 prices updated. 17 exceptions flagged - 12 items not found, 5 prices exceed your authorization threshold (forwarded to Purchasing Manager). Changes logged under your user ID."

Same orchestration. Same business logic. Same security. Different interface.

She's done in 5 minutes instead of 4 hours. And she didn't have to remember how to use the special Excel file.
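
Under the hood, nothing exotic is happening: the AI is calling the same orchestration over the standard AIS REST interface, with Sarah's identity attached. As a rough sketch of what that call looks like in Python (the AIS host, endpoint path, input names and bearer-token handling are illustrative assumptions rather than a definitive implementation; in practice the MCP layer does the token plumbing for you):

    import requests

    AIS_BASE = "https://ais.example.com:7080/jderest"   # placeholder AIS server

    def run_orchestration(name, inputs, bearer_token):
        """Invoke a JDE orchestration via the AIS orchestrator endpoint.
        The exact path varies by tools release (some use /v3/orchestrator)."""
        resp = requests.post(
            f"{AIS_BASE}/orchestrator/{name}",
            json=inputs,
            headers={"Authorization": f"Bearer {bearer_token}"},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()

    # Hypothetical inputs for the price-update scenario above
    result = run_orchestration(
        "vendor_price_update_v2",
        {"supplier": "ACME001", "priceFile": "acme_prices.csv"},
        bearer_token="<token acquired via OAuth>",
    )
    print(result)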

Scenario 2: The AP Processing That Never Ends

The Old Reality:

Month-end means processing 40-50 journal entry spreadsheets from different departments.

With the old custom REST API integration, IT built a web form. Users still have to:

  1. Log into the custom portal (separate credentials)
  2. Upload their spreadsheet
  3. Wait for email confirmation
  4. Check for errors in a different screen
  5. Re-submit corrections
  6. Repeat until it works

The accounting team blocks out two days for this. Every month.

The New Reality:

"Process all month-end journals from the shared folder using my approval authority."

Copilot: "Found 47 files. Authenticating as robert.chen@company.com... Running journal_validation_posting orchestration on each file... Complete. 45 journals posted (batch ID: JE-2025-11-001), all under your user profile. 2 failed validation - Department 450 account 1234 inactive, Department 320 entries don't balance ($1,250 discrepancy). Details sent to your email. Audit trail complete."

Two days became 15 minutes. The orchestration has sophisticated validation logic that IT built once. Now anyone with proper authority can trigger it by asking.

Scenario 3: Customer Orders That Used to Ruin Fridays

The Old Reality:

Jake's biggest customer emails a weekly order spreadsheet. 300-500 items.

The old Excel plugin required him to:

  1. Download their spreadsheet
  2. Open it in the special template
  3. Map their columns to his columns (because they change their format)
  4. Run validation (15 minutes)
  5. Fix errors
  6. Submit batch
  7. Hope it works

Usually took 3-4 hours every Friday afternoon.

The New Reality:

Friday, 3:00 PM: "Create sales orders from the BigCustomer weekly file. Ship-to their distribution center, requested date next Monday, use our standard pricing agreement."

Copilot: "Processing 467 line items under your sales authority... Running customer_order_import_v3 orchestration... Complete. Orders 450123-450589 created (total $3.2M), all assigned to your user ID for commission tracking. 23 items flagged as below safety stock - PO suggestions generated and sent to Purchasing. Customer confirmation email sent. You're done for the week."

Jake makes his 3:30 PM tee time. The orchestration handles customer-specific item mappings, pricing rules, inventory checks, and order creation. IT built it once. The AI makes it conversational.


The Chaining Effect: Conversations That Compose Business Processes

Here's where it gets transformative.

Once your orchestrations are conversational, you can compose them into workflows without writing new code.

Example: The Supply Chain Cascade

"Check inventory for next week's production schedule, flag anything under safety stock, generate PO suggestions for approved vendors, and send the summary to procurement."

That sentence just:

  1. Called your production_schedule_analysis orchestration
  2. Piped results to inventory_status_check orchestration
  3. Fed those results to smart_po_generation orchestration
  4. Triggered email_procurement_summary notification
  5. All authenticated under your credentials
  6. All audited in JDE

Four orchestrations. Built separately. By different people. For different purposes.

The AI composed them into a workflow because you described the outcome you wanted.

This is English as a programming language. The orchestrations are your functions. The conversation is your code.
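
To make "the orchestrations are your functions" concrete, here is a minimal sketch of the supply-chain cascade written as straight function composition, reusing the run_orchestration wrapper sketched earlier. The field names (requiredItems, safetyStock and so on) are illustrative assumptions about what those orchestrations return:

    def supply_chain_cascade(token):
        """Chain four independently built orchestrations into one workflow.
        Every call runs under the requesting user's credentials."""
        schedule = run_orchestration("production_schedule_analysis",
                                     {"horizonDays": 7}, token)
        stock = run_orchestration("inventory_status_check",
                                  {"items": schedule["requiredItems"]}, token)
        shortages = [s for s in stock["items"] if s["available"] < s["safetyStock"]]
        if shortages:
            pos = run_orchestration("smart_po_generation", {"items": shortages}, token)
            run_orchestration("email_procurement_summary",
                              {"poSuggestions": pos["suggestions"]}, token)
        return shortages

Nobody writes this file by hand - the AI assembles the equivalent chain from your sentence - but it shows why clear inputs and outputs on each orchestration matter so much.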

Another Example: The New Hire Cascade

"New employee starts Monday - Emma Wilson, Analyst role, Department 450, reports to badge 10234, standard benefits package, laptop and access to Finance systems."

That cascades through:

  • HR employee_onboarding orchestration (creates master record)
  • IT provisioning_automation orchestration (triggers Azure AD, assigns licenses)
  • Department assignment_workflow orchestration (manager notification, cost center allocation)
  • Benefits enrollment_automation orchestration (adds to next enrollment window)
  • Asset management orchestration (creates laptop request)
  • Email notification orchestration (welcome email to Emma, confirmation to manager)

Six orchestrations. Five different systems. One sentence. Under your authority.

The orchestrations don't know about each other. But the AI knows what each one does and can compose them into an onboarding process.

You just onboarded a digital employee to onboard a real employee.


The Vision: Prescriptive Assistants and Digital Employees

This isn't about replacing your existing tools. This is about creating a new class of worker: prescriptive AI assistants that act as your digital workforce.

Meet Your New Digital Employees

Dana: Your AP Processing Assistant

  • Monitors incoming invoices
  • Knows your approval thresholds
  • Validates against POs automatically
  • Routes exceptions to the right people
  • Runs your AP posting orchestration when everything checks out
  • Available 24/7, never takes vacation, doesn't forget the month-end deadline

Marcus: Your Inventory Management Assistant

  • Watches inventory levels constantly
  • Knows your safety stock rules by item and location
  • Predicts stockouts before they happen
  • Generates PO requisitions using your approved vendor list
  • Routes to the right approver based on dollar threshold
  • Triggers your inventory_replenishment orchestration automatically

Sofia: Your Order Management Assistant

  • Monitors incoming customer orders from all channels
  • Validates against credit limits and inventory
  • Flags orders that need special handling
  • Executes your order_processing orchestration for routine orders
  • Escalates complex orders to humans with full context
  • Learns your customers' patterns and preferences

These aren't chatbots. These aren't RPA bots clicking through screens.

These are digital employees with judgment, context, and the authority to execute your business processes through orchestrations.

The Onboarding Process: Hours, Not Months

Here's what's revolutionary: You can stand up a digital employee in an afternoon.

Morning:

  1. Identify a repetitive process (AP invoice processing, price updates, order entry)
  2. Build or dust off an orchestration (you probably already have one)
  3. Deploy the MCP server (2-3 hours if you're following the guide)
  4. Configure OAuth authentication (already done if you're using M365)

Afternoon:

  1. Test: "Process these test invoices"
  2. Refine: Adjust the orchestration if needed
  3. Document: "Dana handles AP processing for invoices under $10K"
  4. Enable: Users can now ask Dana to process invoices

Next Day:

  • Dana processes 200 invoices before anyone arrives
  • Flags 15 exceptions for human review
  • Sends summary report at 8 AM
  • Your team spends the day on exceptions, not data entry

Total setup time: 4-6 hours.

Compare that to:

  • Custom Excel plugin: 3 months and $50K
  • REST API integration: 6 months and $100K+
  • RPA bot development: 2-3 months and ongoing maintenance nightmares

The Multiplication Effect

Once you have one digital employee working, adding the next one is even faster.

Your orchestrations become a library of capabilities. New assistants can mix and match them.

Month 1: Dana (AP Processing)
Month 2: Marcus (Inventory) reuses some of Dana's notification orchestrations
Month 3: Sofia (Order Management) reuses Dana's validation patterns and Marcus's inventory checks
Month 4: Your team proposes three more assistants because they see what's possible

Within six months, you have a digital workforce handling routine operations while your human team focuses on exceptions, strategy, and growth.

This is the vision: A hybrid workforce where digital employees handle the predictable, and humans handle the exceptional.


The OAuth 2.0 Difference: Security That Actually Works

Let's talk about why this is fundamentally more secure than the old approaches.

Old Approach: Security Theater

Excel plugins: Shared service account, hard-coded credentials, everyone uses the same access level.
Custom APIs: Service account with elevated privileges, hope nobody abuses it.
Web portals: Separate authentication system, users forget passwords, IT resets them constantly.

The result? Either too restrictive (nobody can do their job) or too permissive (everyone has admin rights).

New Approach: Real Security

OAuth 2.0 + Azure Entra ID + JDE Security:

  1. User authenticates once (they're already signed into M365)
  2. Azure validates their identity (your existing MFA, conditional access, all applies)
  3. MCP server receives their token (time-limited, cryptographically signed)
  4. Maps to their JDE user (shannon.moir@fusion5.com.au → SMOIR in JDE)
  5. Orchestration runs under THEIR credentials (with their security profile, their approvals, their limits)
  6. JDE logs it under their user ID (complete audit trail)
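
Steps 3 and 4 are the heart of it: the MCP server validates the Entra ID token and resolves the user principal name to a JDE user before it makes any AIS call. A minimal sketch of that check, assuming the PyJWT library and a simple lookup table (your tenant ID, audience and mapping source will differ):

    import jwt  # PyJWT

    TENANT_ID = "<your-entra-tenant-id>"                 # placeholder
    AUDIENCE = "api://<your-mcp-app-id>"                 # placeholder app ID URI
    JWKS_URL = f"https://login.microsoftonline.com/{TENANT_ID}/discovery/v2.0/keys"

    USER_MAP = {"shannon.moir@fusion5.com.au": "SMOIR"}  # UPN -> JDE user

    def resolve_jde_user(bearer_token):
        """Validate the caller's Entra ID token and map them to a JDE user.
        Every downstream AIS call then runs as that JDE user, so JDE security
        profiles, approval limits and audit trails apply unchanged."""
        signing_key = jwt.PyJWKClient(JWKS_URL).get_signing_key_from_jwt(bearer_token)
        claims = jwt.decode(bearer_token, signing_key.key,
                            algorithms=["RS256"], audience=AUDIENCE)
        upn = claims.get("preferred_username") or claims["upn"]
        return USER_MAP[upn]    # no mapping means no access - fail closed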

If you can't do it in JDE, you can't do it through the AI.

Your AP clerk can process invoices under $10K (their authority limit). Your controller can process anything (their authority is higher). Your warehouse worker can check inventory but not change prices (read-only on pricing tables).

The AI doesn't get special privileges. It impersonates the user making the request.

This means:

  • Proper separation of duties
  • Real audit trails
  • No credential sharing
  • No elevation of privilege attacks
  • SOX compliance maintained
  • Your security team can actually sleep at night

What to Actually Watch Out For

Since security is handled properly, here's what you should actually think about:

1. Change Management: Your Team Might Resist the Help

Your most experienced users might be skeptical: "I've been doing this for 15 years. Why do I need AI?"

The answer: You don't need it to do your job. You need it so you can do more than your job.

The AP clerk who's an expert at processing invoices? Now she has time to analyze vendor spend patterns and negotiate better terms.

The inventory manager who knows the system inside-out? Now he can focus on supplier relationships instead of data entry.

Digital employees handle the routine. Humans get promoted to strategic.

2. Over-Automation: Not Everything Should Be Automated

Just because you can automate something doesn't mean you should.

Good candidates for automation:

  • High volume, low complexity (invoice processing, order entry)
  • Rule-based decisions (reorder points, price updates)
  • Data validation and transformation
  • Scheduled, predictable workflows

Bad candidates for automation:

  • Strategic decisions with incomplete information
  • Edge cases requiring human judgment
  • Processes that change frequently (automate after they stabilize)
  • Anything involving complex ethical considerations

The goal isn't zero humans. It's humans working on human problems.

3. Orchestration Quality: Garbage In, Amplified Out

The AI will make your orchestrations 10x more used.

If your orchestration has a bug, you're about to discover it. Fast.

The good news: High usage means fast feedback. You'll improve your orchestrations quickly because you'll see how they're actually being used.

The bad news: You need to be ready to iterate. Don't build the perfect orchestration over 6 months. Build a good one in 2 weeks, deploy it conversationally, learn from usage, improve.

4. Documentation: It Actually Matters Now

Nobody read your orchestration documentation before because nobody used the orchestrations.

Now people will use them constantly. But they won't read documentation - they'll just ask the AI.

Make sure your orchestrations have:

  • Clear names that describe what they do
  • Good descriptions that explain their purpose
  • Defined input parameters with sensible names
  • Expected output documented

The AI uses this to match user requests to the right orchestration. "Process vendor payments" should map to "vendor_payment_batch_v2" not "BSFN_CUSTOM_JOB_17."
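
In MCP terms, that name and description become the tool metadata the model reads when deciding what to call. A sketch of how one orchestration might be surfaced as a tool, using the MCP Python SDK's FastMCP helper (the orchestration, its parameters, and the current_user_token helper are the hypothetical examples from earlier, not a real catalogue):

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("jde-orchestrations")

    @mcp.tool()
    def vendor_price_update(supplier: str, price_file: str) -> dict:
        """Update vendor prices in JD Edwards from a supplier price file.
        Flags missing items and prices above the caller's authorisation limit."""
        # The name and docstring above are exactly what the AI matches
        # "process the price update from Acme" against.
        return run_orchestration("vendor_price_update_v2",
                                 {"supplier": supplier, "priceFile": price_file},
                                 current_user_token())   # hypothetical helper

A clear tool name plus an honest one-line description gets you most of the way; BSFN_CUSTOM_JOB_17 with no description gets you nowhere.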


Getting Started: Your First Digital Employee in One Day

Morning (9 AM - 12 PM): Deploy the Infrastructure

Hour 1-2: Deploy MCP Server

  • Follow the deployment guide (it's actually straightforward)
  • Provision Azure Container App
  • Configure connection to your JDE AIS server
  • Set up API Management gateway

Hour 3: Configure OAuth

  • Register application in Azure Entra ID
  • Set up API permissions
  • Configure token validation
  • Test authentication flow

Total: 3 hours (with the guide, following the steps)
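
A quick way to prove the OAuth flow end to end is to acquire a token as a real user and hit the new endpoint with it. A sketch using the msal library (the client ID, tenant, scope and URL are placeholders for your own app registration and Container App):

    import msal
    import requests

    app = msal.PublicClientApplication(
        client_id="<app-registration-client-id>",                   # placeholder
        authority="https://login.microsoftonline.com/<tenant-id>",  # placeholder
    )

    # Opens a browser sign-in; your existing MFA and conditional access apply.
    token = app.acquire_token_interactive(scopes=["api://<mcp-app-id>/.default"])

    resp = requests.get(
        "https://<your-container-app>.azurecontainerapps.io/health",  # placeholder endpoint
        headers={"Authorization": f"Bearer {token['access_token']}"},
    )
    print(resp.status_code, resp.text)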

Afternoon (1 PM - 5 PM): Enable Your First Use Case

Hour 4: Inventory Your Orchestrations

  • List what orchestrations you already have
  • Pick one painful process to start with
  • Make sure the orchestration has good metadata

Hour 5: Test Conversationally

  • Connect to Copilot/Teams/Power Platform
  • Try: "Process the test vendor price update file"
  • Verify it calls the right orchestration
  • Check that security and audit trails work

Hour 6-7: Refine and Document

  • Adjust orchestration if needed based on testing
  • Create simple guidance: "Ask Dana to process AP invoices"
  • Test with real users

Hour 8: Deploy to Production

  • Enable for pilot user group
  • Monitor usage and feedback
  • Iterate based on what you learn

Next Day:

  • Your first digital employee is processing real work
  • Users are asking for it conversationally
  • You're collecting data on usage patterns
  • You're already planning the next digital employee

Total time: One day.

Week 2: Add Your Second Digital Employee

It's faster the second time because:

  • Infrastructure is already deployed
  • You understand the patterns
  • Users trust the approach
  • You have orchestrations ready to enable

Time: 2-3 hours to enable another use case.

Month 2: You Have a Digital Workforce

  • 5-10 digital employees handling routine operations
  • Your human team focusing on exceptions and strategy
  • Usage data showing ROI in real-time
  • Business units asking for their own digital assistants

This is the multiplication effect of conversational orchestrations.


The Conclusion: Your Hidden Assets Just Became Your Competitive Advantage

For years, your orchestrations have been your best-kept secret. Powerful capabilities that only a few people knew how to trigger.

That just changed.

Those orchestrations are now conversational. Anyone with the right authority can use them by describing what they need. Your business logic became accessible.

Your digital transformation didn't require replacing JDE. It required giving it a voice.

The competitive advantage isn't the AI. It's the business logic you've already built in JDE, now available to everyone who needs it.

Your competitors are still:

  • Manually entering data
  • Paying consultants $200/hour
  • Waiting months for custom integrations
  • Training users on complex systems

You're having conversations with your business processes.

Welcome to the age of the digital workforce.


Start Tomorrow Morning

9:00 AM: Identify one painful, repetitive process
10:00 AM: Check if you have an orchestration for it (you probably do)
11:00 AM: Deploy the MCP server (following the guide)
2:00 PM: Test conversationally
3:00 PM: Deploy to pilot users
Next Day: Watch your first digital employee process work

Total investment: One day. Return: Immediate and ongoing.

The barrier wasn't the technology. It was the interface.

That barrier just disappeared.


Written by someone who watched brilliant orchestrations sit unused for years because they were "too hard to trigger." Not anymore.

November 2025

Tuesday, 25 November 2025

We Built a World-First: Connecting JD Edwards to AI Agents via MCP

 

And yes, you can now ask an AI "What's our inventory level for part X?" and get a live answer from JDE.


If you've been in the JD Edwards world for any length of time, you've probably had this conversation: "Can we just connect [insert shiny new technology here] to JDE?" And the answer is usually some variation of "Yes, but it'll take 6-12 months and cost more than your first house."

Well, I'm genuinely excited to share something we've been working on at Fusion5 that changes that equation entirely.

The Problem We All Know Too Well

JD Edwards is a phenomenal system of record. It's robust, it's proven, and it holds the truth about your business. But let's be honest — it wasn't built for the age of conversational AI. If you want to:

  • Let a business user ask a quick question about outstanding invoices
  • Have an AI assistant pull live order data for a customer service rep
  • Automate a workflow that needs real-time JDE data

...you've traditionally been looking at custom development, middleware, orchestrations, and a lot of billable hours.

Meanwhile, AI assistants like Microsoft Copilot, Claude, and others are revolutionising how people interact with systems. But they can't just "talk to" JDE. They don't understand PS_TOKENs, Julian dates, or why your customer table is called F0101.

Enter the Model Context Protocol (MCP)

For those who haven't come across it yet, MCP is an open standard developed by Anthropic that essentially creates a universal adapter between AI agents and external systems. Think of it as USB-C for AI — a standardised way for any AI to connect to any data source.

The catch? Nobody had built one for an ERP system. Until now.

What We Built

We've developed what we believe is the world's first MCP server for an ERP platform — specifically for JD Edwards EnterpriseOne.

Without getting into the weeds (this is a blog, not a technical manual), here's what it does:

It translates AI requests into JDE-speak. When an AI agent asks for "customers in Australia with outstanding invoices over $10,000," our MCP server figures out that means querying F0101 for country code AU, joining to F03B11, filtering on open amounts, and handling all the Julian date conversions along the way.

It handles authentication properly. Every action happens under a real user's identity. No shared service accounts, no security shortcuts. Your JDE role-based security still applies — if a user can't see payroll data in JDE, the AI can't retrieve it for them either.

It speaks JDE fluently. We've embedded comprehensive metadata about JDE's tables, fields, and business functions. The system knows that ALPH means "Alphabetic Name" and that F4211 is your Sales Order Detail. This means the AI can understand business terms and translate them correctly.

It covers the full AIS API. Data queries, form services, file attachments, business functions, reports, orchestrations — if JDE's AIS can do it, our MCP server can expose it to AI agents.
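
We're not publishing the implementation here, but to give a feel for the general shape of an MCP server, here is a deliberately simplified, generic sketch of how one data-query tool gets exposed to an AI agent. It uses Anthropic's MCP Python SDK; the AIS host, the payload details and the omission of authentication are all simplifications and assumptions, not our production code:

    import requests
    from mcp.server.fastmcp import FastMCP

    AIS_BASE = "https://ais.example.com:7080/jderest"   # placeholder AIS server

    mcp = FastMCP("jde-ais")

    @mcp.tool()
    def search_address_book(search_type: str) -> dict:
        """Search the JDE Address Book (F0101) by search type, e.g. 'C' for customers.
        The tool name and this description are what the AI uses to decide when to call it."""
        payload = {
            "targetName": "F0101",
            "targetType": "table",
            "findOnEntry": "TRUE",
            "maxPageSize": "50",
            "query": {
                "matchType": "MATCH_ALL",
                "condition": [{
                    "controlId": "F0101.AT1",   # Search Type alias
                    "operator": "EQUAL",
                    "value": [{"content": search_type, "specialValueId": "LITERAL"}],
                }],
            },
        }
        # Real implementation: attach the calling user's credentials/token here
        # so JDE row and column security applies to the result.
        resp = requests.post(f"{AIS_BASE}/v2/dataservice", json=payload, timeout=60)
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        mcp.run()   # serve the tool over MCP so Copilot, Claude, etc. can call it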

What Does This Actually Look Like?

Here are a few scenarios that are now possible:

Conversational queries: A finance controller asks their AI assistant: "Show me all customers in Australia with outstanding invoices over $10,000." The AI calls our MCP server, which handles the multi-table query, and returns a formatted summary. No JDE screens opened. No SQL written.

Report generation via chat: A sales manager says: "Generate a PDF sales report for January 2025." The AI finds the appropriate JDE batch report, submits it with the right parameters, waits for completion, and returns a download link.

Business function execution: A customer service rep asks: "Calculate the shipping cost for sales order 12345 with express delivery." The AI calls the appropriate JDE business function and returns the calculated freight amount — using JDE's own logic, so the numbers are correct.

Data discovery: A power user building a report asks: "What fields are in the Purchase Order header table?" The AI returns a list of fields with their business descriptions, making JDE's data model more accessible.

Why This Matters

JDE becomes AI-enabled without a replacement project. You don't need to migrate to a new ERP or wait for Oracle to build this. Your existing JDE investment gains AI capabilities today.

The UI becomes optional. Not every interaction needs to go through JDE screens. Business users can get what they need conversationally, through Teams, through Power Platform, through whatever interface makes sense for them.

Development time collapses. Integration projects that would have taken months can now be achieved in weeks. The MCP server handles the complexity — you just need to configure and connect.

Security stays intact. This isn't a backdoor into JDE. It's a secure, auditable extension that respects your existing security model.

The Bigger Picture

This is part of a broader shift we're seeing in how enterprises interact with their systems of record. The AI doesn't replace JDE — it makes JDE more accessible, more useful, and more integrated into modern workflows.

For JDE shops that have been worried about being "left behind" in the AI wave, this is significant. Your ERP can now participate in AI-driven processes alongside your newer cloud systems.


What's Next?

We're continuing to enhance the platform — multi-environment support, bulk operations, real-time event feeds, and tighter integrations with Power Platform and Microsoft Copilot are all on the roadmap.

If you're interested in learning more about what this could mean for your organisation, reach out to Fusion5's Innovation Labs. We're genuinely excited about where this is heading.


Shannon Moir is Director of AI at Fusion5. When not connecting legacy systems to futuristic AI, he can occasionally be found explaining to people that F0101 is actually a very sensible name for a table.

Tuesday, 24 June 2025

The risks of containerising JDE

To containerise, or not to containerise?

We've looked into containerising JDE for a long time. We've had it running in the lab and we've done extensive performance testing too... but we have struggled to make the leap when it comes to our customers' production environments. The technology should be supported, IMHO, but Oracle does not support it. They do not test any of their updates or patches on a containerised implementation. So would I risk my customers' uptime when I cannot get unequivocal support from my primary vendor (Oracle)? Probably not!

Also, you might want to critically evaluate your WebLogic licensing, as it can get expensive when deployed on the wrong cloud services.

Problem 1: WebLogic licences

1. Oracle Licensing Model for WebLogic

Oracle WebLogic is typically licensed in one of two ways:

  • Per Processor (CPU) License – based on the number of Oracle-licensed processor cores

  • Named User Plus (NUP) – based on the number of users, with minimums tied to processor count

When containerising, Per Processor is the model most affected.


How CPU Count is Calculated in Containers

Oracle’s policy is clear: Oracle does not recognise container limits as a licensing boundary unless you're using an Oracle-approved hard partitioning technology.

This means:

If you deploy WebLogic inside Docker or Kubernetes, Oracle may count all physical CPU cores on the host unless you use a licensing-compliant method to restrict it.

Example:

  • You run WebLogic in a container limited to 2 vCPUs on a VM with 64 cores.

  • Oracle may still require you to license all 64 cores, unless you use an approved virtualisation technology (like Oracle VM or physical partitioning on Oracle SPARC hardware).
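
The arithmetic is what hurts. A rough worked example, assuming Oracle's published core factor of 0.5 for x86 (check the current core factor table and your own contract before relying on this):

    # Soft partitioning (Docker/K8s CPU limits): Oracle counts the whole host
    host_cores = 64
    core_factor = 0.5        # assumed x86 factor - verify against Oracle's table
    container_vcpus = 2      # what WebLogic actually uses; irrelevant to the count

    licensable_processors = host_cores * core_factor
    print(licensable_processors)   # 32 processor licences, not the 1 you hoped for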


Oracle’s Stand on Virtualisation and Containers

Oracle’s Partitioning Policy document explicitly states:

"Oracle does not recognise soft partitioning (e.g., cgroups, Docker limits, Kubernetes node selectors) as a means to limit licensing requirements."

So:

  • Docker/K8s CPU limits do not restrict licensing scope

  • Hard partitioning (e.g., Oracle LDOMs, IBM PowerVM) is required to reduce licensable CPU

In summary:

  • Can you containerise WebLogic? Yes, technically, but licensing must be handled carefully.

  • How is CPU count calculated? Oracle counts all cores on the host unless hard-partitioned using approved methods.

  • What are the risks? Over-licensing, or non-compliance in an audit.

  • Best practice? Use OCI, or hard-partitioned environments. Avoid relying on Docker/K8s limits alone.


If you do not license WLS with Technology Foundation (and many customers do not), then you cannot use any public Kubernetes or Docker services, as their soft partitioning is not recognised by Oracle. That puts you at risk in a licence audit.

Given the above, you pretty much need to run Docker or Kubernetes on a dedicated host, which erodes much of the availability gain that containers promise.

Problem 2: You are running an unsupported architecture

I think the risks are most apparent with the latest features, especially the more technical components: zero-downtime package deployments, filesystem integrations and file-naming techniques, and a few other troublesome edge cases that need additional configuration and support. I'd only do it for a customer who didn't consider support important (one that had stopped paying maintenance, for example).

E1: OCI: Support Statement for Running Containerized JDE on Oracle Cloud Infrastructure (Doc ID 2421088.1)

... While the product development team will be available to actively collaborate with your “containerization of JD Edwards” project, we make no commitments right now that any issue that is specific to containerized deployment will be addressed under standard support model. In other words, if the issue cannot be replicated in a non-containerized environment, the product development team may or may not provide a fix for that...






Tuesday, 4 February 2025

Our AI infused JDE helper - can be yours

For a small monthly cost, we can load all of your JD Edwards manuals into our secure Azure-based vector DB and give you a personalised JDE AI assistant. Forget the old ways of providing training and use all of the assets that you already own.

Here is how it works - just have a turn:

https://capps-backend-7hl6h2whmhtla.jollyplant-40694b9e.australiaeast.azurecontainerapps.io/#/

It has a chat mode and an "ask a question" mode.

This is a really nice way of getting your JDE users to be better at prompting AI, which as we already know is an important life skill.  When you need to get better information, you'll get better at prompting.

Remember the RISEN acronym:

1. Role

Definition: Clarify the role or perspective the response should take. This can include specifying whether the prompt should be answered from the viewpoint of an expert, a neutral observer, or another defined persona. 

Example: For a prompt aimed at providing investment advice, the role might be defined as that of a financial advisor.

2. Instructions

Definition: Provide clear, direct instructions on what the prompt needs to accomplish. This typically involves stating explicitly what the response should include or address. 

Example: "List the top three risks of investing in emerging markets."

3. Steps

Definition: Outline the steps or the logical sequence in which the response should be structured. This helps in organizing the response in a coherent and logical manner. 

Example: "Start with a brief introduction to emerging markets, followed by a detailed analysis of each identified risk, and conclude with a summary."

4. End goal

Definition: Define the ultimate purpose or the actionable outcome expected from the prompt. This helps in aligning the prompt with the desired outcome or decision-making process. 

Example: "The end goal is to help an investor understand potential challenges in emerging markets to make an informed investment decision."

5. Narrowing

Definition: Narrow the focus of the prompt to avoid broad or overly general responses. This involves setting boundaries or constraints to hone in on the most relevant and specific information. 

Example: "Focus only on economic and political risks, excluding environmental factors."


Final Example Using RISEN

Prompt:

Role: As a financial advisor,

Instructions: provide an analysis of the current risks in investing in emerging markets.

Steps: Begin with a definition of what constitutes an emerging market. List and explain the top three economic and political risks. Use recent data to support your points and conclude with a brief summary of your analysis.

End goal: Enable potential investors to gauge whether investing in emerging markets aligns with their risk tolerance and investment goals.

Narrowing: Limit your discussion to economic and political risks; do not include social or environmental risks.

Final Prompt to the Model: 

"Assuming the role of a financial advisor, provide a comprehensive analysis of the current economic and political risks associated with investing in emerging markets. Start by defining 'emerging markets,' then identify and elaborate on the top three risks, supported by the most recent data. Conclude with a summary that helps potential investors understand these risks in the context of their personal investment strategies. Focus solely on economic and political aspects, excluding any social or environmental considerations."

My final prompt is WAY cooler. Look how I can coach the model to coerce any URLs it replies with into pointing at my specific JDE instance! It's like programming with words...

"Please be as comprehensive as you can be.  Assuming the role of a JDE administrator provide a comprehensive way of preventing users from being able to run certain applications in JDE. Start by describing the different types of security that are available in JD Edwards. conclude with options available to prevent users from running an application.   please provide a shortcut to the JDE work with user/role security application as part of your response.   If there are any URL's in what is returned that contains JDE, please substitute the domain component with https://f5dv.fusion5.cloud:443/jde/ShortcutLauncher?OID=<PROGRAM NAME>. Where <PROGRAM NAME> is the JDE application name, starting with a P."

This structured approach ensures the prompt is clear, focused, and aligned with the intended output, making it a powerful tool for guiding AI or any responsive system.
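
If you'd rather not rely on the model getting that substitution right every time, the same idea can be applied deterministically as a post-processing step on the response. A small illustrative sketch (the pattern and base URL mirror the prompt above; treat it as a starting point, not a finished utility):

    import re

    SHORTCUT_BASE = "https://f5dv.fusion5.cloud:443/jde/ShortcutLauncher?OID="

    def coerce_jde_links(answer: str) -> str:
        """Turn any JDE program reference (e.g. P00950) in the model's answer
        into a ShortcutLauncher link for our own instance."""
        return re.sub(r"\b(P\d{4,5}\w*)\b",
                      lambda m: SHORTCUT_BASE + m.group(1),
                      answer)

    print(coerce_jde_links("Use P00950 to work with user/role security."))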




Remember that you also have a pile of options for tuning the assistant, such as increasing the number of references it returns with each answer. There is also a way of packing in as much default prompt as your instance can handle, which is how I'd build up my instance.




If you've made it this far - let's be honest.  I think that you want your own!  Get in contact.