Showing posts with label JDeveloper. Show all posts

Tuesday, 22 March 2011

Installing Oracle XE 10g on Ubuntu 10.10 Server and VMWare Fusion 3.1

The following blog post is purely for my own purposes, to document how to install Oracle XE 10g on Ubuntu 10.10 Server as a VM under VMWare Fusion 3.1 on Mac OS X 10.6.6.

These instructions are gathered from numerous internet resources and much of the credit must be given to these authors for their superb guides. The main difference for my guide is the inclusion of screenshots which I prefer over text or video, and the occasional fix where the original instructions didn’t work for me. Reference to the external authors is given throughout this article.

Installing Ubuntu 10.10 Server on VMWare Fusion

Opening credit must go to Ted Wise for his XE on Mac guide. Ted's guide is very in-depth, detailing the exact options for installing Ubuntu's JeOS 8.04. Ubuntu doesn't appear to have a JeOS download for 10.10, so my instructions use the Ubuntu 10.10 Server .iso instead.

First download the Ubuntu “i386” 10.10 Server edition, as the Oracle XE .deb package downloaded later is “i386” too. I'm a little unsure why, but while there is an Ubuntu amd64 server release for 10.10, it's no use here: the amd64 version will run in VMWare Fusion, but when you later attempt to install the Oracle XE .deb package it will complain it's only for the i386 platform, which makes the amd64 Ubuntu unviable.

There’s obviously something I don’t know about the Ubuntu supported platforms and the related ISOs.

iiNet (local Aussie ISP) provides a handy mirror.

Once downloaded, open VMWare Fusion. From the menu select File -> New, which opens the New Virtual Machine Assistant wizard:

Select the “Continue without disc” button which opens the Installation Media page in the wizard:

Select the “Use operating system installation disc image file” radio button. This will open a select file dialog where you select the Ubuntu 10.10 Server i386 iso:

The previous dialog then presents the following options:

Selecting “Continue” presents the Operating System page. By default Linux and Ubuntu should already be selected under the respective Operating System and Version options:

Selecting “Continue” presents the Linux Easy Install options:

Note in the above picture we unselect the “Use Easy Install” option, as Easy Install would skip many of the options we want to configure when the Ubuntu installation starts in the VM. Selecting the “Continue” button will present the Finish page:

The default options for the VM are fine. However you can click the “Customize Settings” button to change them. Note that selecting either this button or the Finish button will display a save dialog asking you to name and place the VM file on the Mac OS X file system. The default location appears to be /Users/(your username)/Documents/Virtual Machines:

Note in the above picture I already have a number of other Ubuntu VMs that were previous trials.

Once you press Save the VM will start and the Ubuntu installer will flash through some startup screens, quickly arriving at the first option to select English as the preferred language:

At the next screen select the “Install Ubuntu Server” option:

For whatever reason we’re prompted for the language again, “English damn you, English”:

Then select your country:

Select No at the “Detect keyboard layout” option:

Select USA on the “Origin of the keyboard” screen:

Wow, Ubuntu loves its keyboard options. Select USA at the “Keyboard layout” screen:

Change the hostname to something more suitable at the Hostname prompt on the next screen, such as “oraclexe”:

At the timezone prompt assuming the right default has been picked, press Yes:

The next set of steps owes all its credit to Ted Wise’s instructions. As Ted notes, Oracle XE requires a Linux swap partition twice the size of the available RAM. This can be done post install, but it’s easier done now through the install screens with no typing required. On the first screen, titled “Partition disks”, select the Manual option:

On the next screen select the SCSI3 option representing the VMWare disk available to the VM:

The screen will warn you that we’re going to drop and recreate the partition table; select Yes at the prompt:

This returns to the previous screen where under the SCSI option you’ll see that there is an entry for the empty partition entitled “pri/log 21.5GB” which we select:

In the following screen select the “Create a new partition” option to create a new partition in the empty partition we just selected:

In the following screen downgrade the partition size from 21.5GB to 20GB. The remaining size will be used for the swap partition soon.

Select “Primary” to make this the primary partition:

Allow the partition to be created at the “Beginning” of the available space:

Select “Done setting up the partition” which completes the primary partition. Next we create the swap partition:

Returning to the main partition page, select the remaining “pri/log FREE SPACE” option, which will be used for the swap:

Again select the “Create a new partition” option:

Take the default 1.5GB partition space next which will allocate the remaining free space to the swap:

Again make this a Primary partition:

Select the “Use as” option as we want to change what the partition is used for:

Select “swap area” when prompted “How to use this partition”:

Then finally “Done setting up this partition”

Returning to the partitions screen select “Finish partitioning and write changes to disk”:

A final prompt warning you on your changes will display, select Yes:

At this point the installer will start copying and configuring files:

At the “Set up users and passwords” screen you have the chance to configure the primary non-root user. The first page prompts you for the user’s name, not the account name; however they can be the same. As seen in this screenshot “administrator” is entered:

On the next screen you enter the actual user account name; again “administrator” is entered in this screenshot:

Over 2 screens you’ll be asked to enter and confirm a password for the new account:

Choose not to encrypt the home directory:

Set the following options only if you have an HTTP proxy between the Mac and the internet; otherwise just select Continue. The install requires access to the internet, so it’s essential this is configured if required:

Now the installer will download and install additional files:

Select your preference at the screen prompting you how to apply security updates:

At the software selection page, as we want this to be a very small server install just to run Oracle XE, leave the software package selection empty:

At the GRUB Boot prompt select Yes:

Hurray!… the installation is complete:

On a reboot the VM will display the Ubuntu command line login:

On logging in using the administrator account created in the previous steps, force the Ubuntu Server to update itself using the APT package installer via the following command:

sudo apt-get update && sudo apt-get dist-upgrade

When prompted press Y

Finally reboot the server:

sudo shutdown -r now
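
As Oracle XE wants swap at twice the available RAM, after the reboot it's worth a quick sanity check that the swap partition came out as expected. Something along these lines is enough (just a check, nothing to configure):

free -m
swapon -s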

Install VMWare Tools

Here we depart from Ted's instructions. The Ubuntu Community documentation provides instructions for installing the VMWare Tools under the "Installing from Ubuntu package from VM-tools" heading. A number of the prescribed methods didn't work for me, including those requiring the VMWare Tools to be mounted via a virtual CD-ROM.

On logging in again as administrator, enter the following commands. Note the third command; as we've installed a UI-less Ubuntu Server we use this specific command (the Ubuntu documentation lists 2 options):

sudo apt-get install linux-headers-virtual
sudo apt-get install --no-install-recommends open-vm-dkms
sudo apt-get install --no-install-recommends open-vm-tools


Installing Oracle XE

First login as administrator and install the prerequisite libraries and packages:

sudo aptitude install libaio-dev
sudo apt-get install bc


Note there appears to be some documentation around indicating that if Oracle XE is installed under 64-bit Ubuntu, there are additional prerequisite libraries that must be installed, including "bc" and "ia32-libs".
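
I haven't tested the 64-bit route myself, but going by those guides the extra packages would be installed along the following lines, with the i386 .deb forced onto the amd64 platform - treat this as a hedged sketch only:

sudo apt-get install ia32-libs bc
sudo dpkg -i --force-architecture oracle-xe_10.2.0.1-1.0_i386.deb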

In the next step I had significant issues with the usual method most others use to download Oracle XE. As described in Ted's instructions, the typical manner to download Oracle XE is to add an entry to your /etc/apt/sources.list, use wget to retrieve the GPG key for the Oracle XE package and add it to the APT keyring, then finally download and install Oracle XE using APT.
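
For reference, that conventional approach looks roughly like the following; the repository line and GPG key URL are as commonly quoted in other guides, so double check them before relying on them:

echo "deb http://oss.oracle.com/debian unstable main non-free" | sudo tee -a /etc/apt/sources.list
wget -q http://oss.oracle.com/el4/RPM-GPG-KEY-oracle -O- | sudo apt-key add -
sudo apt-get update
sudo apt-get install oracle-xe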

Instead I came up with the following solution.

Via my browser I discovered the URL of the Oracle XE i386 deb package from the OTN web page was as follows:
http://download.oracle.com/otn/linux/oracle10g/xe/10201/oracle-xe_10.2.0.1-1.0_i386.deb

Still logged in as administrator, issue the following command changing the username and password to match your OTN username and password:

(Post edit: It appears the wget command can't deal with the licence prompt the Oracle website asks for. To solve this, on your Mac in your favourite browser, access the following page and accept the licence conditions. This will set your IP up to be allowed to download the software, such that both your Mac and the VMWare session can download the .deb file. Obviously another option is to download the .deb file onto your Mac, and then access it from your VMWare session.)

wget --user=(username) --password=(password) http://download.oracle.com/otn/linux/oracle10g/xe/10201/oracle-xe_10.2.0.1-1.0_i386.deb

This will download the XE .deb file. Once completed we can install the .deb file using dpkg (more information on .deb files and dpkg can be found via Chris Buckridge's page):

sudo dpkg -i oracle-xe_10.2.0.1-1.0_i386.deb

Ensure you complete the usermod step next, otherwise the administrator user will not be given the correct privileges to install and start the database after a reboot:

(Post edit: there's a mistake in my notes here. Either the following command must be entered now, or after the oracle-xe configure step next. Without reinstalling the whole VM it's currently hard for me to check this.)

sudo usermod -g dba administrator
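
Either way, you can confirm the account ended up in the dba group with a quick check:

groups administrator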

The next command configures and installs the database:

sudo /etc/init.d/oracle-xe configure

At the prompts:

1) Enter a port for Apex, the default being 8080
2) Enter a port for the Oracle Listener, the default being 1521.
3) Enter a password for the SYS/SYSTEM database accounts.
4) When prompted enter Y to allow Oracle XE to be started when the VM boots.

For reference /etc/default/oracle-xe is the configuration file which stores these options.

Once completed, edit the following file via vi or similar:

vi ~/.bashrc

At the end of the file enter the following:

ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
PATH=$PATH:$ORACLE_HOME/bin
export ORACLE_HOME
export ORACLE_SID=XE
export PATH
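
Then reload the profile and sanity-check the settings; for example:

. ~/.bashrc
echo $ORACLE_HOME
which sqlplus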


Finally we need to login to the database as system and allow remote access to the HTTP server:

sqlplus system/(password)
EXEC DBMS_XDB.SETLISTENERLOCALACCESS(FALSE);
quit;

Accessing the APEX homepage from the VM Host (not Guest)

Finally, to access the APEX homepage from the VM Host, on the guest issue the command "ifconfig", which will reveal the current IP of the VM guest, listed under the "eth0" "inet addr" entry, for example 192.168.197.131.
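
For example, the following on the guest prints just that line (the IP shown throughout is simply what my VM was allocated; yours will differ):

ifconfig eth0 | grep "inet addr"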

On the VM Host, return to your favourite browser and enter: http://192.168.197.131:8080/apex

..and the APEX home page should display. Ensure you can login using the SYSTEM account.

It's worth checking from a tool like JDeveloper installed under OSX that you can also access the database.

Voila.
Posted by Chris Muir 2 comments

Tuesday, 30 November 2010

Using ojdeploy and Ant for creating ADF library JARs

Projects and the Application itself in Oracle's JDeveloper 11g are capable of generating different deployment files at design time, including WAR files, EAR files, JAR files and other standard Java EE archive types. In addition JDev can generate a special JAR type specific to ADF development known as an ADF library. An ADF library differs from standard JARs in that the ADF library includes additional metadata files and constructs required for ADF application development.

To create an ADF Library JAR you first need to set up an associated ADF Library JAR deployment profile via the project properties. This is well documented in the Fusion Guide.

From here there are essentially 2 ways to actually generate the ADF Library JAR. The first is within the IDE via the project right-click deploy options, and is applicable to programmers developing in their JDev IDE.

The other option, particularly important to build environments, is the ojdeploy utility. This tool can be called via the command line or an Ant script, and is necessary for generating ADF Library JAR files with the required metadata files and other ADF constructs.
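
For example, a command-line invocation looks something along these lines; the paths and profile name here are placeholders rather than the exact syntax from my build:

ojdeploy -workspace /path/to/MyApp/MyApp.jws -project ViewController -profile adflib_ViewController -outputfile /path/to/deploy/adflib_ViewController.jar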

Considering specifically the Ant script option for ojdeploy, JDev will create a basic Ant script for building your applications via the New Gallery -> Ant -> Buildfile from Project option. In the dialog that displays, selecting the "Include Packaging Tasks (uses ojdeploy)" checkbox ensures the generated Ant script includes an ojdeploy target:

The resulting Ant file will look something like this:
<?xml version="1.0" encoding="UTF-8" ?>
<project name="ViewController" default="all" basedir=".">
  <property file="build.properties"/>
  <path>....snipped....</path>
  <target name="init">
    <tstamp/>
    <mkdir dir="${output.dir}"/>
  </target>
  <target name="all" description="Build the project" depends="deploy,compile,copy"/>
  <target name="clean" description="Clean the project">
    <delete includeemptydirs="true" quiet="true">
      <fileset dir="${output.dir}" includes="**/*"/>
    </delete>
  </target>
  <target name="deploy" description="Deploy JDeveloper profiles" depends="init,compile">
    <taskdef name="ojdeploy" classname="oracle.jdeveloper.deploy.ant.OJDeployAntTask" uri="oraclelib:OJDeployAntTask"
             classpath="${oracle.jdeveloper.ant.library}"/>
    <ora:ojdeploy xmlns:ora="oraclelib:OJDeployAntTask" executable="${oracle.jdeveloper.ojdeploy.path}"
                  ora:buildscript="${oracle.jdeveloper.deploy.dir}/ojdeploy-build.xml"
                  ora:statuslog="${oracle.jdeveloper.deploy.dir}/ojdeploy-statuslog.xml">
      <ora:deploy>
        <ora:parameter name="workspace" value="${oracle.jdeveloper.workspace.path}"/>
        <ora:parameter name="project" value="${oracle.jdeveloper.project.name}"/>
        <ora:parameter name="profile" value="${oracle.jdeveloper.deploy.profile.name}"/>
        <ora:parameter name="nocompile" value="true"/>
        <ora:parameter name="outputfile" value="${oracle.jdeveloper.deploy.outputfile}"/>
      </ora:deploy>
    </ora:ojdeploy>
  </target>
  <target name="compile" description="Compile Java source files" depends="init">
    <javac destdir="${output.dir}" classpathref="classpath" debug="${javac.debug}" nowarn="${javac.nowarn}"
           deprecation="${javac.deprecation}" encoding="UTF-8" source="1.6" target="1.6">
      <src path="src"/>
    </javac>
  </target>
  <target name="copy" description="Copy files to output directory" depends="init">
    <patternset id="copy.patterns">
      <include name="**/*.gif"/>
      <include name="**/*.jpg"/>
      <include name="**/*.jpeg"/>
      <include name="**/*.png"/>
      <include name="**/*.properties"/>
      <include name="**/*.xml"/>
      <include name="**/*.ejx"/>
      <include name="**/*.xcfg"/>
      <include name="**/*.cpx"/>
      <include name="**/*.dcx"/>
      <include name="**/*.sva"/>
      <include name="**/*.wsdl"/>
      <include name="**/*.ini"/>
      <include name="**/*.tld"/>
      <include name="**/*.tag"/>
      <include name="**/*.xlf"/>
      <include name="**/*.xsl"/>
      <include name="**/*.xsd"/>
    </patternset>
    <copy todir="${output.dir}">
      <fileset dir="src">
        <patternset refid="copy.patterns"/>
      </fileset>
    </copy>
  </target>
</project>
In particular notice the Ant targets: init, clean, deploy, compile and copy. Note within the deploy target a call to the ojdeploy utility.

Now you'd think, with these 5 targets generated by JDeveloper, that they are all related and have dependencies. You can even see the deploy target depends on the init and compile targets. Unfortunately this is misleading.

The ojdeploy utility written by Oracle is in fact capable of doing all the other targets' tasks. The utility takes care of creating and initializing the destination directories, it cleans and compiles the specified application or project, and it takes care of copying all the required files. In fact, if you leave the other targets in the Ant script they can interfere with the operation of the ojdeploy utility, so you should remove them. Instead you should have something like this:

<project name="ViewController" default="deploy" basedir=".">
  <property file="build.properties"/>
  <path>....snipped....</path>
  <target name="deploy" description="Deploy JDeveloper profiles">
    <taskdef name="ojdeploy" classname="oracle.jdeveloper.deploy.ant.OJDeployAntTask" uri="oraclelib:OJDeployAntTask"
             classpath="${oracle.jdeveloper.ant.library}"/>
    <ora:ojdeploy xmlns:ora="oraclelib:OJDeployAntTask" executable="${oracle.jdeveloper.ojdeploy.path}"
                  ora:buildscript="${oracle.jdeveloper.deploy.dir}/ojdeploy-build.xml"
                  ora:statuslog="${oracle.jdeveloper.deploy.dir}/ojdeploy-statuslog.xml">
      <ora:deploy>
        <ora:parameter name="workspace" value="${oracle.jdeveloper.workspace.path}"/>
        <ora:parameter name="project" value="${oracle.jdeveloper.project.name}"/>
        <ora:parameter name="profile" value="${oracle.jdeveloper.deploy.profile.name}"/>
        <ora:parameter name="outputfile" value="${oracle.jdeveloper.deploy.outputfile}"/>
      </ora:deploy>
    </ora:ojdeploy>
  </target>
</project>
If you've looked very carefully, you might have noticed I removed the nocompile option for ojdeploy. Bizarrely this boolean option effectively takes three forms: true, false, and the one you need to use to ensure the correct ADF constructs (such as task-flow-registry.xml) are added to the resulting ADF Library JAR, which is to remove the nocompile option completely.

(This is bug 9000629 – closed by Support, not considered a bug, but confusing and not intuitive by design)

On the Windows platform you need to be careful that any paths you include in the ojdeploy target within the Ant script (either directly or indirectly via a properties file) exactly match the case of the Windows file system, otherwise similar issues can occur.

(Another bug 10028816 – confirmed bug – fixed 11.1.1.4.0 – patch available for 11.1.1.2.0)
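
For example, in build.properties every path value needs to match the on-disk case exactly; the values below are placeholders only:

oracle.jdeveloper.ojdeploy.path=C:/Oracle/Middleware/jdeveloper/jdev/bin/ojdeploy.exe
oracle.jdeveloper.workspace.path=C:/JDeveloper/mywork/MyApp/MyApp.jws
oracle.jdeveloper.project.name=ViewController
oracle.jdeveloper.deploy.profile.name=adflib_ViewController
oracle.jdeveloper.deploy.dir=C:/JDeveloper/mywork/MyApp/ViewController/deploy
oracle.jdeveloper.deploy.outputfile=C:/JDeveloper/mywork/MyApp/ViewController/deploy/adflib_ViewController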

Finally there's also a bug in the ojdeploy workspace option in that it can't support the Ant ${user.dir} property. This isn't applicable to the example above but I thought it worth documenting.

(Bug 10028879 – confirmed bug – fixed 11.1.1.4.0 – patch available for 11.1.1.2.0)

Entirely separately from generating ADF Libraries, if you wish to use the ojdeploy utility to create an EAR via the workspace, you do this by dropping the project option, leaving the workspace, profile and outputfile options. If you do this under JDev 11.1.1.2.0 specifically you'll see the error message "Missing <workspace>, <project> or <profile> parameter in <deploy> element", caused by a bug in the ojdeploy utility, for which a patch is available.

(Bug 9135159 – confirmed bug – fixed 11.1.1.3.0 – patch available for 11.1.1.2.0)
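
For reference, with the project parameter dropped the <ora:deploy> element reduces to something like the following, reusing the same properties as above:

<ora:deploy>
  <ora:parameter name="workspace" value="${oracle.jdeveloper.workspace.path}"/>
  <ora:parameter name="profile" value="${oracle.jdeveloper.deploy.profile.name}"/>
  <ora:parameter name="outputfile" value="${oracle.jdeveloper.deploy.outputfile}"/>
</ora:deploy>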

Addendum

All the above was documented against JDev 11.1.1.2.0.
Posted by Chris Muir 3 comments

Tuesday, 9 November 2010

JDev: Programmatically capturing task flow parameters

We recently had the requirement to log all incoming and outgoing parameters from Bounded Task Flows (BTF) for JDeveloper 11g. Via the kind assistance of Simon Lessard and other OTN Forum helpers (to whom I'm very grateful) we were able to come up with the following solution. I share it here for others to benefit from Simon's advice.

Warning

As per the OTN thread, the following solution makes use of "internal" ADF libraries which Oracle gives no guarantee will not change in the future. The following code was tested under 11.1.1.2.0 and is assumed to also work in 11.1.1.3.0, yet you should check carefully that this code works in future versions; as we said, there's no guarantee it will perform as required. Also note Frank Nimphius has raised ER 10198616 to request a public API for the internal ADF libraries used in this solution.

Solution

Developers who are familiar with BTFs in JDev will know that they have initializer and finalizer properties which via EL can call whichever bean code we desire. The following code solution is simply the EL method end points which are called.
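
For context, the wiring in the bounded task flow's definition XML looks something like the following; the bean name and its pageFlowScope registration are example values only:

<task-flow-definition id="my-btf">
  <initializer>#{pageFlowScope.someClass.initializer}</initializer>
  <finalizer>#{pageFlowScope.someClass.finalizer}</finalizer>
  ....snipped....
</task-flow-definition>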

I won't bother to explain the code solution, it should be fairly self explanatory:

import java.util.HashMap;
import java.util.Map;

import javax.faces.application.Application;
import javax.faces.context.FacesContext;

import oracle.adf.controller.internal.metadata.NamedParameter;
import oracle.adf.view.rich.context.AdfFacesContext;


public class SomeClass {

    public void initializer() {
        Map taskFlowInputParameters = TaskFlowUtils.getCurrentTaskFlowInputParameters();
        logBtfParameters(taskFlowInputParameters);
    }

    public void finalizer() {
        Map taskFlowReturnParameters = TaskFlowUtils.getCurrentTaskFlowReturnParameters();
        logBtfParameters(taskFlowReturnParameters);
    }

    public void logBtfParameters(Map btfParameters) {
        HashMap taskFlowParameterValues = new HashMap();

        FacesContext facesContext = FacesContext.getCurrentInstance();
        Application application = facesContext.getApplication();
        AdfFacesContext adfFacesContext = AdfFacesContext.getCurrentInstance();
        Map pageFlowScope = adfFacesContext.getPageFlowScope();

        for (Object parameter : btfParameters.values()) {
            NamedParameter namedParameter = (NamedParameter)parameter;
            String parameterName = namedParameter.getName();
            String parameterExpression = namedParameter.getValueExpression();
            Object parameterValue;
            String stringValue;

            if (parameterExpression == null) {
                parameterValue = pageFlowScope.get(parameterName);
            } else {
                parameterValue = application.evaluateExpressionGet(facesContext, parameterExpression, Object.class);
            }

            if (parameterValue != null) {
                try {
                    stringValue = parameterValue.toString();
                } catch (Exception e) {
                    stringValue = "";
                }
            } else {
                stringValue = "";
            }

            taskFlowParameterValues.put(parameterName, stringValue);
        }
        // log the taskFlowParameterValues parameters somewhere
    }

}
The code above makes use of the following custom task flow utility class:

import java.util.Map;

import oracle.adf.controller.ControllerContext;
import oracle.adf.controller.TaskFlowContext;
import oracle.adf.controller.TaskFlowId;
import oracle.adf.controller.ViewPortContext;
import oracle.adf.controller.internal.metadata.MetadataService;
import oracle.adf.controller.internal.metadata.NamedParameter;
import oracle.adf.controller.internal.metadata.TaskFlowDefinition;
import oracle.adf.controller.internal.metadata.TaskFlowInputParameter;


/*
 * Note this class makes use of "internal" classes that Oracle preferred we didn't use (as there's no
 * guarantee they won't change). However as of JDev 11.1.1.2.0 there is no other solution for
 * retrieving task flow parameter names.
 *
 * See: http://forums.oracle.com/forums/thread.jspa?threadID=1556568&start=0&tstart=0
 *
 * Oracle has raised ER 10198616 to create a public API for the internal classes in this case.
 */
public class TaskFlowUtils {

    public static TaskFlowId getTaskFlowId() {
        ControllerContext controllerContext = ControllerContext.getInstance();
        ViewPortContext currentViewPort = controllerContext.getCurrentViewPort();
        TaskFlowContext taskFlowContext = currentViewPort.getTaskFlowContext();
        TaskFlowId taskFlowId = taskFlowContext.getTaskFlowId();

        return taskFlowId;
    }

    public static TaskFlowDefinition getTaskFlowDefinition(TaskFlowId taskFlowId) {
        assert taskFlowId != null;

        MetadataService metadataService = MetadataService.getInstance();
        TaskFlowDefinition taskFlowDefinition = metadataService.getTaskFlowDefinition(taskFlowId);

        return taskFlowDefinition;
    }

    public static Map getCurrentTaskFlowInputParameters() {
        return getInputParameters(getTaskFlowId());
    }

    public static Map getInputParameters(TaskFlowId taskFlowId) {
        assert taskFlowId != null;

        TaskFlowDefinition taskFlowDefinition = getTaskFlowDefinition(taskFlowId);
        Map taskFlowInputParameters = taskFlowDefinition.getInputParameters();

        return taskFlowInputParameters;
    }

    public static Map getCurrentTaskFlowReturnParameters() {
        return getReturnParameters(getTaskFlowId());
    }

    public static Map getReturnParameters(TaskFlowId taskFlowId) {
        assert taskFlowId != null;

        TaskFlowDefinition taskFlowDefinition = getTaskFlowDefinition(taskFlowId);
        Map namedParameters = taskFlowDefinition.getReturnValues();

        return namedParameters;
    }
}
Caveat

This code hasn't yet been extensively tested, and in general just shows the programmatic technique for others to understand. Internally we've already generalized this code further to suit our own use case. If you find any bugs or obvious errors with the technique, please leave a comment describing the problem you found and any solutions.
Posted by Chris Muir 0 comments

Wednesday, 3 November 2010

ADF UI Shell: Supporting global hotkeys for the activity tabs

We recently had the requirement to provide our users global hotkeys to switch between tabs within the ADF UI Shell (a.k.a. Dynamic Tab Shell) in JDev 11g. Luckily I caught a presentation at Open World this year where Frank Nimphius showed off support for creating global hotkeys in ADF applications. Frank was kind enough to give me his source code, allowing me to modify it to suit the ADF UI Shell specifically. The following code shows the general technique for getting global hotkeys working, as well as specific code to suit the ADF UI Shell implementation. The following code was built and tested under JDev 11.1.1.2.0.

ADF UI Shell Extension – global hotkeys

Readers familiar with the current incarnation of the ADF UI Shell will know it has the ability to spawn 1 to 15 "activity" tabs displaying whatever the programmer chooses. This provides an ideal framework for users to spawn multiple bounded task flows (BTFs) and work on several data sets all within the one browser window.

Our users have taken to the UI Shell and have been asking for extensions ever since. Of particular note they wanted the ability to open a set of activity tabs, and then flip between them using keyboard shortcuts rather than mouse clicks.

Frank's solution to the rescue.

Solution

The solution requires 3 working parts:

a) JavaScript
b) ViewScoped Managed Bean
c) Changes to the ADF UI Shell derived page to support the hotkeys to call "a" and "b"

The following code shows the general technique.

JavaScript
var keyRegistry = new Array();

keyRegistry[0] = "alt 1";
keyRegistry[1] = "alt 2";
keyRegistry[2] = "alt 3";
keyRegistry[3] = "alt 4";
keyRegistry[4] = "alt 5";
keyRegistry[5] = "alt 6";
keyRegistry[6] = "alt 7";
keyRegistry[7] = "alt 8";
keyRegistry[8] = "alt 9";
keyRegistry[9] = "alt 0";

// set by registerKeyBoardHandler and read later by callBack
var _serverListener;
var _document;

function registerKeyBoardHandler(serverListener, afdocument) {

    _serverListener = serverListener;
    var document = AdfPage.PAGE.findComponentByAbsoluteId(afdocument);
    _document = document;

    for (var i = keyRegistry.length - 1; i >= 0; i--) {
        var keyStroke = AdfKeyStroke.getKeyStrokeFromMarshalledString(keyRegistry[i]);
        AdfRichUIPeer.registerKeyStroke(document, keyStroke, callBack);
    }
}

function callBack(keyCode) {
    var activeComponentClientId = AdfPage.PAGE.getActiveComponentId();

    // Send the marshalled key code to the server listener for the developer
    // to handle the function key in a managed bean method
    var marshalledKeyCode = keyCode.toMarshalledString();

    // {AdfUIComponent} component Component to queue the custom event on
    // {String} name of serverListener
    // {Object} params a set of parameters to include on the event.
    // {Boolean} immediate whether the custom event is "immediate" -
    //           which will cause it to be delivered during Apply Request
    //           Values on the server, or not immediate, in which
    //           case it will be delivered during Invoke Application.

    // Note that if one of the keyboard functions is to create ADF
    // bound rows, immediate must be set to false. There is no
    // option yet for the ClientEvent to be queued for later -
    // InvokeApplication - on the server.
    AdfCustomEvent.queue(_document,
                         _serverListener,
                         {keycode:marshalledKeyCode,
                          activeComponentClientId:activeComponentClientId},
                         false);

    // indicate to the client that the key was handled and that there
    // is no need to pass the event to the browser to handle it
    return true;
}
You'll note the hotkey mapping only goes up to 10 (1 to 9 plus zero). The UI Shell does support 15 tabs, but I couldn't think of a nice key combination beyond the 10th... that can be left up to you.

ViewScoped Managed Bean
import java.util.List;

import javax.faces.component.UIComponent;
import javax.faces.context.FacesContext;
import javax.faces.event.PhaseEvent;
import javax.faces.event.PhaseId;

import oracle.adf.view.rich.render.ClientEvent;

import org.apache.myfaces.trinidad.render.ExtendedRenderKitService;
import org.apache.myfaces.trinidad.util.Service;

import waosr.ui.pattern.dynamicShell.TabContext;


public class KeyboardHandler {

    public void registerKeyboardMapping(PhaseEvent phaseEvent) {
        if (phaseEvent.getPhaseId() == PhaseId.RENDER_RESPONSE) {

            FacesContext facesContext = FacesContext.getCurrentInstance();
            ExtendedRenderKitService extRenderKitService =
                Service.getRenderKitService(facesContext, ExtendedRenderKitService.class);
            List<UIComponent> childComponents = facesContext.getViewRoot().getChildren();
            // First child component in an ADF Faces page - and the only child - is af:document
            // Thus no need to parse the child components and check for their component family type
            String id = ((UIComponent)childComponents.get(0)).getClientId(facesContext);

            StringBuffer script = new StringBuffer();
            script.append("window.registerKeyBoardHandler('keyboardToServerNotify','" + id + "')");
            extRenderKitService.addScript(facesContext, script.toString());
        }
    }

    public void handleKeyboardEvent(ClientEvent clientEvent) {

        String keyCode = (String)clientEvent.getParameters().get("keycode");

        // The alt+<number> keyboard combination opens the relating ADF UI Shell tab if open
        if (keyCode.equalsIgnoreCase("alt 1") || keyCode.equalsIgnoreCase("alt 2") || keyCode.equalsIgnoreCase("alt 3") ||
            keyCode.equalsIgnoreCase("alt 4") || keyCode.equalsIgnoreCase("alt 5") || keyCode.equalsIgnoreCase("alt 6") ||
            keyCode.equalsIgnoreCase("alt 7") || keyCode.equalsIgnoreCase("alt 8") || keyCode.equalsIgnoreCase("alt 9")) {

            String keyIndexStr = keyCode.substring(keyCode.length() - 1, keyCode.length());
            int keyIndex = Integer.parseInt(keyIndexStr);
            TabContext.getCurrentInstance().setSelectedTabIndex(keyIndex - 1);
        }
    }
}
Of specific importance, the handleKeyboardEvent() method makes use of the ADF UI Shell TabContext class to set the current activity tab according to the hotkey pressed.

ADF UI Shell page

<f:view xmlns:f="http://java.sun.com/jsf/core" beforePhase="#{viewScope.keyboardHandler.registerKeyboardMapping}"
        xmlns:h="http://java.sun.com/jsf/html" xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
  <af:document id="d1">
    <af:resource type="javascript" source="....snipped...."/>
    <af:serverListener type="keyboardToServerNotify"
                       method="#{viewScope.keyboardHandler.handleKeyboardEvent}"/>
    <af:pageTemplate value="#{bindings.pageTemplateBinding}" id="pt1">
      ....snipped....
    </af:pageTemplate>
  </af:document>
</f:view>

Note:

a) The beforePhase property in the view tag
b) The loading of the JavaScript via the resource tag
c) The serverListener that calls the bean method

How This Works

1) Before the page is rendered, JavaScript is added to call the registerKeyBoardHandler method in the attached JavaScript library when the page renders
2) The JavaScript method registers (but does not invoke) a server side event for each of the defined key combinations
3) Once the page is rendered, if the user presses one of the key combinations, the interaction of the registered JavaScript event *and* the serverListener in the page calls the bean's handleKeyboardEvent method
4) Finally the method calls the ADF UI Shell TabContext class to switch tabs based on the hotkey number

Caveat

Frank notes some minor cross-browser compatibility issues and certain hotkey combinations not working. Rather than highlighting the specific problems in this release, as this solution is reliant on JavaScript (the ever unreliable pain-in-the-butt that it is across different browser versions), readers are highly recommended to do their own cross-browser testing.
Posted by Chris Muir 1 comments

Tuesday, 26 October 2010

Closing Applications (Correctly) in JDeveloper

The following documents a somewhat small issue we're having with JDeveloper 11.1.1.2.0. At this time I'm unable to lodge an SR with Oracle Support as we don't understand the circumstances under which it occurs, thus we can't build the usual simple "Hello World" test case that Support so thrives on. However as we've identified the symptoms of the problem and the associated workaround, I'll publish them here for the benefit of others (including my team) so we don't hit the issue again.

Note this issue is only verified under JDev 11.1.1.2.0. I have no knowledge of whether the issue can be replicated in other JDev versions at this stage.

Issue Symptoms

From time to time after checking out an application from SVN inside JDeveloper:



....we'd discover JDev has incidentally modified one of the project files. In the following example the ViewController project is italicized meaning it has been modified:


Pressing "Save all" and then comparing the modified file against the previous revision reveals that the ViewController project has reverted back to the default settings, losing all the changes from the previous ViewController revision:


With the ViewController modified, the project can no longer compile (because among other things it's lost the attached libraries), and the change is effectively destructive. Whatever you do, don't check the ViewController project file back into SVN as it's just plain wrong.

Workaround #1

If you've arrived at this point where a project has been automatically modified by JDeveloper when checked out of SVN, the workaround in this situation is to Revert the project file back to the previous revision.

A Known Unknown Bug

In the example above I've identified the issue with a ViewController project in an ADF application. However note this problem isn't specific to any project type (e.g. Model, ViewController, whatever); we've experienced this issue across different projects in different applications from time to time. The converse is that there are plenty of projects for which this issue doesn't occur (e.g. in the above example, the Model project wasn't modified), so the problem has something to do with the specific project configuration where the error occurs. Unfortunately we're unsure what that specific configuration is. However once the problem is detected it is consistently replicable on the problematic project at hand.

Bug Preconditions

While we haven't been able to work out what configuration in the project files causes the bug, we are aware of a set of steps that do lead up to the bug.

On a day-to-day basis our programmers will have an application open in JDeveloper, synced with an SVN repository:


From time to time the programmer makes a decision that rather than syncing each individual file change coming from the SVN repository into the application, it's just easier to delete the local working copy of the application and check out a whole brand new copy. To do this the developer:

a) Closes JDeveloper
b) Identifies the working copy application directory on the filesystem
c) Deletes it
d) Opens JDeveloper
e) Checks out the same application from the SVN Repository into the *same* directory that the previous working copy was checked out to

It's at this point we see the issue described, but as mentioned, not for all application projects, just some projects, sometimes. However once this problem occurs, repeating steps "a" to "e" above reproduces the issue consistently.

From what we can see of step "e", if we check out to any other directory the problem isn't replicated.

What we found odd about this issue is if another developer on another machine checks out the same application, the said problematic project file wouldn't be automatically modified by JDeveloper. Yet this gave us the spark of an idea of what's going wrong and how to avoid it in the first place.

Workaround #2

You'll note in the "a" to "e" description above, the developer closes JDev, then deletes the application from the file system, then re-opens JDev. What we're not giving the JDev IDE here is a chance to recognize that one of the applications it had opened has been removed completely from the file system.

The IDE seems to be partially smart in that the application is removed from the application poplist at the top of the Application Navigator when we reopen JDeveloper. Yet if we then proceed to check out the same application as described in "a" to "e" we hit the described problem.

A workaround to the problem is, before closing the JDev IDE, to close the selected application via the Application menu's Close option. If this is done first the end problem is not seen. The conclusion is this gives JDev a chance to tidy up its internal state about which applications and projects are open, and somehow this avoids the bug we're experiencing.

Arguably the IDE should be able to handle this situation regardless. The fact that the bug is destructive to the project configuration is a major concern, especially if a junior programmer doesn't understand what's happening and checks in the changed project file regardless, but in the end following the workaround steps here avoids the issue in the first place.
Posted by Chris Muir 2 comments
