Archive for November, 2011
Spring / Mule / CheckExclusiveAttributesException: The attributes of Element … do not match the exclusive groups …
Case
I have the following block in my Mule config file:
<servlet:inbound-endpoint path="authenticationService" address="http://localhost:1234">
Even though this block matches the XSD constraints, it is illegal from a functional point of view.
I get the following stacktrace:
Offending resource: mule-config.xml; nested exception is org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected exception parsing XML document from class path resource [lalou/jonathan/mule-config.xml]; nested exception is org.mule.config.spring.parsers.processors.CheckExclusiveAttributes$CheckExclusiveAttributesException: The attributes of Element servlet:inbound-endpoint{address=http://localhost:1234, name=.authenticationService:inbound.157:inbound-endpoint.158, path=authenticationService} do not match the exclusive groups [address] [ref] [path]
Explanation and Fix
The exception is meaningful: on the tag <servlet:inbound-endpoint>, only one of the three following attributes may be set: path, ref or address.
To fix the issue, keep only the relevant attribute (here, path="authenticationService") and remove the others.
Spring: Failed to read schema document
Case
I am trying to deploy a Mule ESB configuration using this XML:
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns:pattern="http://www.mulesoft.org/schema/mule/pattern"
      xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/3.1/mule.xsd
                          http://www.mulesoft.org/schema/mule/pattern http://www.mulesoft.org/schema/mule/pattern/3.1/mule-pattern.xsd">

    <pattern:simple-service name="authenticationService"
                            address="http://localhost:1234/authenticationService"
                            component-class="lalou.jonathan.esb.components.AuthenticationComponent"
                            type="direct" />

</mule>
I get the following error:
Ignored XML validation warning org.xml.sax.SAXParseException: schema_reference.4: Failed to read schema document 'http://www.mulesoft.org/schema/mule/pattern/3.1/mule-pattern.xsd'
Extended Stacktrace
2011-11-22 16:10:25,375 WARN xml.XmlBeanDefinitionReader - Ignored XML validation warning
org.xml.sax.SAXParseException: schema_reference.4: Failed to read schema document 'http://www.mulesoft.org/schema/mule/pattern/3.1/mule-pattern.xsd', because 1) could not find the document; 2) the document could not be read; 3) the root element of the document is not <xsd:schema>.
	at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:195)
	at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.warning(ErrorHandlerWrapper.java:96)
	at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:380)
Notice that Mule ESB configuration files are similar to classic Spring files. Of course, I first checked that the referenced XSD was actually reachable.
In general, this error is raised when your application, for whatever reason (firewall, proxy, network interruption), cannot access the remote site where the XSD is hosted.
Fix
- Copy the XSD to a local folder.
- Create a file spring.schemas and make it available in the classpath, under META-INF.
- Add the following line to it:
http\://www.mulesoft.org/schema/mule/pattern/3.1/mule-pattern.xsd=WEB-INF/classes/mule-pattern.xsd
The pattern is: missing resource (beware of the escaped colon) = path of the local XSD in the classpath. A quick way to double-check the mapping from Java is sketched after this list.
- Rebuild, pack and run!
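Not part of the original post, just a hedged helper to make the mechanism visible: the sketch below (the class name and output format are mine) lists every spring.schemas file found on the classpath and the mappings each one declares, which lets you confirm that the redirection to the local XSD is actually picked up.
import java.io.InputStream;
import java.net.URL;
import java.util.Enumeration;
import java.util.Properties;

public class SpringSchemasDump {
    public static void main(String[] args) throws Exception {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        // Several jars (Spring, Mule modules...) ship their own spring.schemas,
        // so iterate over all of them rather than taking only the first one.
        Enumeration<URL> resources = cl.getResources("META-INF/spring.schemas");
        while (resources.hasMoreElements()) {
            URL url = resources.nextElement();
            System.out.println("Found: " + url);
            Properties mappings = new Properties();
            InputStream in = url.openStream();
            try {
                // The '\:' escapes are resolved at load time: keys are plain URLs.
                mappings.load(in);
            } finally {
                in.close();
            }
            for (String schema : mappings.stringPropertyNames()) {
                System.out.println("  " + schema + " -> " + mappings.getProperty(schema));
            }
        }
    }
}
If the mule-pattern URL does not show up with the expected local path, the spring.schemas file is not packaged where Spring looks for it.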
How to populate/insert/update a CLOB larger than 4000 or 32767 bytes?
A Short String
I have a table in which one field is a CLOB. Let’s say I have to insert one record containing a short text. The following command is allowed:
INSERT INTO jonathan_table VALUES (1, 'hello world!');
A Bigger Text
Error ORA-01704
Now, my text is larger, let’s say 5000 characters. When I launch the same query, I get the following error:
ORA-01704: string literal too long
Indeed, Oracle limits string literals in plain SQL to 4,000 bytes.
Workaround
To get around this limit of plain SQL, you have to use a PL/SQL block. The following will succeed for any text larger than 4,000 bytes but shorter than 32,767:
DECLARE
   bigtext1 VARCHAR2 (32767);
BEGIN
   bigtext1 := lpad('X', 32000, 'X');
   INSERT INTO jonathan_table VALUES (1, bigtext1);
END;
/
An Even Bigger Text
Errors ORA-06550 and PLS-00103
You guessed it: beyond this 32 KB limit, an error occurs. The following script:
DECLARE
   bigtext1 VARCHAR2 (42000);
BEGIN
   bigtext1 := lpad('X', 42000, 'X')
   INSERT INTO jonathan_table VALUES (1, bigtext1);
END;
/
raises the following error:
Error at line 1
ORA-06550: line 5, column 4:
PLS-00103: Encountered the symbol "INSERT" when expecting one of the following:
. ( * % & = - + ; < / > at in is mod remainder not rem
<an exponent (**)> <> or != or ~= >= <= <> and or like like2
like4 likec between || multiset member submultiset
The symbol ";" was substituted for "INSERT" to continue.
Fix this issue
I searched a long time for an easy way to get beyond the 32 KB limit. My point was that with Java, for instance, there is no such 32 KB limit; in the same way, with TOAD I was able to update the record with several megabytes of text via the clipboard. After further research, I learnt that the 32 KB barrier is a SQL*Plus limitation on literal strings, whereas the pattern insert into ... select ... from is not affected.
Here is the idea:
- create a temporary table
- split the text into blocks shorter than 32KB
- insert the blocks into the temporary table
- perform a first insert with a null CLOB
- update the record using a select on the temporary table (alternatively, you could insert the actual value directly at the previous step)
Here is an example:
DROP TABLE tt_jonathan_table;

CREATE GLOBAL TEMPORARY TABLE tt_jonathan_table (
   ID NUMBER(10),
   pdlsuffix CLOB
) ON COMMIT PRESERVE ROWS;

TRUNCATE TABLE tt_jonathan_table;

DECLARE
   bigtext1 VARCHAR2 (32767);
   bigtext2 VARCHAR2 (32767);
BEGIN
   bigtext1 := lpad('X', 32000, 'X');
   bigtext2 := lpad('Y', 32000, 'Y');
   INSERT INTO tt_jonathan_table VALUES (1, bigtext1);
   INSERT INTO tt_jonathan_table VALUES (2, bigtext2);
   INSERT INTO jonathan_table (id, myClobField) VALUES (jonathan_seq.NEXTVAL, NULL);
   UPDATE jonathan_table
      SET myClobField = (SELECT CONCAT(rls1.pdlsuffix, rls2.pdlsuffix)
                           FROM tt_jonathan_table rls1, tt_jonathan_table rls2
                          WHERE rls1.ID = 1 AND rls2.ID = 2)
    WHERE myClobField IS NULL;
END;
/

TRUNCATE TABLE tt_jonathan_table;
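For completeness, and to back the remark above that Java is not bound by these limits, here is a hedged JDBC sketch, not taken from the original post. The connection URL, credentials and generated text are placeholders; only the table, column and sequence names (jonathan_table, myClobField, jonathan_seq) come from the example above.
import java.io.StringReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ClobInsertWithJdbc {
    public static void main(String[] args) throws Exception {
        // Build about 100,000 characters of 'X': far beyond the 4,000 and 32,767 byte limits.
        String bigText = new String(new char[100000]).replace('\0', 'X');

        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:XE", "scott", "tiger");
        try {
            conn.setAutoCommit(false);
            PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO jonathan_table (id, myClobField) "
                            + "VALUES (jonathan_seq.NEXTVAL, ?)");
            // The text is sent as a bind variable, not as a SQL literal,
            // so the limits on literal length simply do not apply.
            ps.setCharacterStream(1, new StringReader(bigText), bigText.length());
            ps.executeUpdate();
            ps.close();
            conn.commit();
        } finally {
            conn.close();
        }
    }
}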
How to unit test Logger calls?
Case
Sometimes you need to test that Log4J loggers are called with the right parameters. How can you perform these tests with JUnit?
Let’s take an example: how do you test this simple class and method?
public class ClassWithLogger {

    private static final Logger LOGGER = Logger.getLogger(ClassWithLogger.class);

    public void printMessage(Integer foo) {
        LOGGER.warn("this is the message#" + foo);
    }
}
Example
Define an almost empty Log4J appender, such as:
public class TestAimedAppender extends ArrayList<String> implements Appender {

    private final Class clazz;

    public TestAimedAppender(Class clazz) {
        super();
        this.clazz = clazz;
    }

    @Override
    public void addFilter(Filter newFilter) { }

    @Override
    public Filter getFilter() { return null; }

    @Override
    public void clearFilters() { }

    public void close() { }

    @Override
    public void doAppend(LoggingEvent event) {
        add(event.getRenderedMessage());
    }

    @Override
    public String getName() {
        return "TestAppender for " + clazz.getSimpleName();
    }

    @Override
    public void setErrorHandler(ErrorHandler errorHandler) { }

    @Override
    public ErrorHandler getErrorHandler() { return null; }

    @Override
    public void setLayout(Layout layout) { }

    @Override
    public Layout getLayout() { return null; }

    @Override
    public void setName(String name) { }

    public boolean requiresLayout() { return false; }
}
Then create a TestCase with two fields:
public class ClassWithLoggerUnitTest {

    private ClassWithLogger classWithLogger;
    private TestAimedAppender appender;

    ...
}
In the setup, remove all existing appenders, create an instance of our appender, and add it to the logger of the class under test:
@Before
public void setUp() throws Exception {
    final Logger classWithLoggerLogger = Logger.getLogger(ClassWithLogger.class);
    classWithLoggerLogger.removeAllAppenders();
    appender = new TestAimedAppender(ClassWithLogger.class);
    classWithLoggerLogger.addAppender(appender);
    appender.clear();
    classWithLogger = new ClassWithLogger();
}
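A possible counterpart, not shown in the original post: a tearDown (using org.junit.After) that detaches the test appender, so it does not leak into other tests touching the same logger.
@After
public void tearDown() throws Exception {
    // Detach the test appender added in setUp().
    Logger.getLogger(ClassWithLogger.class).removeAppender(appender);
}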
Then write the following test. The code is documented:
@Test
public void testPrintMessage() throws Exception {
    final String expectedMessage = "this is the message#18";
    // empty the appender
    appender.clear();
    // check it is actually empty before any call to the tested class
    assertTrue(appender.isEmpty());
    // call to the tested class
    classWithLogger.printMessage(18);
    // check the appender is no longer empty
    assertFalse(appender.isEmpty());
    assertEquals(1, appender.size());
    // check the content of the appender
    assertEquals(expectedMessage, appender.get(0));
}
Conclusion
This basic example shows how to run assertions against a logger without modifying the original code or resorting to mocks. Of course, you can improve this basic example, for instance by discriminating according to the log level (INFO, WARN, ERROR, etc.), using generics, or adding any other refinement ;-).
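As a hedged illustration of the improvements suggested above (the class name and helper method are mine, not from the post): an appender based on Log4J's AppenderSkeleton that keeps the whole LoggingEvent, so tests can also assert on the level.
import java.util.ArrayList;
import java.util.List;

import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.Level;
import org.apache.log4j.spi.LoggingEvent;

public class LevelAwareTestAppender extends AppenderSkeleton {

    private final List<LoggingEvent> events = new ArrayList<LoggingEvent>();

    @Override
    protected void append(LoggingEvent event) {
        // Keep the full event, not only the rendered message.
        events.add(event);
    }

    /** Rendered messages logged at exactly the given level. */
    public List<String> messagesAt(Level level) {
        List<String> messages = new ArrayList<String>();
        for (LoggingEvent event : events) {
            if (event.getLevel().equals(level)) {
                messages.add(event.getRenderedMessage());
            }
        }
        return messages;
    }

    @Override
    public void close() {
        events.clear();
    }

    @Override
    public boolean requiresLayout() {
        return false;
    }
}
A test could then assert, for instance, that appender.messagesAt(Level.WARN) contains exactly the expected message while appender.messagesAt(Level.ERROR) is empty.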
Jonathan LALOU recommends… Karim CHORFI
I wrote the following recommendation on Karim CHORFI’s LinkedIn profile:
Working with Karim is a pleasure: he is always available, smiling, and has a sense of humor. So much for the human side; as for his professional abilities, his double skill set (finance and IT) enables him to handle a wide range of problems, as complex technically as they are functionally.
I warmly recommend Karim.