Just for some background: our cluster (HDP 2.3.4, not virtualized) uses a local KDC that has a trust established with our corporate KDC. That is, I can kinit as MyUserName@CorporateRealm and Hadoop treats me as if I were MyUserName@LocalKDCRealm. Works great, and keeps our users from having to enter their credentials when connecting, for example, to Hive over ODBC.
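For context, the client side of this kind of one-way trust looks roughly like the following in krb5.conf. The realm and KDC host names here are placeholders, and the exact [capaths] entries depend on how the trust was set up:

```ini
[libdefaults]
    default_realm = LOCALKDCREALM.EXAMPLE.COM

[realms]
    LOCALKDCREALM.EXAMPLE.COM = {
        kdc = local-kdc.example.com
    }
    CORPORATEREALM.EXAMPLE.COM = {
        kdc = corp-kdc.example.com
    }

# Clients in the corporate realm reach services in the local realm directly
# ("." means no intermediate realm in the trust path)
[capaths]
    CORPORATEREALM.EXAMPLE.COM = {
        LOCALKDCREALM.EXAMPLE.COM = .
    }
```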
I haven't quite found a way to configure this in Toad.
I've got things set up and working this way:
- Exported a keytab from the Hadoop cluster's local KDC, copied it to my laptop, and pointed Toad to it in the ecosystem configuration
- Entered the local KDC realm in the ecosystem configuration
- Entered the local KDC host in the ecosystem configuration
No problems there; everything works great.
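For what it's worth, I sanity-checked the exported keytab from the command line first; the principal, realm, and file names here are placeholders:

```shell
# List the principals stored in the exported keytab
klist -kt mykeytab.keytab

# Authenticate non-interactively using the keytab
kinit -kt mykeytab.keytab MyUserName@LOCALKDCREALM.EXAMPLE.COM

# Confirm a TGT was issued
klist
```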
But I'd much rather use my corporate KDC primarily because then I don't need to either export the local KDC's keytab or remember a separate password for that KDC. So here's what I do in ecosystem configuration:
- Uncheck 'Use Keytab'
- Replace the local KDC realm name with the corporate realm name
- Replace the local KDC host name with the corporate host name
- Enter my corporate password (is there a way for it to just inherit my existing credentials? It gets mad at me when I don't enter a password)
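At the command line, the equivalent of what I'm asking Toad to do would be something like this (realm name is a placeholder):

```shell
# Obtain a TGT from the corporate KDC (prompts for the corporate password)
kinit MyUserName@CORPORATEREALM.EXAMPLE.COM

# The credentials cache should now show a TGT for the corporate realm
klist
```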
It works (for now).
The problems start when I head to the 'SQL Configuration' screen. For starters, it overwrites the "Realm" name in the 'Hive2 JDBC Configuration' section with the corporate realm, when I need it to stay the local KDC realm. It does let me change that back to the proper local KDC realm name, but then 'Check State' fails with the following error message (Oozie URL edited out):
"
Oozie configuration:
A problem has occured. Server returned HTTP response code: 500 for URL: http://<OozieURL>:11000/oozie/
Possible cause:
- incorrect port or host address.
- Hadoop is not running
"
I get a similar error message when I try to hit that URL from a browser.
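The same check can be reproduced from the command line against Oozie's REST API (URL redacted as above); the second form is what a SPNEGO-protected Oozie would need:

```shell
# Oozie's admin status endpoint; a healthy server returns {"systemMode":"NORMAL"}
curl -i "http://<OozieURL>:11000/oozie/v1/admin/status"

# If Oozie is Kerberos/SPNEGO-protected, curl must present the ticket from the cache
curl -i --negotiate -u : "http://<OozieURL>:11000/oozie/v1/admin/status"
```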
Check State works fine when I use the local KDC keytab on the previous screen, however.
FYI - HDFS, Charts, and Logs all seem to work fine. Only SQL and Transfer fail, and both show the same error (about Oozie). There are no entries in the log4j.log file, and this is the only entry that shows up in the metadata log:
!ENTRY com.dell.tfh.library.hadoop.hdp2 4 0 2016-09-14 14:06:37.265
!MESSAGE FrameworkEvent ERROR
!STACK 0
java.io.IOException: Exception in opening zip file: C:\Users\<my username>\.eclipse\1428832045_win32_win32_x86_64\configuration\org.eclipse.osgi\22\14\bundleFile
at org.eclipse.osgi.framework.util.SecureAction.getZipFile(SecureAction.java:305)
at org.eclipse.osgi.storage.bundlefile.ZipBundleFile.basicOpen(ZipBundleFile.java:85)
at org.eclipse.osgi.storage.bundlefile.ZipBundleFile.getZipFile(ZipBundleFile.java:98)
at org.eclipse.osgi.storage.bundlefile.ZipBundleFile.checkedOpen(ZipBundleFile.java:65)
at org.eclipse.osgi.storage.bundlefile.ZipBundleFile.getEntry(ZipBundleFile.java:232)
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.findClassImpl(ClasspathManager.java:562)
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.findLocalClassImpl(ClasspathManager.java:540)
at org.eclipse.osgi.internal.loader.classpath.ClasspathManager.findLocalClass(ClasspathManager.java:527)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.findLocalClass(ModuleClassLoader.java:324)
at org.eclipse.osgi.internal.loader.BundleLoader.findLocalClass(BundleLoader.java:327)
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:402)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:352)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:344)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:160)
at java.lang.ClassLoader.loadClass(Unknown Source)
at com.google.common.collect.Iterators.<clinit>(Iterators.java:87)
at com.google.common.cache.LocalCache$2.iterator(LocalCache.java:1033)
at java.util.AbstractCollection.remove(Unknown Source)
at com.google.common.cache.LocalCache$Segment.removeValueFromChain(LocalCache.java:3271)
at com.google.common.cache.LocalCache$Segment.remove(LocalCache.java:3122)
at com.google.common.cache.LocalCache.remove(LocalCache.java:4206)
at com.google.common.cache.LocalCache$LocalManualCache.invalidate(LocalCache.java:4785)
at org.apache.hadoop.ipc.Client$Connection.close(Client.java:1160)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:987)
Caused by: java.io.FileNotFoundException: C:\Users\<my username>\.eclipse\1428832045_win32_win32_x86_64\configuration\org.eclipse.osgi\22\14\bundleFile (The system cannot find the file specified)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(Unknown Source)
at java.util.zip.ZipFile.<init>(Unknown Source)
at java.util.zip.ZipFile.<init>(Unknown Source)
at org.eclipse.osgi.framework.util.SecureAction.getZipFile(SecureAction.java:288)
... 23 more
Any tips?
Thanks,
JP