  • Calling a shell script from Java:

    Calling a shell script from Java:

    public static String bashCommand(String command) {
        Process process = null;
        // a String initialized to null would prepend "null" to the result below,
        // so a StringBuilder is used instead
        StringBuilder stringBack = new StringBuilder();
        List<String> processList = new ArrayList<>();
        try {
            process = Runtime.getRuntime().exec(command);
            BufferedReader input = new BufferedReader(new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = input.readLine()) != null) {
                processList.add(line);
            }
            input.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        for (String line : processList) {
            stringBack.append(line).append("\n");
        }
        return stringBack.toString();
    }

    But I ran into a problem: when the project is packaged as a jar (it is a desktop application), resources inside the jar cannot be referenced directly from the outside. For example, if a script is placed under resources and you obtain its path via getResource() and then run "bash <path>", it fails, because the script file lives inside the jar. The workaround I came up with: when the script needs to run, first obtain an input stream via getClass().getClassLoader().getResource(...).openStream(), create a new file outside the jar, copy the script into it through the stream, execute the new file, and delete it when done.
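    The extract-and-run workaround described above can be sketched roughly as follows. This is a minimal sketch, not the original author's code: the class name is made up, and in main a ByteArrayInputStream stands in for the jar resource — in a real jar you would pass getClass().getClassLoader().getResourceAsStream("myscript.sh").

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class ResourceScriptRunner {

    // Copy a script supplied as an InputStream (e.g. a jar resource stream)
    // to a temp file outside the jar, run it with bash, delete it afterwards.
    public static String runScript(InputStream script) throws IOException, InterruptedException {
        Path tmp = Files.createTempFile("script", ".sh");
        try {
            Files.copy(script, tmp, StandardCopyOption.REPLACE_EXISTING);
            Process p = new ProcessBuilder("bash", tmp.toString()).start();
            StringBuilder out = new StringBuilder();
            try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    out.append(line).append('\n');
                }
            }
            p.waitFor();
            return out.toString();
        } finally {
            Files.deleteIfExists(tmp); // remove the extracted copy when done
        }
    }

    public static void main(String[] args) throws Exception {
        // A ByteArrayInputStream stands in for the jar resource here.
        InputStream demo = new ByteArrayInputStream("#!/bin/bash\necho hello".getBytes());
        System.out.print(runScript(demo)); // prints "hello"
    }
}
```

    Running the script through `bash <file>` also sidesteps the need to set the executable bit on the extracted temp file.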

    Another odd thing happened: some commands clearly produce output in a shell, yet the function above kept returning null. I settled on a clumsy workaround: write a shell script that assigns the command's output to a variable and then echoes that variable:

    #!/bin/bash

    ip=$(ifconfig | grep "inet 192*")

    echo $ip
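    A likely explanation for the null result: Runtime.getRuntime().exec(String) does not run the command through a shell, so shell syntax such as pipes, globs, and redirection is never interpreted — which is why wrapping the pipeline in a script works. An alternative to a wrapper script is to hand the whole command line to bash -c. A minimal sketch (the class name and the example pipeline are illustrative, not from the original post):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class BashPipe {

    // exec(String) splits the command on whitespace and runs it directly,
    // so "|" never reaches a shell. Passing the full command line to
    // "bash -c" restores pipes, globs and redirection.
    public static String runInShell(String command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("bash", "-c", command).start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // A pipeline that returns nothing when passed straight to exec():
        System.out.print(runInShell("printf 'a\\nb\\n' | grep b")); // prints "b"
    }
}
```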

  • Executing script statements directly on Linux: 1. Log in as the oracle user: su - oracle -c 2. Go to the /home/app/oracle/product/11.2.0/db_1/bin/ bin path and run su - oracle -c &...

    Script contents:

    I. Export

    1. Database mode (the entire database):

    2. User (owner) mode:

    EXP CFG/CFG@ZQCREDIT BUFFER=64000 FILE=C:\CFG.DMP OWNER=CFG

    3. Table mode:
    exp bsd/bsd@JSCREDIT_142 file=D:\数据文件备份\JSCREDIT_BSD_20151231_2.dmp log=D:\数据文件备份\JSCREDIT_BSD_20151231_2.log TABLES=(CORP_ATTEND_DETAIL,CORP_PRESON_FINGERIMG)

    Without a configured tnsnames.ora entry:

    exp bsd/bsd@IP/DBNAME file=D:\数据文件备份\JSCREDIT_BSD_20151231_2.dmp log=D:\数据文件备份\JSCREDIT_BSD_20151231_2.log TABLES=(CORP_ATTEND_DETAIL,CORP_PRESON_FINGERIMG)

    Restricting the rows exported:

    EXP name/pwd@MYDB FILE=D:\XXX.DMP LOG=D:\XXX.LOG TABLES=(DW_MDY_QYSPECIALCHECKINFO, DW_MDY_QYBADCREDITINFO) query="'where rownum<=100'"

    II. Import
    1. Log in to Linux as oracle:
    su - oracle -c
    2. Go to the /home/app/oracle/product/11.2.0/db_1/bin/ bin path and run:
    su - oracle -c "/home/app/oracle/product/11.2.0/db_1/bin/impdp username/password directory=DUMP1 dumpfile='dump file path' remap_schema=SBOD_WBEP:DATA_IMPORT remap_tablespace=SBOD_WBEP:DATA_IMPORT logfile='log file path'"

    imp import mode: IMP NAME/PASSWD@DBSHILIE FILE=D:\path\DW.DMP BUFFER=64000 TABLES=(table1,table2)

    Code snippet:

    package com.tydic.cia.fileimport;

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.UnsupportedEncodingException;
    import java.util.Date;

    import ch.ethz.ssh2.Connection;
    import ch.ethz.ssh2.Session;

    public class RemoteShellTool {
    private Connection conn;
    private String ipAddr;
    private String charset = "utf-8";
    private String userName;
    private String password;
    public RemoteShellTool(String ipAddr, String userName, String password, String charset) {
    super();
    this.ipAddr = ipAddr;
    this.userName = userName;
    this.password = password;
    this.charset = charset;
    }

    public boolean login() throws IOException {
        conn = new Connection(ipAddr);
        conn.connect();
        return conn.authenticateWithPassword(userName, password);
    }
    
    public String exec(String cmd) {
        InputStream in = null;
        String result = "";
        try {
            if (this.login()) {
                Session session = conn.openSession();
                session.execCommand(cmd);
                in = session.getStdout();
                result = this.processStdout(in, this.charset);
                // fall back to stderr if the command produced no stdout
                if (result == null || "".equals(result.trim())) {
                    in = session.getStderr();
                    result = this.processStdout(in, this.charset);
                }
                session.close();
                conn.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return result;
    }
    
    public String processStdout(InputStream in, String charset) {
        byte[] buf = new byte[1024];
        StringBuffer sb = new StringBuffer();

        try {
            int len;
            while ((len = in.read(buf)) != -1) {
                // decode only the bytes actually read; decoding the whole
                // buffer would append stale data from earlier iterations
                sb.append(new String(buf, 0, len, charset));
            }
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return sb.toString();
    }
    
    /**
     * @param args
     */
    public static void main(String[] args) {
        // RemoteShellTool tool = new RemoteShellTool("ip", "oracle", "oracle@clean", "utf-8");
        RemoteShellTool tool = new RemoteShellTool("ip", "root", "@admin", "utf-8");
        String cmd = "/home/oracle/myshell.sh";
        long start = System.currentTimeMillis();
        System.out.println(new Date());
        String result = tool.exec(cmd);
        System.out.println("tes....." + System.currentTimeMillis());
        long end = System.currentTimeMillis();
        System.out.println((end - start) + "ms");
        System.out.println(result);
    }
    

    }

  • Calling a shell script from Java to run a Spark job — using Runtime.getRuntime().exec() to invoke a shell script; a spark job launched from the script fails with: 19/05/15 15:46:47 WARN StandaloneAppClient$ClientEndpoint: Failed to ...

    Calling a shell script from Java to run a Spark job

    When Java invokes a shell script via Runtime.getRuntime().exec() and the script launches a spark job, the job fails with the following error:

    19/05/15 15:46:47 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 192.168.9.2:7077
    org.apache.spark.SparkException: Exception thrown in awaitResult: 
    	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
    	at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
    	at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
    	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.RuntimeException: java.io.StreamCorruptedException: invalid stream header: 01000B31
    	at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:857)
    	at java.io.ObjectInputStream.<init>(ObjectInputStream.java:349)
    	at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:63)
    	at org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:63)
    	at org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:122)
    	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:107)
    	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
    	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)
    	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258)
    	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257)
    	at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:577)
    	at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:562)
    	at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:159)
    	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:107)
    	at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
    	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
    	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
    	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
    	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
    	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
    	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
    	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
    	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    	at java.lang.Thread.run(Thread.java:748)
    
    	at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:207)
    	at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:120)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
    	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
    	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
    	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    	... 1 more
    

    The cause: a script launched from Java runs with environment variables that differ from the system's (login shell's), so they must be reloaded. In the script, before invoking the spark job, load the environment variables: . ./.bash_profile

    This is analogous to scheduling spark jobs with crontab, which also requires loading the environment variables, because crontab's environment differs from the system's.

    Background:
    https://stackoverflow.com/questions/2388087/how-to-get-cron-to-call-in-the-correct-paths

    Of course, this error can also be caused by starting Spark with --master spark://host1:7077 written with a hostname rather than an IP address.
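    As an alternative to sourcing .bash_profile inside the script, ProcessBuilder can inject the missing variables into the child process directly. A minimal sketch — SPARK_HOME and the /opt/spark paths below are placeholder values for illustration, not a real installation:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.Map;

public class EnvAwareExec {

    // The child of exec() inherits the JVM's environment, which may lack the
    // variables (e.g. SPARK_HOME, extra PATH entries) a login shell would set.
    // ProcessBuilder lets the caller add them before starting the process.
    public static String run(String... command) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        Map<String, String> env = pb.environment();
        env.put("SPARK_HOME", "/opt/spark");                  // hypothetical install path
        env.put("PATH", env.get("PATH") + ":/opt/spark/bin"); // make spark-submit resolvable
        pb.redirectErrorStream(true);                         // merge stderr into stdout
        Process p = pb.start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // The child sees the injected variable even though the JVM's own
        // environment may not define it.
        System.out.print(run("bash", "-c", "echo $SPARK_HOME")); // prints "/opt/spark"
    }
}
```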

  • A Java backend calls a shell script on the server; an scp command in the script prints the notice "Authorized users only. All activity may be monitored and reported." This is an inter-server login notice, but it gets captured by the Java caller as ...

    I recently worked on a small project, mainly about data processing.

    A Java backend calls a shell script on the server, and an scp command in the script triggers the notice

    Authorized users only. All activity may be monitored and reported.

    This is an inter-server login notice, but when Java invokes the script, the notice is captured in the error stream, which makes the program report a failure.

    Fix: add the -q flag to the scp command in the shell script.

    Command: scp -q zhenxiang.tar.gz cidata@192.100.9.5:/usr/local/tomcat-consistency/uploadFile/mail

    With this flag the "Authorized users only. All activity may be monitored and reported." notice no longer appears.
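    On the Java side, a complementary safeguard is to judge success by the exit status rather than by the presence of stderr output, since banners like the one above are written to stderr even when the command succeeds. A minimal sketch (class name and the banner text in main are illustrative):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ExitCodeCheck {

    // Treating any stderr output as failure misfires on harmless login
    // banners. Checking the exit status is more robust; stderr is still
    // drained so the child cannot block on a full pipe buffer.
    public static int run(String... command) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command).start();
        drain(p.getInputStream());  // stdout
        drain(p.getErrorStream());  // stderr: read it, but do not treat it as failure
        return p.waitFor();
    }

    private static void drain(InputStream in) throws IOException {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            while (r.readLine() != null) { /* discard */ }
        }
    }

    public static void main(String[] args) throws Exception {
        // Writes a banner to stderr but exits 0 -- still a success.
        int status = run("bash", "-c", "echo 'Authorized users only.' >&2; exit 0");
        System.out.println("exit status: " + status); // prints "exit status: 0"
    }
}
```

    For commands with large output, drain the two streams on separate threads or call redirectErrorStream(true) instead; reading them one after the other, as here, can block when stderr output is voluminous.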

    Hope this helps.

    — a programmer who can't really code

  • Running shell scripts from Java code: sqlldr data loading, and executing stored procedures via sqlplus called from a shell script. The Linux environment has two DBA users, oracle and erm, plus one web user, erm; in that environment all three users can run shell ... directly from any directory.
  • Calling shell commands and scripts from Java

    1,000+ reads  2017-07-23 15:03:21
    Calling shell commands and scripts from Java: http://blog.csdn.net/u010376788/article/details/51337312    Using ProcessBuilder in Java: http://shensy.iteye.com/blog/1756756    A deep dive into java.lang.Process &...
  • Calling shell scripts from Java

    Hot discussion  2010-10-29 21:08:20
    Calling shell scripts from Java
  • A Java application invokes a local shell command; while the shell runs, it needs to accept input for the script's subsequent logic. **Script demo** ```shell #!/bin/sh echo "What's your name? : " read name echo "Hello $name!" ...
  • Some projects occasionally need to invoke shell scripts or commands from Java. I looked through quite a few posts online and found several approaches; here I record the ones I have used and tested. Add to the pom file: <!-- dependency for remote ssh invocation --> <...
  • Today I am sharing an example of invoking shell commands from Java and capturing the execution result. It is a useful reference; I hope it helps.
  • Calling shell scripts from Java

    2018-06-28 18:30:39
    A program that calls shell scripts from Java, including a shell script that connects to sftp and a brief walkthrough, for reference only.
  • Shell scripts are powerful yet simple for text processing and system administration, and embedding them in an application is a quick way to get things done. This article shows how to run shell scripts from Java code. I wrote jbossLogDelivery.sh under ~/bin/ with two functions {./...
  • ProcessBuilder processBuilder = new ProcessBuilder(command); processBuilder.redirectErrorStream(true);
  • Calling shell scripts and Windows commands from Java

    1,000+ reads  2017-05-12 20:47:25
    package com.autotest.util... import java.io.BufferedReader; import java.io.InputStreamReader; import org.apache.log4j.Logger; public class ShellUtil {  private static Logger logger = Logger.getLogg
  • I. Project requirement: obtain certificates from an agency; the agency provides a small tool... Solution: invoke the shell commands from Java, executed as soon as the Spring container starts. Reference post: http://zohan.iteye.com/blog/1709136 Project structure: Source code: 1. RuntimeUt...
  • When calling shell scripts or commands from Java, two approaches are available: 1. Runtime.getRuntime().exec(); 2. dispatching through ProcessBuilder. The first is more direct; for usage details see...
  • Calling shell scripts from JAVA

    2019-02-09 15:13:21
    Calling shell scripts from JAVA
