  •  During recent load-test runs I have often watched testers hunt for the performance-test knee point very inefficiently, so this article explains how to locate it quickly. The knee point is the concurrency at which average response time keeps growing while TPS stops rising or even falls...

    During recent load-test runs I have often watched testers hunt for the knee point very inefficiently, so this article explains how to locate the knee point of a performance test quickly.
    The knee point is the concurrency at which average response time keeps growing, TPS stops rising or even falls, and the error rate climbs; that number of concurrent users is the knee point of the test case.

    The point of finding the knee is to check whether, at that concurrency, the system's average response time, TPS and error rate still meet the performance requirements. If they do, that concurrency is the maximum number of concurrent users the system can sustain while meeting them. To judge whether it covers the expected load, relate it to the total and online user counts, roughly as follows:
    online users = total users × 20%
    concurrent users = online users × 30%
    For example, with 10,000 total users there would be about 2,000 online users and 600 concurrent users.
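As a sanity check, the rule-of-thumb arithmetic above is easy to script (the 20%/30% ratios are this article's heuristics, not universal constants):

```python
def estimate_concurrency(total_users, online_ratio=0.2, concurrent_ratio=0.3):
    """Rule-of-thumb estimate: online users from total users, then concurrent from online."""
    online = total_users * online_ratio
    concurrent = online * concurrent_ratio
    return online, concurrent

online, concurrent = estimate_concurrency(10000)
print(online, concurrent)  # 2000.0 600.0
```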

    I. Script development

    1. First, how to develop a performance test script that runs efficiently. Most people run each concurrency level as a separate, one-off execution; that is slow and makes comparing results awkward, as shown below:

    [screenshot]
    2. Develop the test case first, then copy it into multiple thread groups, changing only the thread count and the case name in each copy so the name matches the thread count. The results will then distinguish the same case at different concurrency levels, which makes comparison easy.
    [screenshot]
    3. If several requests together implement one use case, simply place them all under a Transaction Controller; this yields one aggregated result (combined response time, TPS, etc.) for the case.
    [screenshot]
    4. Finally, remember to tick "Run Thread Groups consecutively" on the Test Plan, so the thread groups run one after another at 10, 20, 30, 50 threads until the test ends.
    [screenshot]
    II. Running the test
    1. Run performance tests from the command line; driving the load from the GUI is not recommended. The command:

    jmeter -n -t rps.jmx -l summary.jtl -e -o report

    [screenshot]
    2. In this command, summary.jtl collects the raw results for the aggregate report, and report is the output directory for JMeter's built-in HTML report, which contains a variety of charts.
    [screenshot]
    III. Analyzing the results
    1. When the run finishes, double-click index.html to open the report.
    [screenshot]
    2. The summary shows that as concurrency grows, average response time and error rate both climb; TPS also rises with the user count, peaks at 500 users, and drops back at 1000.
    [screenshot]
    3. The Response Times Over Time chart shows average response time rising steadily with concurrency; the largest jump comes when moving from 500 to 1000 concurrent users.
    [screenshot]

    4. The Transactions Per Second chart shows TPS rising all the way from 10 to 400 concurrent users, then slowly declining from 500 to 1000.
    [screenshot]
    5. Response Time Percentiles shows that at 1000 concurrent users only 2.5% of transactions finish in under 5 seconds; the rest take longer, and 50% take more than 7 seconds. For a simple query, users will not accept such response times.
    [screenshot]
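Percentile figures like these can be recomputed from raw response times. A minimal nearest-rank sketch (the sample values below are made up, not this test's data):

```python
import math

def percentile(values, p):
    """Nearest-rank percentile: smallest value such that at least p% of samples are <= it."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical response times in milliseconds
samples = [120, 150, 180, 200, 240, 300, 450, 800, 1200, 5600]
print(percentile(samples, 50))  # 240
print(percentile(samples, 90))  # 1200
```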
    6. Hits Per Second shows requests per second rising as concurrency grows from 10 to 400, then falling rather than rising from 500 to 1000.
    [screenshot]

    For the concurrency scenario in this article the knee point is 500 concurrent users; for other cases, analyze the actual test results in the same way.


  • JMeter practice: confirming the knee point

    2020-01-15 11:53:32

    1. Use case

    Finding the maximum number of concurrent users. A common formula is c = nL/T, where n is the number of users who access the system per day, L is the average time an online user spends from login to logout, and T is roughly the length of time the system is in use each day. Comparing the maximum user count found by testing with the value computed from this formula tells you whether the system's performance is adequate.
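Plugging illustrative numbers into the formula (all values below are invented for the example):

```python
def avg_concurrent_users(n, session_hours, period_hours):
    """c = n*L/T: n daily users, average session length L, over a period of length T."""
    return n * session_hours / period_hours

# e.g. 4000 users per day, 0.5 h average session, spread over a 10 h business day
print(avg_concurrent_users(4000, 0.5, 10))  # 200.0
```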

    2. Script

    The test targets one particularly frequently used interface, as shown:

    [screenshot]

    (1) Number of threads: the concurrency. Each thread runs the test plan completely independently, without interfering with the others.
    (2) Ramp-up period: the time in which to start all threads. With 10 threads and a ramp-up period of 100 seconds, JMeter takes 100 seconds to start the 10 threads, each one starting 10 (100/10) seconds after the previous.
    (3) Loop count: how many times each thread iterates before the thread group ends. If set to 1, JMeter executes once per thread before the plan stops; if "Forever" is checked it keeps running (and if a duration is set below, it loops for that duration and then stops).
    (4) Duration: bounds how long the test runs. If the loop count is 1 and one pass finishes sooner than the duration, the thread group still stops.
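The ramp-up arithmetic in (2) can be made concrete (thread count and ramp-up below are just the example's numbers):

```python
def thread_start_offsets(threads, ramp_up_seconds):
    """Offsets at which JMeter starts each thread when spreading starts evenly over the ramp-up."""
    step = ramp_up_seconds / threads
    return [i * step for i in range(threads)]

print(thread_start_offsets(10, 100))  # [0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0, 90.0]
```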

    Note: this thread group uses a Transaction Controller. When several requests in the group implement one use case, the Transaction Controller produces a single aggregated result.

    D:\software\apache-jmeter-3.3\bin>jmeter -n -t 拐点测试.jmx -l summary.jtl -e -o report
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/D:/software/apache-jmeter-3.3/lib/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.clas
    SLF4J: Found binding in [jar:file:/D:/software/apache-jmeter-3.3/lib/ext/jmeter-plugins-dubbo-1.3.6-jar-with-dependencies.jar!/org/slf4
    SLF4J: Found binding in [jar:file:/D:/software/apache-jmeter-3.3/lib/ext/jmeter-test.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    Creating summariser <summary>
    Created the tree successfully using 拐点测试.jmx
    Starting the test @ Wed Jan 15 10:58:47 CST 2020 (1579057127796)
    Waiting for possible Shutdown/StopTestNow/Heapdump message on port 4445
    summary +   1856 in 00:00:12 =  156.4/s Avg:    14 Min:     8 Max:    94 Err:     0 (0.00%) Active: 7 Started: 7 Finished: 0
    summary +  16179 in 00:00:30 =  539.3/s Avg:    18 Min:    10 Max:    55 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary =  18035 in 00:00:42 =  430.8/s Avg:    17 Min:     8 Max:    94 Err:     0 (0.00%)
    summary +  15647 in 00:00:30 =  521.5/s Avg:    18 Min:    11 Max:    53 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary =  33682 in 00:01:12 =  468.7/s Avg:    18 Min:     8 Max:    94 Err:     0 (0.00%)
    summary +  16791 in 00:00:30 =  559.7/s Avg:    17 Min:    12 Max:    35 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary =  50473 in 00:01:42 =  495.5/s Avg:    17 Min:     8 Max:    94 Err:     0 (0.00%)
    summary +  15632 in 00:00:30 =  521.1/s Avg:    19 Min:    11 Max:    82 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary =  66105 in 00:02:12 =  501.3/s Avg:    18 Min:     8 Max:    94 Err:     0 (0.00%)
    summary +  16662 in 00:00:30 =  555.2/s Avg:    17 Min:    10 Max:    50 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary =  82767 in 00:02:42 =  511.3/s Avg:    18 Min:     8 Max:    94 Err:     0 (0.00%)
    summary +  14862 in 00:00:30 =  495.6/s Avg:    20 Min:    11 Max:   146 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary =  97629 in 00:03:12 =  508.8/s Avg:    18 Min:     8 Max:   146 Err:     0 (0.00%)
    summary +  16667 in 00:00:30 =  555.5/s Avg:    17 Min:    11 Max:    81 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary = 114296 in 00:03:42 =  515.2/s Avg:    18 Min:     8 Max:   146 Err:     0 (0.00%)
    summary +  14111 in 00:00:30 =  470.3/s Avg:    20 Min:    10 Max:   126 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary = 128407 in 00:04:12 =  509.8/s Avg:    18 Min:     8 Max:   146 Err:     0 (0.00%)
    summary +  16451 in 00:00:30 =  548.4/s Avg:    18 Min:    11 Max:    80 Err:     0 (0.00%) Active: 10 Started: 10 Finished: 0
    summary = 144858 in 00:04:42 =  513.9/s Avg:    18 Min:     8 Max:   146 Err:     0 (0.00%)
    summary +  13647 in 00:00:30 =  454.9/s Avg:    18 Min:     9 Max:    37 Err:     0 (0.00%) Active: 19 Started: 29 Finished: 10
    summary = 158505 in 00:05:12 =  508.2/s Avg:    18 Min:     8 Max:   146 Err:     0 (0.00%)
    summary +  28585 in 00:00:30 =  952.9/s Avg:    93 Min:    13 Max:   909 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 187090 in 00:05:42 =  547.3/s Avg:    29 Min:     8 Max:   909 Err:     0 (0.00%)
    summary +  31629 in 00:00:30 = 1054.3/s Avg:    94 Min:    34 Max:   240 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 218719 in 00:06:12 =  588.2/s Avg:    39 Min:     8 Max:   909 Err:     0 (0.00%)
    summary +  31443 in 00:00:30 = 1048.1/s Avg:    95 Min:    33 Max:   450 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 250162 in 00:06:42 =  622.5/s Avg:    46 Min:     8 Max:   909 Err:     0 (0.00%)
    summary +  29705 in 00:00:30 =  990.2/s Avg:   100 Min:    31 Max:  1007 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 279867 in 00:07:12 =  648.0/s Avg:    52 Min:     8 Max:  1007 Err:     0 (0.00%)
    summary +  31843 in 00:00:30 = 1061.4/s Avg:    94 Min:    35 Max:   221 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 311710 in 00:07:42 =  674.9/s Avg:    56 Min:     8 Max:  1007 Err:     0 (0.00%)
    summary +  29381 in 00:00:30 =  979.3/s Avg:   102 Min:    18 Max:  1046 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 341091 in 00:08:12 =  693.5/s Avg:    60 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31766 in 00:00:30 = 1058.9/s Avg:    94 Min:    33 Max:   295 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 372857 in 00:08:42 =  714.5/s Avg:    63 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31819 in 00:00:30 = 1060.6/s Avg:    94 Min:    32 Max:   247 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 404676 in 00:09:12 =  733.3/s Avg:    65 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31804 in 00:00:30 = 1060.1/s Avg:    94 Min:    33 Max:   245 Err:     0 (0.00%) Active: 100 Started: 110 Finished: 10
    summary = 436480 in 00:09:42 =  750.1/s Avg:    67 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  29692 in 00:00:33 =  892.5/s Avg:    95 Min:    21 Max:   232 Err:     0 (0.00%) Active: 1 Started: 111 Finished: 110
    summary = 466172 in 00:10:15 =  757.8/s Avg:    69 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  26821 in 00:00:27 = 1003.3/s Avg:   121 Min:     8 Max:   745 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 492993 in 00:10:42 =  768.1/s Avg:    72 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31785 in 00:00:30 = 1059.6/s Avg:   141 Min:    34 Max:   478 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 524778 in 00:11:12 =  781.1/s Avg:    76 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31739 in 00:00:30 = 1057.9/s Avg:   141 Min:    33 Max:   505 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 556517 in 00:11:42 =  792.9/s Avg:    80 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31257 in 00:00:30 = 1041.9/s Avg:   143 Min:    35 Max:   720 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 587774 in 00:12:12 =  803.1/s Avg:    83 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31746 in 00:00:30 = 1058.1/s Avg:   141 Min:    35 Max:   449 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 619520 in 00:12:42 =  813.2/s Avg:    86 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31620 in 00:00:30 = 1054.1/s Avg:   142 Min:    35 Max:   487 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 651140 in 00:13:12 =  822.3/s Avg:    89 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31312 in 00:00:30 = 1043.7/s Avg:   143 Min:    32 Max:   404 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 682452 in 00:13:42 =  830.4/s Avg:    91 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31187 in 00:00:30 = 1039.6/s Avg:   144 Min:    44 Max:   453 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 713639 in 00:14:12 =  837.7/s Avg:    94 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31760 in 00:00:30 = 1058.7/s Avg:   141 Min:    39 Max:   412 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 745399 in 00:14:42 =  845.3/s Avg:    96 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31613 in 00:00:30 = 1053.8/s Avg:   142 Min:    44 Max:   420 Err:     0 (0.00%) Active: 150 Started: 260 Finished: 110
    summary = 777012 in 00:15:12 =  852.1/s Avg:    97 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  25591 in 00:00:30 =  853.0/s Avg:   149 Min:    10 Max:   536 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 802603 in 00:15:42 =  852.1/s Avg:    99 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31807 in 00:00:30 = 1060.2/s Avg:   188 Min:    36 Max:   679 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 834410 in 00:16:12 =  858.6/s Avg:   103 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31887 in 00:00:30 = 1062.9/s Avg:   188 Min:    35 Max:   762 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 866297 in 00:16:42 =  864.7/s Avg:   106 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31635 in 00:00:30 = 1054.5/s Avg:   189 Min:    38 Max:   586 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 897932 in 00:17:12 =  870.2/s Avg:   109 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31422 in 00:00:30 = 1047.3/s Avg:   190 Min:    35 Max:   669 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 929354 in 00:17:42 =  875.2/s Avg:   111 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31672 in 00:00:30 = 1055.9/s Avg:   189 Min:    30 Max:   608 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 961026 in 00:18:12 =  880.2/s Avg:   114 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31598 in 00:00:30 = 1053.2/s Avg:   189 Min:    32 Max:   609 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 992624 in 00:18:42 =  884.8/s Avg:   116 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31865 in 00:00:30 = 1062.2/s Avg:   188 Min:    34 Max:   699 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 1024489 in 00:19:12 =  889.4/s Avg:   119 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31821 in 00:00:30 = 1060.7/s Avg:   188 Min:    30 Max:   777 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 1056310 in 00:19:42 =  893.8/s Avg:   121 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  31780 in 00:00:30 = 1059.3/s Avg:   188 Min:    36 Max:   707 Err:     0 (0.00%) Active: 200 Started: 460 Finished: 260
    summary = 1088090 in 00:20:12 =  897.9/s Avg:   123 Min:     8 Max:  1046 Err:     0 (0.00%)
    summary +  25390 in 00:00:30 =  846.3/s Avg:   201 Min:    10 Max:  1094 Err:     0 (0.00%) Active: 300 Started: 760 Finished: 460
    summary = 1113480 in 00:20:42 =  896.6/s Avg:   124 Min:     8 Max:  1094 Err:     0 (0.00%)
    summary +  31560 in 00:00:30 = 1052.0/s Avg:   284 Min:   121 Max:   860 Err:     0 (0.00%) Active: 300 Started: 760 Finished: 460
    summary = 1145040 in 00:21:12 =  900.3/s Avg:   129 Min:     8 Max:  1094 Err:     0 (0.00%)
    summary +  31414 in 00:00:30 = 1047.1/s Avg:   286 Min:   122 Max:   743 Err:     0 (0.00%) Active: 300 Started: 760 Finished: 460
    summary = 1176454 in 00:21:42 =  903.7/s Avg:   133 Min:     8 Max:  1094 Err:     0 (0.00%)
    summary +  30907 in 00:00:30 = 1030.2/s Avg:   285 Min:    65 Max:   782 Err:     0 (0.00%) Active: 300 Started: 760 Finished: 460
    summary = 1207361 in 00:22:12 =  906.5/s Avg:   137 Min:     8 Max:  1094 Err:     0 (0.00%)
    summary +  37188 in 00:00:30 = 1239.6/s Avg:   241 Min:     2 Max:  2008 Err:  6155 (16.55%) Active: 300 Started: 760 Finished: 460
    summary = 1244549 in 00:22:42 =  913.9/s Avg:   140 Min:     2 Max:  2008 Err:  6155 (0.49%)
    summary +  33980 in 00:00:30 = 1132.7/s Avg:   264 Min:     2 Max:   753 Err:  2373 (6.98%) Active: 300 Started: 760 Finished: 460
    summary = 1278529 in 00:23:12 =  918.6/s Avg:   143 Min:     2 Max:  2008 Err:  8528 (0.67%)
    summary +  31904 in 00:00:30 = 1054.0/s Avg:   276 Min:     2 Max:  1385 Err:  1061 (3.33%) Active: 300 Started: 760 Finished: 460
    summary = 1310433 in 00:23:42 =  921.5/s Avg:   147 Min:     2 Max:  2008 Err:  9589 (0.73%)
    summary +  30998 in 00:00:30 = 1042.7/s Avg:   295 Min:   120 Max:  1881 Err:     0 (0.00%) Active: 300 Started: 760 Finished: 460
    summary = 1341431 in 00:24:12 =  923.9/s Avg:   150 Min:     2 Max:  2008 Err:  9589 (0.71%)
    summary +  36971 in 00:00:30 = 1232.3/s Avg:   243 Min:     2 Max:  1885 Err:  5534 (14.97%) Active: 300 Started: 760 Finished: 460
    summary = 1378402 in 00:24:42 =  930.2/s Avg:   152 Min:     2 Max:  2008 Err: 15123 (1.10%)
    summary +  35667 in 00:00:30 = 1188.9/s Avg:   252 Min:     2 Max:   828 Err:  3989 (11.18%) Active: 300 Started: 760 Finished: 460
    summary = 1414069 in 00:25:12 =  935.3/s Avg:   155 Min:     2 Max:  2008 Err: 19112 (1.35%)
    summary +  26708 in 00:00:30 =  890.2/s Avg:   246 Min:     3 Max:   965 Err:   740 (2.77%) Active: 400 Started: 1160 Finished: 760
    summary = 1440777 in 00:25:42 =  934.4/s Avg:   157 Min:     2 Max:  2008 Err: 19852 (1.38%)
    summary +  34526 in 00:00:30 = 1150.9/s Avg:   347 Min:     2 Max:   884 Err:  2953 (8.55%) Active: 400 Started: 1160 Finished: 760
    summary = 1475303 in 00:26:12 =  938.6/s Avg:   161 Min:     2 Max:  2008 Err: 22805 (1.55%)
    summary +  43699 in 00:00:30 = 1456.6/s Avg:   274 Min:     2 Max:  2066 Err: 12138 (27.78%) Active: 400 Started: 1160 Finished: 760
    summary = 1519002 in 00:26:42 =  948.3/s Avg:   164 Min:     2 Max:  2066 Err: 34943 (2.30%)
    summary +  42741 in 00:00:30 = 1424.7/s Avg:   280 Min:     2 Max:   798 Err: 11360 (26.58%) Active: 400 Started: 1160 Finished: 760
    summary = 1561743 in 00:27:12 =  957.0/s Avg:   168 Min:     2 Max:  2066 Err: 46303 (2.96%)
    summary +  36867 in 00:00:30 = 1228.9/s Avg:   323 Min:     2 Max:  2284 Err:  9606 (26.06%) Active: 400 Started: 1160 Finished: 760
    summary = 1598610 in 00:27:42 =  961.9/s Avg:   171 Min:     2 Max:  2284 Err: 55909 (3.50%)
    summary +  35582 in 00:00:30 = 1186.1/s Avg:   337 Min:     2 Max:   861 Err:  4119 (11.58%) Active: 400 Started: 1160 Finished: 760
    summary = 1634192 in 00:28:12 =  965.9/s Avg:   175 Min:     2 Max:  2284 Err: 60028 (3.67%)
    summary +  43598 in 00:00:30 = 1453.3/s Avg:   275 Min:     2 Max:  2211 Err: 12075 (27.70%) Active: 400 Started: 1160 Finished: 760
    summary = 1677790 in 00:28:42 =  974.4/s Avg:   177 Min:     2 Max:  2284 Err: 72103 (4.30%)
    summary +  43611 in 00:00:30 = 1453.7/s Avg:   274 Min:     2 Max:   797 Err: 11925 (27.34%) Active: 400 Started: 1160 Finished: 760
    summary = 1721401 in 00:29:12 =  982.6/s Avg:   180 Min:     2 Max:  2284 Err: 84028 (4.88%)
    summary +  37447 in 00:00:30 = 1248.2/s Avg:   320 Min:     2 Max:  2263 Err: 10223 (27.30%) Active: 400 Started: 1160 Finished: 760
    summary = 1758848 in 00:29:42 =  987.1/s Avg:   183 Min:     2 Max:  2284 Err: 94251 (5.36%)
    summary +  36504 in 00:00:30 = 1216.8/s Avg:   328 Min:     2 Max:   982 Err:  4915 (13.46%) Active: 400 Started: 1160 Finished: 760
    summary = 1795352 in 00:30:12 =  990.9/s Avg:   186 Min:     2 Max:  2284 Err: 99166 (5.52%)
    summary +  33451 in 00:00:30 = 1115.0/s Avg:   252 Min:     2 Max:   854 Err:  7484 (22.37%) Active: 307 Started: 1467 Finished: 1160
    summary = 1828803 in 00:30:42 =  992.9/s Avg:   187 Min:     2 Max:  2284 Err: 106650 (5.83%)
    summary +  41694 in 00:00:30 = 1389.8/s Avg:   348 Min:     2 Max:  1712 Err: 10112 (24.25%) Active: 500 Started: 1660 Finished: 1160
    summary = 1870497 in 00:31:12 =  999.3/s Avg:   191 Min:     2 Max:  2284 Err: 116762 (6.24%)
    summary +  36909 in 00:00:30 = 1230.3/s Avg:   405 Min:     3 Max:  2820 Err: 10518 (28.50%) Active: 500 Started: 1660 Finished: 1160
    summary = 1907406 in 00:31:42 = 1002.9/s Avg:   195 Min:     2 Max:  2820 Err: 127280 (6.67%)
    summary +  42118 in 00:00:30 = 1403.9/s Avg:   357 Min:     2 Max:  2264 Err: 10503 (24.94%) Active: 500 Started: 1660 Finished: 1160
    summary = 1949524 in 00:32:12 = 1009.1/s Avg:   198 Min:     2 Max:  2820 Err: 137783 (7.07%)
    summary +  38779 in 00:00:30 = 1292.6/s Avg:   385 Min:     2 Max:  2260 Err: 12096 (31.19%) Active: 500 Started: 1660 Finished: 1160
    summary = 1988303 in 00:32:42 = 1013.5/s Avg:   202 Min:     2 Max:  2820 Err: 149879 (7.54%)
    summary +  43672 in 00:00:30 = 1455.7/s Avg:   343 Min:     2 Max:  1945 Err: 12053 (27.60%) Active: 500 Started: 1660 Finished: 1160
    summary = 2031975 in 00:33:12 = 1020.1/s Avg:   205 Min:     2 Max:  2820 Err: 161932 (7.97%)
    summary +  37380 in 00:00:30 = 1246.0/s Avg:   398 Min:     3 Max:  2974 Err: 10396 (27.81%) Active: 500 Started: 1660 Finished: 1160
    summary = 2069355 in 00:33:42 = 1023.5/s Avg:   208 Min:     2 Max:  2974 Err: 172328 (8.33%)
    summary +  41902 in 00:00:30 = 1396.7/s Avg:   360 Min:     2 Max:  2179 Err: 10557 (25.19%) Active: 500 Started: 1660 Finished: 1160
    summary = 2111257 in 00:34:12 = 1028.9/s Avg:   211 Min:     2 Max:  2974 Err: 182885 (8.66%)
    summary +  38501 in 00:00:30 = 1283.4/s Avg:   387 Min:     2 Max:  2378 Err: 11831 (30.73%) Active: 500 Started: 1660 Finished: 1160
    summary = 2149758 in 00:34:42 = 1032.6/s Avg:   214 Min:     2 Max:  2974 Err: 194716 (9.06%)
    summary +  44312 in 00:00:30 = 1477.1/s Avg:   339 Min:     2 Max:  2143 Err: 12682 (28.62%) Active: 500 Started: 1660 Finished: 1160
    summary = 2194070 in 00:35:12 = 1038.9/s Avg:   217 Min:     2 Max:  2974 Err: 207398 (9.45%)
    summary +  34259 in 00:00:30 = 1142.0/s Avg:   353 Min:     3 Max:  1815 Err:  8194 (23.92%) Active: 91 Started: 1751 Finished: 1660
    summary = 2228329 in 00:35:42 = 1040.4/s Avg:   219 Min:     2 Max:  2974 Err: 215592 (9.68%)
    summary +  46247 in 00:00:30 = 1541.6/s Avg:   543 Min:     2 Max:  3542 Err: 14651 (31.68%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2274576 in 00:36:12 = 1047.3/s Avg:   226 Min:     2 Max:  3542 Err: 230243 (10.12%)
    summary +  43564 in 00:00:30 = 1452.1/s Avg:   680 Min:     3 Max:  4335 Err: 17180 (39.44%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2318140 in 00:36:42 = 1052.8/s Avg:   234 Min:     2 Max:  4335 Err: 247423 (10.67%)
    summary +  47173 in 00:00:30 = 1572.4/s Avg:   627 Min:     2 Max:  3487 Err: 16020 (33.96%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2365313 in 00:37:12 = 1059.8/s Avg:   242 Min:     2 Max:  4335 Err: 263443 (11.14%)
    summary +  39308 in 00:00:30 = 1310.3/s Avg:   719 Min:     4 Max:  4907 Err: 11648 (29.63%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2404621 in 00:37:42 = 1063.1/s Avg:   250 Min:     2 Max:  4907 Err: 275091 (11.44%)
    summary +  45027 in 00:00:30 = 1500.2/s Avg:   701 Min:     8 Max:  6175 Err: 14443 (32.08%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2449648 in 00:38:12 = 1068.8/s Avg:   258 Min:     2 Max:  6175 Err: 289534 (11.82%)
    summary +  43309 in 00:00:30 = 1444.2/s Avg:   655 Min:     4 Max:  3946 Err: 14729 (34.01%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2492957 in 00:38:42 = 1073.7/s Avg:   265 Min:     2 Max:  6175 Err: 304263 (12.20%)
    summary +  46909 in 00:00:30 = 1563.6/s Avg:   644 Min:     5 Max:  4922 Err: 17650 (37.63%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2539866 in 00:39:12 = 1079.9/s Avg:   272 Min:     2 Max:  6175 Err: 321913 (12.67%)
    summary +  42572 in 00:00:30 = 1418.8/s Avg:   663 Min:     4 Max:  3914 Err: 12785 (30.03%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2582438 in 00:39:42 = 1084.2/s Avg:   279 Min:     2 Max:  6175 Err: 334698 (12.96%)
    summary +  41909 in 00:00:30 = 1397.2/s Avg:   752 Min:     3 Max:  5582 Err: 13462 (32.12%) Active: 1000 Started: 2660 Finished: 1660
    summary = 2624347 in 00:40:12 = 1088.1/s Avg:   286 Min:     2 Max:  6175 Err: 348160 (13.27%)
    summary +  45799 in 00:00:30 = 1547.9/s Avg:   646 Min:    10 Max:  3147 Err: 15120 (33.01%) Active: 0 Started: 2660 Finished: 2660
    summary = 2670146 in 00:40:41 = 1093.7/s Avg:   292 Min:     2 Max:  6175 Err: 363280 (13.61%)
    Tidying up ...    @ Wed Jan 15 11:39:29 CST 2020 (1579059569592)
    ... end of run
    D:\software\apache-jmeter-3.3\bin>
    
    

    For the analysis, see

    https://blog.51cto.com/6183574/2445386
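To find the knee in a long run like the one above without eyeballing the console, the incremental `summary +` lines can be parsed and compared. A minimal sketch that extracts sample count, throughput, average latency and error count from lines in the format the summariser printed above:

```python
import re

# Matches the incremental lines, e.g.
# "summary +   1856 in 00:00:12 =  156.4/s Avg:    14 ... Err:     0 (0.00%) ..."
SUMMARY_RE = re.compile(
    r"summary \+\s+(\d+) in [\d:]+ =\s+([\d.]+)/s Avg:\s+(\d+).*?Err:\s+(\d+)"
)

def parse_summariser(lines):
    """Yield (samples, throughput, avg_ms, errors) for each incremental summary line."""
    for line in lines:
        m = SUMMARY_RE.search(line)
        if m:
            yield int(m.group(1)), float(m.group(2)), int(m.group(3)), int(m.group(4))

log = [
    "summary +   1856 in 00:00:12 =  156.4/s Avg:    14 Min:     8 Max:    94 Err:     0 (0.00%) Active: 7 Started: 7 Finished: 0",
    "summary =  18035 in 00:00:42 =  430.8/s Avg:    17 Min:     8 Max:    94 Err:     0 (0.00%)",
    "summary +  46247 in 00:00:30 = 1541.6/s Avg:   543 Min:     2 Max:  3542 Err: 14651 (31.68%) Active: 1000 Started: 2660 Finished: 1660",
]
for rec in parse_summariser(log):
    print(rec)
```

Cumulative `summary =` lines are skipped on purpose; plotting the per-interval throughput and average latency against the active thread count makes the knee easy to spot.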

    Note: under high concurrency the JMeter log may contain errors such as:
    Non HTTP response code: java.net.BindException/Non HTTP response message: Address already in use: connect
    The cause is that Windows has exhausted its ephemeral ports (1024-5000 by default) and takes 2-4 minutes to release them again; the fix is to raise the number of ports Windows may use (the maximum is 65534).

    For the workaround, see
    http://www.bubuko.com/infodetail-3306240.html

  • Although I understand that derivatives can be used to find inflection points, the data is noisy and I am not sure that approach would let me reliably identify the "true knee" (x = 15 in this example). I wonder whether a simple approach would work, such as finding 4 data points x1 < x2 < x3 < x4 whose values keep...

    aS61Y.png

    I want to find the index of the data point at which the curve starts to rise in earnest (around x = 15 in this example).

    Although I understand that derivatives can be used to find inflection points, note that the data is noisy, and I am not sure that approach would let me reliably identify the "true knee" (x = 15 here).

    I wonder whether a simple approach would be feasible, such as:

    find 4 data points x1 < x2 < x3 < x4 whose values keep increasing

    and return the index of x1.

    Do you have any suggestions on how to accomplish this? The data from the curve above:

    index  SQMean
    0      139.428574
    1      133.298706
    2      135.961044
    3      143.688309
    4      133.298706
    5      133.181824
    6      134.896103
    7      146.415588
    8      142.324677
    9      128.168839
    10     146.116882
    11     146.766235
    12     134.675323
    13     138.610382
    14     140.558441
    15     128.662338
    16     138.480515
    17     153.610382
    18     156.207794
    19     183.428574
    20     220.324677
    21     224.324677
    22     230.415588
    23     226.766235
    24     223.935059
    25     229.922073
    26     234.389618
    27     235.493500
    28     225.727280
    29     241.623383
    30     225.805191
    31     240.896103
    32     224.090912
    33     230.467529
    34     248.285721
    35     233.779221
    36     225.532471
    37     247.337662
    38     233.000000
    39     241.740265
    40     235.688309
    41     238.662338
    42     236.636368
    43     236.025970
    44     234.818176
    45     240.974030
    46     251.350647
    47     241.857147
    48     242.623383
    49     245.714279
    50     250.701294
    51     229.415588
    52     236.909088
    53     243.779221
    54     244.532471
    55     241.493500
    56     245.480515
    57     244.324677
    58     244.025970
    59     231.987015
    60     238.740265
    61     239.532471
    62     232.363632
    63     242.454544
    64     243.831161
    65     229.688309
    66     239.493500
    67     247.324677
    68     245.324677
    69     244.662338
    70     238.610382
    71     243.324677
    72     234.584412
    73     235.181824
    74     228.974030
    75     228.246750
    76     230.519485
    77     231.441559
    78     236.324677
    79     229.935059
    80     238.701294
    81     236.441559
    82     244.350647
    83     233.714279
    84     243.753250

    2015-04-05

    pepe

    +0  Why did you tag the question 'c'? – 2015-04-05 13:56:03

    +0  Hi, could you link your earlier, similar question and explain how this one differs from it? – 2015-04-05 15:27:23

    +1  Finding 4 points is not very robust. You could try fitting a step function to the data, or clustering the values into a low and a high group, e.g. with kmeans(X,2), and then use the cluster boundary to locate the transition. In any case, smoothing the data first with smooth or wiener is probably a good idea. – 2015-04-05 15:27:25
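The clustering suggestion in the last comment can be sketched in Python: a plain two-cluster 1-D split standing in for MATLAB's kmeans(X,2). The data below is the first 25 SQMean values from the table, rounded to two decimals for brevity:

```python
def two_means_split(values, iters=20):
    """Split 1-D data into a low and a high cluster (two-means) and
    return the index of the first point that lands in the high cluster."""
    lo, hi = min(values), max(values)
    for _ in range(iters):
        # Assign each point to the nearer centroid, then recompute the centroids
        low = [v for v in values if abs(v - lo) <= abs(v - hi)]
        high = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo, hi = sum(low) / len(low), sum(high) / len(high)
    threshold = (lo + hi) / 2
    return next(i for i, v in enumerate(values) if v > threshold)

# First 25 SQMean values from the table above (rounded)
sq = [139.43, 133.30, 135.96, 143.69, 133.30, 133.18, 134.90, 146.42,
      142.32, 128.17, 146.12, 146.77, 134.68, 138.61, 140.56, 128.66,
      138.48, 153.61, 156.21, 183.43, 220.32, 224.32, 230.42, 226.77, 223.94]
print(two_means_split(sq))  # 19: first index assigned to the high level
```

On this data the cluster boundary lands at index 19, where the jump from the ~140 level to the ~225 level actually happens; as the commenter notes, smoothing first would make the split more robust on noisier series.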

  • Machinery sector 2020 annual strategy: prefer certainty, position for the turning point (机械2020年度策略:优选确定、布局拐点.pdf)
  • Banking weekly: earnings inflection confirmed, watch the annual-report rally (银行周报:业绩拐点确认,关注年报行情.pdf)
  • Reference: Using Python to detect the knee point of the COVID-19 epidemic. 1 Introduction: knee point detection (Knee point detection) finds, in a curve with a rising or falling trend, the point after which the overall trend clearly changes; such a point is called a knee (as in figure 1, where...

    Reference: Using Python to detect the knee point of the COVID-19 epidemic, showing clear progress against the outbreak

    1 Introduction

    Knee point detection finds, in a curve with a rising or falling overall trend, the point after which the trend clearly changes; such a point is called a knee (as in figure 1, where the curve climbs steeply after the point marked in blue):

    [figure]

    Figure 1

    This article introduces kneed, a third-party Python package for knee detection, and uses COVID-19 data as an example to find the mathematical knee of each indicator.

    2 Knee detection with kneed

    Many algorithms rely on the elbow rule to fix key parameters, such as the cluster count k in K-means or the search radius eps in DBSCAN.

    Picking the so-called elbow, i.e. the knee, by eye is not rigorous; a detection method with mathematical backing is needed.

    In Finding a "Kneedle" in a Haystack: Detecting Knee Points in System Behavior (which you can find in the Github repository linked at the start of this article), Jeannie Albrecht and coauthors start from the idea of curvature and, for discrete data, combine offline and online settings with algorithms such as angle-based detection, Menger curvature and EWMA into a knee-detection method.

    kneed is an implementation of the algorithm proposed in that paper.
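The core idea can be illustrated without the library: after normalising x and y to [0, 1], the knee of a concave, increasing curve is roughly the point farthest above the diagonal. This is a simplified sketch of the paper's difference-curve step, not the full Kneedle algorithm:

```python
import numpy as np

def simple_knee(x, y):
    """Return the x of the point with the largest normalised difference y_n - x_n
    (a rough knee estimate for concave, increasing curves only)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xn = (x - x.min()) / (x.max() - x.min())
    yn = (y - y.min()) / (y.max() - y.min())
    return x[np.argmax(yn - xn)]

# Saturating curve: rises fast, then flattens; the knee should sit near the bend
x = np.linspace(0, 10, 101)
y = 1 - np.exp(-x)
print(simple_knee(x, y))  # close to 2.3, where the curve bends over
```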

    After installing it with pip install kneed, its main usage is as follows:

    2.1.1 KneeLocator

    KneeLocator is kneed's knee-detection module. Its main parameters:

    x: the horizontal-axis sequence for the data under examination, e.g. time points or dates
    y: the data series under examination, i.e. the value at each x (if x is Monday, y might be that day's rainfall)
    S: float, default 1; the sensitivity. The smaller it is, the sooner knees are detected
    curve: str; whether the region above the curve is convex or concave ('concave' for concave, 'convex' for convex)
    direction: str; whether the curve's initial trend is 'increasing' or 'decreasing'
    online: bool; selects online (True) or offline (False) detection. Online mode scans the local knees along the x axis and chooses the best one; offline mode returns the first local knee it detects

    Once KneeLocator has been instantiated with these parameters and has finished computing, the attributes we mainly care about are:

    knee and elbow: the x of the best knee detected

    knee_y and elbow_y: the y of the best knee detected

    all_knees and all_elbows: the x of every local knee detected

    all_knees_y and all_elbows_y: the y of every local knee detected

    The curve and direction parameters are very important: their combination expresses the kind of knee you want to detect.

    Taking the cosine function as an example, with online set to True, we compute knees for the same stretch of the curve under the four parameter combinations curve='concave'+direction='increasing', curve='concave'+direction='decreasing', curve='convex'+direction='increasing' and curve='convex'+direction='decreasing':

    import matplotlib.pyplot as plt
    from matplotlib import style
    import numpy as np
    from kneed import KneeLocator
    
    style.use('seaborn-whitegrid')
    
    x = np.arange(1, 3, 0.01)*np.pi
    y = np.cos(x)
    
    # Compute the knee under each parameter combination
    kneedle_cov_inc = KneeLocator(x,
                          y,
                          curve='convex',
                          direction='increasing',
                          online=True)
    
    kneedle_cov_dec = KneeLocator(x,
                          y,
                          curve='convex',
                          direction='decreasing',
                          online=True)
    
    kneedle_con_inc = KneeLocator(x,
                          y,
                          curve='concave',
                          direction='increasing',
                          online=True)
    
    kneedle_con_dec = KneeLocator(x,
                          y,
                          curve='concave',
                          direction='decreasing',
                          online=True)
    
    
    fig, axe = plt.subplots(2, 2, figsize=[12, 12])
    
    axe[0, 0].plot(x, y, 'k--')
    axe[0, 0].annotate(s='Knee Point', xy=(kneedle_cov_inc.knee+0.2, kneedle_cov_inc.knee_y), fontsize=10)
    axe[0, 0].scatter(x=kneedle_cov_inc.knee, y=kneedle_cov_inc.knee_y, c='b', s=200, marker='^', alpha=1)
    axe[0, 0].set_title('convex+increasing')
    axe[0, 0].fill_between(np.arange(1, 1.5, 0.01)*np.pi, np.cos(np.arange(1, 1.5, 0.01)*np.pi), 1, alpha=0.5, color='red')
    axe[0, 0].set_ylim(-1, 1)
    
    axe[0, 1].plot(x, y, 'k--')
    axe[0, 1].annotate(s='Knee Point', xy=(kneedle_cov_dec.knee+0.2, kneedle_cov_dec.knee_y), fontsize=10)
    axe[0, 1].scatter(x=kneedle_cov_dec.knee, y=kneedle_cov_dec.knee_y, c='b', s=200, marker='^', alpha=1)
    axe[0, 1].fill_between(np.arange(2.5, 3, 0.01)*np.pi, np.cos(np.arange(2.5, 3, 0.01)*np.pi), 1, alpha=0.5, color='red')
    axe[0, 1].set_title('convex+decreasing')
    axe[0, 1].set_ylim(-1, 1)
    
    axe[1, 0].plot(x, y, 'k--')
    axe[1, 0].annotate(s='Knee Point', xy=(kneedle_con_inc.knee+0.2, kneedle_con_inc.knee_y), fontsize=10)
    axe[1, 0].scatter(x=kneedle_con_inc.knee, y=kneedle_con_inc.knee_y, c='b', s=200, marker='^', alpha=1)
    axe[1, 0].fill_between(np.arange(1.5, 2, 0.01)*np.pi, np.cos(np.arange(1.5, 2, 0.01)*np.pi), 1, alpha=0.5, color='red')
    axe[1, 0].set_title('concave+increasing')
    axe[1, 0].set_ylim(-1, 1)
    
    axe[1, 1].plot(x, y, 'k--')
    axe[1, 1].annotate(s='Knee Point', xy=(kneedle_con_dec.knee+0.2, kneedle_con_dec.knee_y), fontsize=10)
    axe[1, 1].scatter(x=kneedle_con_dec.knee, y=kneedle_con_dec.knee_y, c='b', s=200, marker='^', alpha=1)
    axe[1, 1].fill_between(np.arange(2, 2.5, 0.01)*np.pi, np.cos(np.arange(2, 2.5, 0.01)*np.pi), 1, alpha=0.5, color='red')
    axe[1, 1].set_title('concave+decreasing')
    axe[1, 1].set_ylim(-1, 1)
    
    # Save the figure
    plt.savefig('图2.png', dpi=300)

    The red areas mark the search region matching each parameter combination, and the blue triangles are the best knees kneed detects in each case:

    [figure]

    Figure 2

    Next we widen the range of x for the cosine function and plot every local knee extracted:

    x = np.arange(0, 6, 0.01)*np.pi
    y = np.cos(x)
    
    # Compute the knees under the convex+increasing parameter combination
    kneedle = KneeLocator(x,
                          y,
                          curve='convex',
                          direction='increasing',
                          online=True)
    
    fig, axe = plt.subplots(figsize=[8, 4])
    
    axe.plot(x, y, 'k--')
    axe.annotate(s='Knee Point', xy=(kneedle.knee+0.2, kneedle.knee_y), fontsize=10)
    axe.set_title('convex+increasing')
    axe.fill_between(np.arange(1, 1.5, 0.01)*np.pi, np.cos(np.arange(1, 1.5, 0.01)*np.pi), 1, alpha=0.5, color='red')
    axe.fill_between(np.arange(3, 3.5, 0.01)*np.pi, np.cos(np.arange(3, 3.5, 0.01)*np.pi), 1, alpha=0.5, color='red')
    axe.fill_between(np.arange(5, 5.5, 0.01)*np.pi, np.cos(np.arange(5, 5.5, 0.01)*np.pi), 1, alpha=0.5, color='red')
    axe.scatter(x=list(kneedle.all_knees), y=np.cos(list(kneedle.all_knees)), c='b', s=200, marker='^', alpha=1)
    axe.set_ylim(-1, 1)
    
    # Save the figure
    plt.savefig('图3.png', dpi=300)

    The result is shown in Figure 3. Note that when kneed detects knee points, any knee that falls on the leftmost or rightmost point of the curve is invalid.
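The endpoint rule has a simple geometric reading. Below is a minimal, dependency-free sketch of the intuition behind kneed: join the two endpoints with a chord and take the interior point farthest from it. `find_knee` is an illustrative name, not kneed's API, and the real Kneedle algorithm adds normalization and sensitivity handling on top of this idea:

```python
def find_knee(xs, ys):
    """Return the x of the interior point farthest from the endpoint chord."""
    x0, y0, x1, y1 = xs[0], ys[0], xs[-1], ys[-1]
    best_i, best_d = None, -1.0
    # Endpoints are skipped: a knee on the far left or right is meaningless
    for i in range(1, len(xs) - 1):
        chord_y = y0 + (y1 - y0) * (xs[i] - x0) / (x1 - x0)
        d = abs(ys[i] - chord_y)
        if d > best_d:
            best_i, best_d = i, d
    return xs[best_i]

xs = list(range(1, 11))
ys = [1 / v for v in xs]      # a convex, decreasing curve
knee = find_knee(xs, ys)      # -> 3
```

On y = 1/x sampled at x = 1..10 this picks x = 3, close to the analytical farthest-from-chord point at x = √10 ≈ 3.16.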

    2.2 Exploring the COVID-19 epidemic data

    Next we apply kneed, introduced above, to COVID-19 data to examine whether the various indicators have already reached a knee point in the mathematical sense.

    The raw data comes from https://github.com/BlankerL/DXY-COVID-19-Data , a GitHub repository that uses 丁香园 (DXY) as its data source and syncs epidemic data in real time down to city granularity.

    You can find the data used below under this article's path in my GitHub repository mentioned at the start of the article; it was last updated at 2020-02-18 22:55:07. Now let's begin the analysis.

    First we read in DXYArea.csv and inspect it; to ease later processing, we parse the updateTime column as datetimes while reading:

    import pandas as pd
    
    raw = pd.read_csv('DXYArea.csv', parse_dates=['updateTime'])
    raw.info()


    View its first row:

    raw.loc[0,:]


    As you can see, the raw data contains province and city information, along with each province's and city's latest cumulative confirmed, suspected, cured, and death counts.

    Our goal is to test whether, nationwide, the curves of cumulative confirmed cases, daily new confirmed cases, cure rate, and death rate over time (unit: days) have already shown a knee point in the mathematical sense (given the complexity and peculiarity of Wuhan's data, the analysis below covers all regions except the city of Wuhan).

    First, for every city we take its last update of each day as that day's official record:

    # Extract year, month, and day from updateTime into new columns
    raw['year'], raw['month'], raw['day'] = list(zip(*raw['updateTime'].apply(lambda d: (d.year, d.month, d.day))))
    
    # Keep each city's last update of each day
    temp = raw.sort_values(['provinceName', 'cityName', 'year', 'month', 'day', 'updateTime'],
                    ascending=False,
                    ignore_index=True).groupby(['provinceName', 'cityName', 'year', 'month', 'day']) \
                                      .agg({'province_confirmedCount': 'first',
                                            'province_curedCount': 'first',
                                            'province_deadCount': 'first',
                                            'city_confirmedCount': 'first',
                                            'city_curedCount': 'first',
                                            'city_deadCount': 'first'}) \
                                      .reset_index(drop=False)
    
    # Show the first 5 rows
    temp.head()

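As a quick sanity check, the same sort-then-take-first pattern on a toy frame (column names here are simplified stand-ins for the real ones, and the values are made up):

```python
import pandas as pd

toy = pd.DataFrame({
    'city':  ['A', 'A', 'A', 'B'],
    'day':   [1, 1, 2, 1],
    'ts':    [9, 17, 8, 12],     # pseudo update timestamps within a day
    'count': [10, 12, 13, 7],
})

# Sort descending so the latest timestamp comes first in each group,
# then 'first' picks that latest record per (city, day)
latest = (toy.sort_values(['city', 'day', 'ts'], ascending=False)
             .groupby(['city', 'day'])
             .agg({'count': 'first'})
             .reset_index())
print(latest['count'].tolist())   # [12, 13, 7]
```

City A's two updates on day 1 collapse to the one with the larger timestamp, which is exactly what the real pipeline does per city and per day.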

    With the data prepared above, we can now analyze the knee points of the relevant nationwide (excluding Wuhan) indicators.

    First we compute the indicators of interest up to today (2020-02-18) and do a basic visualization:

    # Compute the time series for each indicator
    # Nationwide (excluding Wuhan) cumulative confirmed count
    nationwide_confirmed_count = temp[temp['cityName'] != '武汉'].groupby(['year', 'month', 'day']) \
                                                                 .agg({'city_confirmedCount': 'sum'}) \
                                                                 .reset_index(drop=False)
    
    # Nationwide (excluding Wuhan) cumulative cured count
    nationwide_cured_count = temp[temp['cityName'] != '武汉'].groupby(['year', 'month', 'day']) \
                                                             .agg({'city_curedCount': 'sum'}) \
                                                             .reset_index(drop=False)
    
    # Nationwide (excluding Wuhan) cumulative death count
    nationwide_dead_count = temp[temp['cityName'] != '武汉'].groupby(['year', 'month', 'day']) \
                                                            .agg({'city_deadCount': 'sum'}) \
                                                            .reset_index(drop=False)
    
    # Nationwide (excluding Wuhan) daily new confirmed count,
    # i.e. the first difference of nationwide_confirmed_count
    nationwide_confirmed_inc_count = nationwide_confirmed_count['city_confirmedCount'].diff()[1:]
    
    # Nationwide (excluding Wuhan) cure rate
    nationwide_cured_ratio = nationwide_cured_count['city_curedCount'] / nationwide_confirmed_count['city_confirmedCount']
    
    # Nationwide (excluding Wuhan) death rate
    nationwide_died_ratio = nationwide_dead_count['city_deadCount'] / nationwide_confirmed_count['city_confirmedCount']
    
    # Plot
    
    # Use a font with Chinese glyphs so the Chinese titles render correctly
    plt.rcParams['font.sans-serif'] = ['KaiTi']
    plt.rcParams['axes.unicode_minus'] = False
    
    fig, axes = plt.subplots(3, 2, figsize=[12, 18])
    
    axes[0, 0].plot(nationwide_confirmed_count.index, nationwide_confirmed_count['city_confirmedCount'], 'k--')
    axes[0, 0].set_title('累计确诊人数', fontsize=20)
    axes[0, 0].set_xticks(nationwide_confirmed_count.index)
    axes[0, 0].set_xticklabels([f"{nationwide_confirmed_count.loc[i, 'month']}-{nationwide_confirmed_count.loc[i, 'day']}"
                                for i in nationwide_confirmed_count.index], rotation=60)
    
    axes[0, 1].plot(nationwide_cured_count.index, nationwide_cured_count['city_curedCount'], 'k--')
    axes[0, 1].set_title('累计治愈人数', fontsize=20)
    axes[0, 1].set_xticks(nationwide_cured_count.index)
    axes[0, 1].set_xticklabels([f"{nationwide_cured_count.loc[i, 'month']}-{nationwide_cured_count.loc[i, 'day']}"
                                for i in nationwide_cured_count.index], rotation=60)
    
    axes[1, 0].plot(nationwide_dead_count.index, nationwide_dead_count['city_deadCount'], 'k--')
    axes[1, 0].set_title('累计死亡人数', fontsize=20)
    axes[1, 0].set_xticks(nationwide_dead_count.index)
    axes[1, 0].set_xticklabels([f"{nationwide_dead_count.loc[i, 'month']}-{nationwide_dead_count.loc[i, 'day']}"
                                for i in nationwide_dead_count.index], rotation=60)
    
    axes[1, 1].plot(nationwide_confirmed_inc_count.index, nationwide_confirmed_inc_count, 'k--')
    axes[1, 1].set_title('每日新增确诊人数', fontsize=20)
    axes[1, 1].set_xticks(nationwide_confirmed_inc_count.index)
    axes[1, 1].set_xticklabels([f"{nationwide_confirmed_count.loc[i, 'month']}-{nationwide_confirmed_count.loc[i, 'day']}"
                                for i in nationwide_confirmed_inc_count.index], rotation=60)
    
    axes[2, 0].plot(nationwide_cured_ratio.index, nationwide_cured_ratio, 'k--')
    axes[2, 0].set_title('治愈率', fontsize=20)
    axes[2, 0].set_xticks(nationwide_cured_ratio.index)
    axes[2, 0].set_xticklabels([f"{nationwide_cured_count.loc[i, 'month']}-{nationwide_cured_count.loc[i, 'day']}"
                                for i in nationwide_cured_ratio.index], rotation=60)
    
    axes[2, 1].plot(nationwide_died_ratio.index, nationwide_died_ratio, 'k--')
    axes[2, 1].set_title('死亡率', fontsize=20)
    axes[2, 1].set_xticks(nationwide_died_ratio.index)
    axes[2, 1].set_xticklabels([f"{nationwide_dead_count.loc[i, 'month']}-{nationwide_dead_count.loc[i, 'day']}"
                                for i in nationwide_died_ratio.index], rotation=60)
    
    fig.suptitle('全国范围(除武汉外)', fontsize=30)
    
    # Export the figure
    plt.savefig('图7.png', dpi=300)


    Now it's time to detect the knee points.

    To simplify the code, we first write a helper function that searches all combinations of KneeLocator's curve and direction parameters and returns every valid knee with its trend type, or None if none is found:

    def knee_point_search(x, y):
        
        # Convert to list to support negative indexing
        x, y = x.tolist(), y.tolist()
        output_knees = []
        for curve in ['convex', 'concave']:
            for direction in ['increasing', 'decreasing']:
                model = KneeLocator(x=x, y=y, curve=curve, direction=direction, online=False)
                # Knees on the leftmost or rightmost point are invalid
                if model.knee != x[0] and model.knee != x[-1]:
                    output_knees.append((model.knee, model.knee_y, curve, direction))
        
        if len(output_knees) != 0:
            print('Knee point(s) found!')
            return output_knees
        else:
            print('No knee point found!')
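Note that KneeLocator makes the caller choose curve and direction up front, which is why the function above tries all four combinations. When eyeballing a chart isn't an option, a rough heuristic can suggest them; the sketch below uses endpoint and second-difference signs and is not part of kneed's API:

```python
def classify_shape(ys):
    """Guess (curve, direction) for KneeLocator from raw samples."""
    # Direction from the endpoints of the series
    direction = 'increasing' if ys[-1] >= ys[0] else 'decreasing'
    # Mean sign of the discrete second difference: positive -> convex
    second = [ys[i + 1] - 2 * ys[i] + ys[i - 1] for i in range(1, len(ys) - 1)]
    curve = 'convex' if sum(second) >= 0 else 'concave'
    return curve, direction

# A parabola bends upward as it rises: convex + increasing
print(classify_shape([x * x for x in range(10)]))      # ('convex', 'increasing')
# A square-root curve flattens out as it rises: concave + increasing
print(classify_shape([x ** 0.5 for x in range(10)]))   # ('concave', 'increasing')
```

This only works for curves with a single dominant bend; a full-period cosine, as in Section 1, mixes all four shapes and needs the exhaustive search.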

    Next we run the knee search on each indicator.

    First, the cumulative confirmed count: the search finds no valid knee point.

    Next, the cumulative cured count, where a valid knee point is found:

    Mark the knee point on the curve:
    knee_info = knee_point_search(x=nationwide_cured_count.index,
                                  y=nationwide_cured_count['city_curedCount'])
    fig, axe = plt.subplots(figsize=[8, 6])
    axe.plot(nationwide_cured_count.index, nationwide_cured_count['city_curedCount'], 'k--')
    axe.set_title('累计治愈人数', fontsize=20)
    axe.set_xticks(nationwide_cured_count.index)
    axe.set_xticklabels([f"{nationwide_cured_count.loc[i, 'month']}-{nationwide_cured_count.loc[i, 'day']}"
                                for i in nationwide_cured_count.index], rotation=60)
    
    for point in knee_info:
        axe.scatter(x=point[0], y=point[1], c='b', s=200, marker='^')
        axe.annotate(f'{point[2]} {point[3]}', xy=(point[0]+1, point[1]), fontsize=14)
    
    # Export the figure
    plt.savefig('图10.png', dpi=300)


    Given its convex+increasing character, this shows that starting from February 5 the cumulative cured count began rising at a markedly accelerating pace.

    Now for the cumulative death count:

    Figure 11

    Plot its knee point:

    Figure 12

    Likewise, starting from February 5, the cumulative death count, in step with the cumulative cured count, shows a fairly clear accelerating upward trend.

    For the daily new confirmed count, two knee points are found. Although this indicator fluctuates noticeably, its parameter information still lets us infer that growth slows at the first knee and decline accelerates at the second, showing that the regions outside Wuhan have already achieved visible results in fighting the epidemic:

    Figure 13
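This reading also follows from a basic identity: the daily new count is the first difference of the cumulative count, so a knee in the cumulative S-curve shows up as a peak in the daily increments. A toy series (made-up numbers, for illustration only) makes the link concrete:

```python
# Toy cumulative confirmed counts following an S-shaped curve
cumulative = [1, 3, 8, 18, 33, 48, 58, 63, 65, 66]

# Daily new cases are the first difference of the cumulative series
daily_new = [b - a for a, b in zip(cumulative, cumulative[1:])]

# The inflection of the S-curve is where the daily increments peak
peak_day = max(range(len(daily_new)), key=daily_new.__getitem__)
print(daily_new)   # [2, 5, 10, 15, 15, 10, 5, 2, 1]
print(peak_day)    # 3
```

Before the peak the cumulative curve is convex (growth accelerating); after it, concave (growth slowing), which is why the two knees bracket the peak of the daily-new chart.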

    The cure rate and death rate also show knee points. The cure rate's knee marks accelerated growth: with the hard work of medical staff, better treatments have sped up the rise in the cure rate:

    Figure 14

    Although the death rate's latest knee also marks accelerated growth, comparing its range of change against the cure rate's shows that the death rate's absolute growth is very slight:

    Figure 15

    From the analysis above, we can see that in this special battle against COVID-19, the regions outside Wuhan have so far made stage-by-stage progress, but greater effort is still needed to consolidate these hard-won gains.
