  • Twig Prune — adds a Twig filter to CraftCMS templates for "pruning" an entry's fields. Installation: move the prune directory into the craft/plugins directory, then go to Settings > Plugins in the Craft control panel and enable the prune plugin. Usage: the main reason for this is to control the output...
  • Npm prune

    2020-12-29 04:34:22
    Prune unused dependencies from node_modules on push. Currently they accumulate. (This question comes from the open-source project heroku/heroku-buildpack-nodejs.)
  • 85% bn prune + 78% conv prune: 0.284, 3.7 M. Replacing the backbone — model size / mAPval 0.5:0.95 / mAPval 0.5: yolov5s 640 0.357 0.558; mobilenetv3small 0.75 640 TD TD. Tuning tips: prune the shallow layers as little as possible; after training completes, start from the gamma...
  • prune-crx插件

    2021-04-04 03:10:31
    Language: English. An extension that helps you prune your tab garden. prune helps you manage your tabs by preventing you from opening duplicates and clearing out all the tabs you haven't looked at in a while.
  • The 3 states of a remote branch — the branch dev actually exists on the remote; the remote snapshot in the local repository (.git); the local branch associated with the remote branch... Prune all unreachable objects from the object database — "unreachable objects" are the unused hash files under .git\objects that...

    The 3 states of a remote branch

    • The branch dev actually exists in the remote repository
    • The remote snapshot inside the local repository (.git)
    • The local branch associated with the remote branch

    git prune
    https://git-scm.com/docs/git-prune

    • Prune all unreachable objects from the object database
      "unreachable objects" are the unused hash files under .git\objects
    song@test MINGW64 /d/Git/Temp (master)
    $ git prune -n
    0baff3f3df27aacdd2edb6f83a5c47dd3b7ca05b tree
    
    song@test MINGW64 /d/Git/Temp (master)
    $ git prune
    
    song@test MINGW64 /d/Git/Temp (master)
    $ git prune -n
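The session above can be reproduced end to end. The sketch below (just an illustration — it assumes git is on PATH and uses a throwaway directory) manufactures an unreachable blob, shows it in the dry run, and prunes it:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"; git init -q
echo hi > f.txt
sha=$(git hash-object -w f.txt)    # store a blob that no ref points at
git prune -n | grep "$sha"         # dry run: the blob is listed as unreachable
git prune                          # actually delete all unreachable objects
[ -z "$(git prune -n)" ]           # nothing left to prune
```

Note that `git prune -n` only lists what would be removed; nothing is deleted until the plain `git prune` runs.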

    git remote prune origin
    https://git-scm.com/docs/git-remote

    • Deletes all stale remote-tracking branches under <name>.
    • This cleans up "state 2" above: stale remote-tracking branches that still exist in the local repository although the corresponding branch has been deleted on the remote
    song@test MINGW64 /d/Git/Temp (master)
    $ git branch
    * master
    
    song@test MINGW64 /d/Git/Temp (master)
    $ git checkout -b dev
    Switched to a new branch 'dev'
    
    song@test MINGW64 /d/Git/Temp (dev)
    $ git push origin dev
    Total 0 (delta 0), reused 0 (delta 0)
    To github.com:Song2017/Temp.git
     * [new branch]      dev -> dev
    
    song@test MINGW64 /d/Git/Temp (dev)
    $ git branch -a -v
    * dev                   3902953  add readme.md
      master                3902953  add readme.md
      remotes/origin/HEAD   -> origin/master
      remotes/origin/dev    3902953  add readme.md
      remotes/origin/master 3902953  add readme.md
    • Next, delete the dev branch on GitHub; at this point the snapshot in the local repository still contains the dev branch
    • git remote prune syncs with the remote once and removes the dev tracking branch from the local repository, but the dev branch in the local working tree is not deleted
    song@test MINGW64 /d/Git/Temp (master)
    $ git remote prune origin
    Pruning origin
    URL: git@github.com:Song2017/Temp.git
     * [pruned] origin/dev
    
    song@test MINGW64 /d/Git/Temp (master)
    $ git branch -a -v
      dev                   3902953  add readme.md
    * master                3902953  add readme.md
      remotes/origin/HEAD   -> origin/master
      remotes/origin/master 3902953  add readme.md

    git fetch --prune
    https://git-scm.com/docs/git-fetch

    • Before fetching, remove any remote-tracking references that no longer exist on the remote
    • Same effect as git remote prune

    Deleting a local branch

    git branch -d/-D dev   (-D force-deletes)
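The whole lifecycle above can be reproduced locally, with a throwaway bare repository standing in for GitHub (a sketch only; the paths and the demo user identity are made up):

```shell
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/origin.git"
git clone -q "$tmp/origin.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
git -c user.name=demo -c user.email=demo@example.com commit --allow-empty -m init -q
git push -q origin HEAD
git checkout -q -b dev
git push -q origin dev
# delete dev directly on the "remote" (simulates deleting it on GitHub);
# our clone's remote-tracking ref origin/dev is now stale
git -C "$tmp/origin.git" branch -D dev
git branch -r                # origin/dev is still listed (stale)
git remote prune origin      # drops the stale tracking ref
git branch -r                # origin/dev is gone; local branch dev remains
```

The deletion is done directly on the bare repository rather than via `git push --delete`, because pushing the deletion yourself would also remove your own origin/dev tracking ref and there would be nothing left to prune.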
  • Snap revision prune

    2020-12-05 08:47:26
    This prune function will remove all the cached revision files except the latest revision under snapcraft's xdg cache (XDG_CACHE_HOME/snapcraft/{project_name}/revisions). So far I'm ...
  • ...and perhaps prompt before running prune. (This question comes from the open-source project microsoft/vscode-docker.)
  • channel-prune-源码

    2021-05-16 07:23:53
    This is a PyTorch implementation of … . It can achieve a 3x reduction in model size, and the accuracy... (see prune_InceptionV3_example.py and prune_Resnet50_example.py). To prune a new model, you need to define a forward function under FilterPruner according to the model's architecture, and
  • prune resnet18

    2019-12-07 12:36:49

    Recently I have been studying pruning for model compression, but I did not really understand how to implement it, so I studied someone else's code and added my own comments and understanding along the way.

    This experiment prunes a resnet18 trained on cifar-10.
    The code comes from
    https://github.com/kentaroy47/Deep-Compression.Pytorch

    Below is the prune module

    # -*- coding: utf-8 -*-
    
    '''Deep Compression with PyTorch.'''
    from __future__ import print_function
    
    import torch
    import torch.nn as nn
    import torch.optim as optim
    import torch.nn.functional as F
    import torch.backends.cudnn as cudnn
    
    import torchvision
    import torchvision.transforms as transforms
    
    import os
    import argparse
    
    from models import *
    from utils import progress_bar
    
    import numpy as np
    
    parser = argparse.ArgumentParser(description='PyTorch CIFAR10 Pruning')
    parser.add_argument('--loadfile', '-l', default="checkpoint/res18.t7",dest='loadfile')
    parser.add_argument('--prune', '-p', default=0.5, dest='prune', help='Parameters to be pruned')
    parser.add_argument('--lr', default=0.01, type=float, help='learning rate')
    parser.add_argument('--net', default='res18')
    args = parser.parse_args()
    
    prune = float(args.prune)  # e.g. prune = 0.5 prunes 50% of the weights
    
    device = 'cuda' if torch.cuda.is_available() else 'cpu'
    best_acc = 0  # best test accuracy
    start_epoch = 0  # start from epoch 0 or last checkpoint epoch
    
    # Data
    print('==> Preparing data..')
    transform_train = transforms.Compose([
        transforms.RandomCrop(32, padding=4),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
    ])
    transform_test = transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
    ])
    trainset = torchvision.datasets.CIFAR10(root='./data', train=True, download=True, transform=transform_train)
    trainloader = torch.utils.data.DataLoader(trainset, batch_size=128, shuffle=True, num_workers=0)
    
    testset = torchvision.datasets.CIFAR10(root='./data', train=False, download=True, transform=transform_test)
    testloader = torch.utils.data.DataLoader(testset, batch_size=100, shuffle=False, num_workers=0)
    
    # Model
    print('==> Building model..')
    if args.net=='res18':
        net = ResNet18()
    elif args.net=='vgg':
        net = VGG('VGG19')
        
    net = net.to(device)
    if device == 'cuda':
        net = torch.nn.DataParallel(net)
        cudnn.benchmark = True
    
    
    # Load weights from checkpoint.
    print('==> Resuming from checkpoint..')
    assert os.path.isfile(args.loadfile), 'Error: no checkpoint directory found!'
    checkpoint = torch.load(args.loadfile)   #dict
    net.load_state_dict(checkpoint['net'])
    #dict_keys(['acc', 'epoch', 'net', 'address', 'mask']), len(checkpoint) = 5
    print(checkpoint.values())
    classes = ('plane', 'car', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck')
    
    def prune_weights(torchweights):
        signed = torchweights.cpu().numpy()
        weights = np.abs(signed)
        weightshape = weights.shape  # tuple with the length of each dimension
        # rank of each position when the magnitudes are sorted ascending
        # (argsort().argsort() converts sort indices into per-position ranks)
        rankedweights = weights.reshape(weights.size).argsort().argsort()

        num = weights.size
        prune_num = int(np.round(num*prune))
        print('prune_num:', prune_num)
        count = 0
        masks = np.zeros_like(rankedweights, dtype=np.float32)
        for n, rankedweight in enumerate(rankedweights):  # n: flat position, rankedweight: its magnitude rank
            if rankedweight > prune_num:
                masks[n] = 1   # keep this weight
            else:
                count += 1     # this weight gets pruned (zeroed)

        print("total weights:", num)
        print("weights pruned:", count)

        # reshape the 0/1 mask back to the weight shape; multiplying zeroes the
        # pruned positions while preserving the sign of the kept weights
        masks = masks.reshape(weightshape)
        weights = masks*signed

        return torch.from_numpy(weights).to(device), masks
    '''for example
    pruning layer: module.layer1.0.conv2.weight
    prune_num: 18432
    n, rankedweight: 0          14054
    n, rankedweight: 1          1747
    n, rankedweight: 2          31774
    n, rankedweight: 3          16140
    n, rankedweight: 4          1811
    n, rankedweight: 5          35556
    n, rankedweight: 6          16134
    n, rankedweight: 7          1784
    n, rankedweight: 8          7769
    n, rankedweight: 9          1896
    n, rankedweight: 10          16356
    n, rankedweight: 11          2028
    n, rankedweight: 12          1808
    n, rankedweight: 13          30484
    n, rankedweight: 14          30050
    masks: [0 0 1 ... 1 0 1]
    total weights: 36864
    weights pruned: 18433
    ###############################
    64
    '''
        
    # prune weights
    # The pruned weight location is saved in the addressbook and maskbook.
    # These will be used during training to keep the weights zero.
    # prune weights
    # The pruned weight locations are saved in the addressbook and maskbook.
    # These will be used during training to keep the pruned weights at zero.
    addressbook = []
    maskbook = []
    # .items() yields the state dict's (parameter name, tensor) pairs
    for k, v in net.state_dict().items():
        if "conv2" in k:
            addressbook.append(k)
            # k looks like module.layer*.*.conv2.weight
            print("pruning layer:", k)
            weights, masks = prune_weights(v)
            maskbook.append(masks)
            checkpoint['net'][k] = weights

    checkpoint['address'] = addressbook
    checkpoint['mask'] = maskbook
    net.load_state_dict(checkpoint['net'])
    
    # Training
    
    criterion = nn.CrossEntropyLoss()
    optimizer = optim.Adam(net.parameters(), lr=args.lr, weight_decay=5e-4)
    
    def train(epoch):
        print('\nEpoch: %d' % epoch)
        net.train()
        train_loss = 0
        correct = 0
        total = 0
        for batch_idx, (inputs, targets) in enumerate(trainloader):
            inputs, targets = inputs.to(device), targets.to(device)
            optimizer.zero_grad()
            outputs = net(inputs)
            loss = criterion(outputs, targets)
            loss.backward()
            
            optimizer.step()

            # re-apply the masks after the update step so that pruned weights
            # stay at zero (the gradient update would otherwise make them
            # non-zero again)
            checkpoint['net'] = net.state_dict()
            for address, mask in zip(addressbook, maskbook):
                checkpoint['net'][address] = torch.from_numpy(
                    checkpoint['net'][address].cpu().numpy() * mask)
            net.load_state_dict(checkpoint['net'])
    
            train_loss += loss.item()
            _, predicted = outputs.max(1)
            total += targets.size(0)
            correct += predicted.eq(targets).sum().item()
    
            progress_bar(batch_idx, len(trainloader), 'Loss: %.3f | Acc: %.3f%% (%d/%d)'
                % (train_loss/(batch_idx+1), 100.*correct/total, correct, total))
    
    def test(epoch):
        global best_acc
        net.eval()
        test_loss = 0
        correct = 0
        total = 0
        with torch.no_grad():
            for batch_idx, (inputs, targets) in enumerate(testloader):
                inputs, targets = inputs.to(device), targets.to(device)
                outputs = net(inputs)
                loss = criterion(outputs, targets)
    
                test_loss += loss.item()
                _, predicted = outputs.max(1)
                total += targets.size(0)
                correct += predicted.eq(targets).sum().item()
    
                progress_bar(batch_idx, len(testloader), 'Loss: %.3f | Acc: %.3f%% (%d/%d)'
                    % (test_loss/(batch_idx+1), 100.*correct/total, correct, total))
    
        # Save checkpoint.
        acc = 100.*correct/total
        if acc > best_acc:
            print('Saving..')
            state = {
                'net': net.state_dict(),
                'acc': acc,
                'epoch': epoch,
            }
            if not os.path.isdir('checkpoint'):
                os.mkdir('checkpoint')
            torch.save(state, './checkpoint/pruned-'+args.net+'-ckpt.t7')
            best_acc = acc
    
    
    if __name__ == '__main__':
        for epoch in range(start_epoch, start_epoch+20):
            train(epoch)
            test(epoch)
            with open("prune-results-"+str(prune)+'-'+str(args.net)+".txt", "a") as f: 
                f.write(str(epoch)+"\n")
                f.write(str(best_acc)+"\n")
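The ranking-and-masking idea in prune_weights above can be distilled into a few lines of NumPy. This is a standalone sketch with made-up values, independent of the repo's code:

```python
import numpy as np

def magnitude_prune(weights, ratio):
    """Zero out the fraction `ratio` of weights with the smallest magnitude."""
    flat = np.abs(weights).ravel()
    k = int(round(flat.size * ratio))        # how many weights to prune
    # ranks[i] = rank of flat weight i when sorted ascending by magnitude
    ranks = flat.argsort().argsort()
    mask = (ranks >= k).astype(weights.dtype).reshape(weights.shape)
    return weights * mask, mask

w = np.array([[0.5, -0.1], [0.05, -2.0]], dtype=np.float32)
pruned, mask = magnitude_prune(w, 0.5)
print(pruned)  # the two smallest-magnitude weights (0.05 and -0.1) become 0
```

The double argsort is the key step: a single argsort gives sort *indices*, while argsort of the argsort gives each position's *rank*, which is what the threshold comparison needs.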
    

    There is still a lot I do not fully understand; I am recording my learning process here. Onwards and upwards!

  • npm-prune

    2020-04-15 18:25:27

    Remove extraneous packages

    Synopsis

    npm prune [[<@scope>/]<pkg>...] [--production] [--dry-run] [--json]

    Description

    This command removes "extraneous" packages. If a package name is provided, only packages matching that name are removed.

    Extraneous packages are packages that are not listed in the parent package's dependencies list.

    If the --production flag is specified, or the NODE_ENV environment variable is set to production, this command will remove the packages listed in devDependencies. Setting --no-production will negate NODE_ENV being set to production.

    If the --dry-run flag is used then no changes will actually be made.

    If the --json flag is used then the changes npm prune made (or would have made with --dry-run) are printed as a JSON object.

    In normal operation with package-locks enabled, extraneous modules are pruned automatically when modules are installed and you’ll only need this command with the --production flag.

    If you’ve disabled package-locks then extraneous modules will not be removed and it’s up to you to run npm prune from time-to-time to remove them.
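The "extraneous" rule above can be modelled as a simple set difference. This is a toy illustration only — real npm also walks transitive dependencies and consults the package-lock:

```python
def extraneous(installed, dependencies, dev_dependencies, production=False):
    """Which installed packages would `npm prune` remove? (toy model)"""
    keep = set(dependencies)
    if not production:            # with --production, devDependencies go too
        keep |= set(dev_dependencies)
    return sorted(set(installed) - keep)

installed = ["express", "lodash", "mocha", "left-pad"]
deps = ["express", "lodash"]       # from "dependencies" in package.json
dev_deps = ["mocha"]               # from "devDependencies"

print(extraneous(installed, deps, dev_deps))                   # ['left-pad']
print(extraneous(installed, deps, dev_deps, production=True))  # ['left-pad', 'mocha']
```

With `production=True`, mocha becomes prunable too, mirroring how `npm prune --production` drops devDependencies.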

  • prune_mx_face-源码

    2021-04-30 16:36:01
    Main code — fine-tuning and pruning: finetune.py; pruning filters: prune_once.py; capturing gradients: gradcam.py; the VGG16 and sphereface models: models.py
  • The Linux find command supports many ways of searching — what happens when you add -path -prune? Below is an introduction to the usage of -path -prune with find. Suppose we search for files under the current directory, which contains many files and directories (nested several levels)...

    The Linux find command supports many ways of searching — so what happens when you add the -path and -prune options? Here is an introduction to using -path -prune with find.


    Suppose we are searching for files under the current directory, and the current directory contains many files and directories (nested several levels deep), including the directories dir0, dir1, dir2, ... and the subdirectories dir00, dir01, ..., dir10, dir11, ..., and so on.

    1. Find all .txt files under the current directory

    find ./ -name '*.txt'

    2. Find .txt files under the dir0 directory and its subdirectories

    find ./ -path './dir0*' -name '*.txt'

    3. Find .txt files under dir0's subdirectory dir00 and its subdirectories

    find ./ -path '*dir00*' -name '*.txt'

    4. Find .txt files everywhere except dir0 and its subdirectories

    find ./ -path './dir0*' -a -prune -o -name '*.txt' -print

    Note: -a is short for "and", the logical AND operator (&&); -o is short for "or", the logical OR operator (||); -not means NOT.

    How the command reads: for each path, if it matches ./dir0* (the left side of -a is true), then -prune is evaluated; -prune returns true, so the whole AND expression (-path './dir0*' -a -prune) is true, and find skips that directory and continues searching elsewhere. For a path that does not match (the left side of -a is false), the AND expression is false, so the right side of -o is evaluated, and any matching .txt file is printed.

    5. Find .txt files everywhere except dir0, dir1, and their subdirectories

    find ./ \( -path './dir0*' -o -path './dir1*' \) -a -prune -o -name '*.txt' -print

    Note: the parentheses group the expression — they tell the shell not to give the enclosed characters any special interpretation and to leave them for the find command to interpret. Since parentheses are special to the shell, they must be escaped with a backslash (the '\' makes the command line accept the parentheses literally). Also note that '\(' and '\)' need a space on both sides.

    6. Find .txt files under dir0, dir1, and their subdirectories

    find ./ \( -path './dir0*' -o -path './dir1*' \) -a -name '*.txt' -print

    +1. Find .txt files under every directory named dir_general

    find ./ -path '*/dir_general/*' -name '*.txt' -print

    That is the usage of find's -path and -prune: combining them gives six search patterns that cover most targeted-search needs — give them a try.
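The examples above are easy to verify in a scratch tree (a sketch — the directory names follow the article, the temp directory is throwaway):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
mkdir -p dir0/dir00 dir1 dir2
touch a.txt dir0/b.txt dir0/dir00/c.txt dir1/d.txt dir2/e.txt
# example 4: every .txt except those under dir0
find . -path './dir0*' -a -prune -o -name '*.txt' -print
# expected: ./a.txt, ./dir1/d.txt, ./dir2/e.txt (order may vary)
```

Quoting the patterns ('*.txt', './dir0*') matters: unquoted, the shell would expand the globs before find ever sees them.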

  • prune_usernotes — a bot that prunes user notes
  • @slsplus/node-prune — a Node pruning tool that prunes unnecessary files (markdown, TypeScript source files, etc.) from ./node_modules. Inspired by the Go version. Installation: $ npm i @slsplus/node-prune -g. Usage: $ np -h Usage: np [options] Prune ...
  • find-prune

    2018-03-09 23:52:19
    gl@gl:~$ find ./ -name 'hello'
    ./info/hello
    ./hello
    ./temp/hello
    gl@gl:~$ find ./ -path './temp' -prune
    ./temp
    gl@gl:~$ find ./ -path './temp' -print
    ./tem...
  • I think prune should delete everything including workdir. Also, right now it's possible to run prune multiple times. On the second time it prints: Removing network srcd-...
  • Prune ttMove like any other

    2021-01-09 18:14:03
    Prune it like any other and use moveCount > 1 for LMR. STC with Hash=16: LLR: 2.95 (-2.94,2.94) [-3.00,1.00] Total: 49264 W: 10076 L: 10007 D: ...
  • git-prune — prunes ready and old branches on the remote repository (origin). npm install -g git-prune. Usage: git-prune. Any branch matching the ready/* pattern that is more than 1 day old is deleted from origin. Any other branch except master is deleted from origin if it is older than 30 days...
  • To prune, or not to prune: exploring the efficacy of pruning for model compression. 1 Introduction: given a bound on the model's memory footprint, how can we arrive at the most accurate model? The authors compare...
  • Delete old branches — this action deletes branches that have had no commits for the last X days and removes the oldest tags. Do not use v3, as it contains a bug. Requirements: the repository must be checked out, which can be done with the actions/checkout@v2 action from the marketplace... uses: digicert/prune_old_branches_action@v1
  • Prune-and-Search

    2010-07-24 08:22:13
    Prune-and-search — a discussion of the prune-and-search algorithm
  • Prune is a tool for automatically testing Play Framework performance. It automatically checks out different versions of Play, compiles applications against those versions, and then runs load tests. It saves all the results to files in a Git repository, and also pushes a summary of the results to a website. Prune...
  • This prune command misfired ("you need to enter a number between 1 and 99.") when I supplied the number 1. To reproduce: 1. Use the command '!prune...
  • In Windows you can search for files under certain paths, or exclude certain paths from a search. Below, the Linux find command together with its -path -prune options shows how to achieve the same thing on Linux. Suppose we are searching for files under the current directory, and the current directory contains many...
  • If you want to find certain files under a directory while avoiding one particular subdirectory, use find's -prune. Its usage is quite strict, though; many online articles describe it, but in my own experience some of them simply do not work. Directory exclusion, option -prune: find ./ -path '...
  • …t pretend I completely understand the problem, but the following discussion suggests using prune_bundler in the Puma config. https://github.com/puma/puma/issues/416 (This question comes from an open-source...
  • 0xd2cdf7 doris::ShardedLRUCache::prune() *** SIGSEGV () received by PID 38777 (TID 0x7f9cd32e4700) from PID 0; stack trace: *** @ 0x7f9cde7632f0 (unknown) @ 0xd2cdf7 doris::ShardedLRUCache:...
  • When remote branches have already been deleted on the remote but git branch -a locally still lists them and you want them gone — congratulations, either of these two commands will do it: git fetch origin --prune or git remote prune origin
