  • Weighted image fusion based on MATLAB. II. Source code: clear g_R=0; g_G=0; g_B=0; h_R=0; h_G=0; h_B=0; fenzi_R=0; fenzi_G=0; fenzi_B=0; fenmu_up_R=0; fenmu_up_G=0; fenmu_up_B=0; fenmu_low_R=0; fenmu_low_G=0; fenmu_low_B=0...

    I. Introduction

    Weighted image fusion based on MATLAB.

    II. Source code

    clear
    g_R=0;
    g_G=0;
    g_B=0;
    h_R=0;
    h_G=0;
    h_B=0;
    fenzi_R=0;
    fenzi_G=0;
    fenzi_B=0;
    fenmu_up_R=0;
    fenmu_up_G=0;
    fenmu_up_B=0;
    fenmu_low_R=0;
    fenmu_low_G=0;
    fenmu_low_B=0;
    tableR=[];
    tableG=[];
    tableB=[];
    up=imread('high.jpg');         % read the high-resolution image
    low=imread('low.jpg');         % read the low-resolution image
    
    figure(1)
    imshow(up);                                     % display the RGB image
    
    [M,N,color]=size(up);
    
    title('Weighted fusion: high-resolution RGB image');
    
    figure(2)
    imshow(low); 
    title('Weighted fusion: low-resolution RGB image');
    r=double(up(:,:,1));
    g=double(up(:,:,2));
    b=double(up(:,:,3));
    r_low=double(low(:,:,1));
    g_low=double(low(:,:,2));
    b_low=double(low(:,:,3));
    RGB(:,:,1)=0.5*r+0.5*r_low;
    RGB(:,:,2)=0.5*g+0.5*g_low;
    RGB(:,:,3)=0.5*b+0.5*b_low;
    R=RGB(:,:,1);
    G=RGB(:,:,2);
    B=RGB(:,:,3);
    RGB=uint8(round(RGB));   
    figure(3)
    imshow(RGB)
    title('Weighted fusion: fused RGB image');
    
    
    
    
                  %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
                  %              Below: compute the average gradient G                 %
                  %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%     
    
    
    for ii=1:M-1
        for jj=1:N-1
            g_R=g_R+sqrt((((r(ii+1,jj)-r(ii,jj))^2+(r(ii,jj+1)-r(ii,jj))^2))/2);
            g_G=g_G+sqrt((((g(ii+1,jj)-g(ii,jj))^2+(g(ii,jj+1)-g(ii,jj))^2))/2);
            g_B=g_B+sqrt((((b(ii+1,jj)-b(ii,jj))^2+(b(ii,jj+1)-b(ii,jj))^2))/2);
    
        end
    end
    fprintf('\n\n   Sharpness of highR: %.4f\n   Sharpness of highG: %.4f\n   Sharpness of highB: %.4f',...
                g_R/(M-1)/(N-1),g_G/(M-1)/(N-1),g_B/(M-1)/(N-1));              
                  
    g_R=0;
    g_G=0;
    g_B=0;
                  
    
    for ii=1:M-1
        for jj=1:N-1
            g_R=g_R+sqrt((((r_low(ii+1,jj)-r_low(ii,jj))^2+(r_low(ii,jj+1)-r_low(ii,jj))^2))/2);
            g_G=g_G+sqrt((((g_low(ii+1,jj)-g_low(ii,jj))^2+(g_low(ii,jj+1)-g_low(ii,jj))^2))/2);
            g_B=g_B+sqrt((((b_low(ii+1,jj)-b_low(ii,jj))^2+(b_low(ii,jj+1)-b_low(ii,jj))^2))/2);
    
        end
    end
    fprintf('\n\n   Sharpness of lowR: %.4f\n   Sharpness of lowG: %.4f\n   Sharpness of lowB: %.4f',...
             g_R/(M-1)/(N-1),g_G/(M-1)/(N-1),g_B/(M-1)/(N-1));              
                  
    g_R=0;
    g_G=0;
    g_B=0;
          
    
    for ii=1:M-1
        for jj=1:N-1
            g_R=g_R+sqrt((((R(ii+1,jj)-R(ii,jj))^2+(R(ii,jj+1)-R(ii,jj))^2))/2);
            g_G=g_G+sqrt((((G(ii+1,jj)-G(ii,jj))^2+(G(ii,jj+1)-G(ii,jj))^2))/2);
            g_B=g_B+sqrt((((B(ii+1,jj)-B(ii,jj))^2+(B(ii,jj+1)-B(ii,jj))^2))/2);
        end
    end
    fprintf('\n\n   Sharpness of fusedR: %.4f\n   Sharpness of fusedG: %.4f\n   Sharpness of fusedB: %.4f\n',...
             g_R/(M-1)/(N-1),g_G/(M-1)/(N-1),g_B/(M-1)/(N-1));

    III. Results

    [Figures from the original post: the high-resolution input, the low-resolution input, and the fused image; not reproduced here.]

    IV. Remarks

    For the complete code, or to commission an implementation, contact QQ 1564658423.

  • %--------------------------------------------------------------------------% Econometrics Service Center, "Spatial Econometrics and MATLAB Applications" %--------------------------------------------------------------------------...

    %--------------------------------------------------------------------------
    % Econometrics Service Center, "Spatial Econometrics and MATLAB Applications"
    %--------------------------------------------------------------------------
    Vname = Variable
    Geometrically weighted regression estimates
    Dependent Variable = crime
    R-squared          = 0.9418
    Rbar-squared       = 0.9393
    Bandwidth          = 0.6518
    # iterations       = 17
    Decay type         = gaussian
    Nobs, Nvars        = 49, 3
    ***************************************
    Obs = 1, x-coordinate = 42.3800, y-coordinate = 35.6200, sige = 3.4125
    Variable    Coefficient    t-statistic    t-probability
    constant      51.197363       9.212794        0.000000
    income        -0.461038      -1.678857        0.099547
    hvalue        -0.434237      -3.693955        0.000556
    Obs = 2, x-coordinate = 40.5200, y-coordinate = 36.5000, sige = 6.7847
    Variable    Coefficient    t-statistic    t-probability
    constant      63.564308       9.955778        0.000000
    income        -0.369902      -0.991321        0.326399
    hvalue        -0.683553      -4.656428        0.000025
    Obs = 3, x-coordinate = 38.7100, y-coordinate = 36.7100, sige = 8.6457
    Variable    Coefficient    t-statistic    t-probability
    constant      72.673672       9.395151        0.000000
    income        -0.161106      -0.269269        0.788853
    hvalue        -0.826921      -5.367996        0.000002
    (the per-observation listing continues in the same format through Obs = 49)
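
    Output in this form is typically produced by the toolbox's printing routine after a GWR fit. A minimal sketch of reproducing it (y, x, east, north are placeholders for the book's dataset, and prt_gwr is the printing helper listed in the gwr.m documentation further below):

    info.dtype = 'gaussian';               % Gaussian kernel, bandwidth chosen by cross-validation
    result = gwr(y, x, east, north, info); % estimate the local coefficients
    prt_gwr(result);                       % print the summary block and per-observation estimates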

  • [Example overview] Geographically weighted regression (GWR) MATLAB code, personally tested and working; it implements GWR in MATLAB and includes a worked example. [Example screenshot] [Core code] function result = gwr(y,x,east,north,info);% PURPOSE: compute ...

    [Example overview] Geographically weighted regression (GWR) MATLAB code, personally tested and working. It implements geographically weighted regression in MATLAB and includes a worked example.

    [Example screenshot]

    [Core code]

    function result = gwr(y,x,east,north,info);
    % PURPOSE: compute geographically weighted regression
    %----------------------------------------------------
    % USAGE: results = gwr(y,x,east,north,info)
    % where:   y = dependent variable vector
    %          x = explanatory variable matrix
    %       east = x-coordinates in space
    %      north = y-coordinates in space
    %       info = a structure variable with fields:
    %              info.bwidth = scalar bandwidth to use or zero
    %                            for cross-validation estimation (default)
    %              info.bmin   = minimum bandwidth to use in CV search
    %              info.bmax   = maximum bandwidth to use in CV search
    %                            defaults: bmin = 0.1, bmax = 20
    %              info.dtype  = 'gaussian'    for Gaussian weighting (default)
    %                          = 'exponential' for exponential weighting
    %                          = 'tricube'     for tri-cube weighting
    %              info.q      = q-nearest neighbors to use for tri-cube weights
    %                            (default: CV estimated)
    %              info.qmin   = minimum # of neighbors to use in CV search
    %              info.qmax   = maximum # of neighbors to use in CV search
    %                            defaults: qmin = nvar+2, qmax = 4*nvar
    % ---------------------------------------------------
    % NOTE: res = gwr(y,x,east,north) does CV estimation of bandwidth
    % ---------------------------------------------------
    % RETURNS: a results structure
    %        results.meth   = 'gwr'
    %        results.beta   = bhat matrix    (nobs x nvar)
    %        results.tstat  = t-stats matrix (nobs x nvar)
    %        results.yhat   = yhat
    %        results.resid  = residuals
    %        results.sige   = e'e/(n-dof) (nobs x 1)
    %        results.nobs   = nobs
    %        results.nvar   = nvars
    %        results.bwidth = bandwidth if gaussian or exponential
    %        results.q      = q nearest neighbors if tri-cube
    %        results.dtype  = input string for Gaussian, exponential weights
    %        results.iter   = # of simplex iterations for cv
    %        results.north  = north (y-coordinates)
    %        results.east   = east (x-coordinates)
    %        results.y      = y data vector
    %---------------------------------------------------
    % See also: prt, plt, prt_gwr, plt_gwr to print and plot results
    %---------------------------------------------------
    % References: Brunsdon, Fotheringham, Charlton (1996)
    % Geographical Analysis, pp. 281-298
    %---------------------------------------------------
    % NOTES: uses auxiliary function scoref for cross-validation
    %---------------------------------------------------
    % written by: James P. LeSage 2/98
    % University of Toledo
    % Department of Economics
    % Toledo, OH 43606
    % jpl@jpl.econ.utoledo.edu
    
    if nargin == 5 % user options
        if ~isstruct(info)
            error('gwr: must supply the option argument as a structure variable');
        else
            fields = fieldnames(info);
            nf = length(fields);
            % set defaults
            [n k] = size(x);
            bwidth = 0; dtype = 0; q = 0; qmin = k+2; qmax = 5*k;
            bmin = 0.1; bmax = 20.0;
            for i=1:nf
                if strcmp(fields{i},'bwidth')
                    bwidth = info.bwidth;
                elseif strcmp(fields{i},'dtype')
                    dstring = info.dtype;
                    if strcmp(dstring,'gaussian')
                        dtype = 0;
                    elseif strcmp(dstring,'exponential')
                        dtype = 1;
                    elseif strcmp(dstring,'tricube')
                        dtype = 2;
                    end;
                elseif strcmp(fields{i},'q')
                    q = info.q;
                elseif strcmp(fields{i},'qmax');
                    qmax = info.qmax;
                elseif strcmp(fields{i},'qmin');
                    qmin = info.qmin;
                elseif strcmp(fields{i},'bmin');
                    bmin = info.bmin;
                elseif strcmp(fields{i},'bmax');
                    bmax = info.bmax;
                end;
            end; % end of for i
        end; % end of if else
    elseif nargin == 4
        bwidth = 0; dtype = 0; dstring = 'gaussian';
        bmin = 0.1; bmax = 20.0;
    else
        error('Wrong # of arguments to gwr');
    end;
    
    % error checking on inputs
    [nobs nvar]  = size(x);
    [nobs2 junk] = size(y);
    [nobs3 junk] = size(north);
    [nobs4 junk] = size(east);
    
    result.north = north;
    result.east  = east;
    
    if nobs ~= nobs2
        error('gwr: y and x must contain same # obs');
    elseif nobs3 ~= nobs
        error('gwr: north coordinates must equal # obs');
    elseif nobs3 ~= nobs4
        error('gwr: east coordinates must equal # in north');
    end;
    
    switch dtype
    case{0,1} % bandwidth cross-validation
        if bwidth == 0 % cross-validation
            options = optimset('fminbnd');
            optimset('MaxIter',500);
            if dtype == 0     % Gaussian weights
                [bdwt,junk,exitflag,output] = fminbnd('scoref',bmin,bmax,options,y,x,east,north,dtype);
            elseif dtype == 1 % exponential weights
                [bdwt,junk,exitflag,output] = fminbnd('scoref',bmin,bmax,options,y,x,east,north,dtype);
            end;
            if output.iterations == 500,
                fprintf(1,'gwr: cv convergence not obtained in %4d iterations',output.iterations);
            else
                result.iter = output.iterations;
            end;
        else
            bdwt = bwidth*bwidth; % user supplied bandwidth
        end;
    case{2} % q-nearest neighbor cross-validation
        if q == 0 % cross-validation
            q = scoreq(qmin,qmax,y,x,east,north);
        else
            % use user-supplied q-value
        end;
    otherwise
    end;
    
    % do GWR using bdwt as bandwidth
    [n k] = size(x);
    bsave = zeros(n,k);
    ssave = zeros(n,k);
    sigv  = zeros(n,1);
    yhat  = zeros(n,1);
    resid = zeros(n,1);
    wt    = zeros(n,1);
    d     = zeros(n,1);
    
    for iter=1:n;
        dx = east - east(iter,1);
        dy = north - north(iter,1);
        d  = (dx.*dx + dy.*dy);
        sd = std(sqrt(d));
        % sort distance to find q nearest neighbors
        ds = sort(d);
        if dtype == 2, dmax = ds(q,1); end;
        if dtype == 0,     % Gaussian weights
            wt = stdn_pdf(sqrt(d)/(sd*bdwt));
        elseif dtype == 1, % exponential weights
            wt = exp(-d/bdwt);
        elseif dtype == 2, % tricube weights
            wt = zeros(n,1);
            nzip = find(d <= dmax);
            wt(nzip,1) = (1-(d(nzip,1)/dmax).^3).^3;
        end; % end of if,else
        wt = sqrt(wt);
        % computational trick to speed things up
        % use non-zero wt to pull out y,x observations
        nzip = find(wt >= 0.01);
        ys = y(nzip,1).*wt(nzip,1);
        xs = matmul(x(nzip,:),wt(nzip,1));
        xpxi = invpd(xs'*xs);
        b = xpxi*xs'*ys;
        % compute predicted values
        yhatv = xs*b;
        yhat(iter,1) = x(iter,:)*b;
        resid(iter,1) = y(iter,1) - yhat(iter,1);
        % compute residuals
        e = ys - yhatv;
        % find # of non-zero observations
        nadj = length(nzip);
        sige = (e'*e)/nadj;
        % compute t-statistics
        sdb = sqrt(sige*diag(xpxi));
        % store coefficient estimates and std errors in matrices
        % one set of beta,std for each observation
        bsave(iter,:) = b';
        ssave(iter,:) = sdb';
        sigv(iter,1)  = sige;
    end;
    
    % fill-in results structure
    result.meth = 'gwr';
    result.nobs = nobs;
    result.nvar = nvar;
    if (dtype == 0 | dtype == 1)
        result.bwidth = sqrt(bdwt);
    else
        result.q = q;
    end;
    result.beta  = bsave;
    result.tstat = bsave./ssave;
    result.sige  = sigv;
    result.dtype = dstring;
    result.y     = y;
    result.yhat  = yhat;
    % compute residuals and conventional r-squared
    result.resid = resid;
    sigu = result.resid'*result.resid;
    ym = y - mean(y);
    rsqr1 = sigu;
    rsqr2 = ym'*ym;
    result.rsqr = 1.0 - rsqr1/rsqr2; % r-squared
    rsqr1 = rsqr1/(nobs-nvar);
    rsqr2 = rsqr2/(nobs-1.0);
    result.rbar = 1 - (rsqr1/rsqr2); % rbar-squared
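
    A minimal usage sketch, assuming gwr.m and its toolbox helpers (scoref, scoreq, invpd, matmul, stdn_pdf) are on the MATLAB path; the synthetic data and coefficient values below are illustrative only, not the bundled example:

    n = 49;                                  % number of spatial observations
    east  = 25 + 20*rand(n,1);               % x-coordinates
    north = 25 + 20*rand(n,1);               % y-coordinates
    x = [ones(n,1) randn(n,1) randn(n,1)];   % constant term plus two explanatory variables
    y = x*[50; -1; -0.5] + randn(n,1);       % synthetic dependent variable
    info.dtype = 'gaussian';                 % Gaussian weights; bandwidth found by cross-validation
    result = gwr(y, x, east, north, info);
    % result.beta is nobs x nvar: one local coefficient vector per observation
    plot(east, result.beta(:,2), '.');       % how the second coefficient varies from west to east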

  • Locally weighted linear least squares does not require knowing the model in advance, because the method is built from a superposition of many linear fits and ultimately uses only a linear model. When computing each linear model, a weighting function is introduced to assign weights to the data around the current prediction point: nearby points receive higher weights and points farther away receive ...

    Ordinary least squares, as we usually apply it, requires specifying a model in advance and then solving for that model's coefficients.

    In most cases, however, we do not know the model, for example a surface such as z = ax^2 + by^2 + cxy + dx + ey + f in an earlier blog post.

    Locally weighted linear least squares does not require knowing the model to be solved in advance, because the method is built from a superposition of many linear fits and ultimately uses only a linear model.

    When computing each local linear model, a weighting function

        w(i,j) = exp( -(x_i - x_j)^2 / (2*k^2) )

    is introduced to assign a weight to each data point relative to the current prediction point: points that are close receive higher weights, and points that are far away receive lower weights.

    The k in this formula plays the same role as the sigma of a Gaussian.

    When sigma is larger, the weighting function becomes shorter and wider, so each local linear fit draws on more of the global data;

    when sigma is smaller, the function becomes taller and narrower, so each local fit relies more on nearby (local) data.

    The code is as follows:

    clear all;
    close all;
    clc;
    
    x=(1:0.1:10)';
    y=x.^2+x+3+rand(length(x),1)*6;          % noisy quadratic test data
    plot(x,y,'.')
    
    sigma=0.1;    % local window width: larger -> uses more global data, smaller -> uses more local data
    W=zeros(length(x));
    C=[];
    for i=1:length(x)
        for j=1:length(x)
            W(j,j)=exp(-((x(i)-x(j))^2)/(2*sigma^2));   % weight matrix for the i-th query point
        end
        XX=[x ones(length(x),1)];
        YY=y;
        C=[C inv(XX'*W*XX)*XX'*W*YY];        % weighted least squares: coefficients of the local linear model
    end
    re=diag(XX*C);                           % evaluate each local model at its own query point
    hold on;
    plot(x,re);

    The result is as follows:

    [Figure: the locally weighted fit (red curve) overlaid on the scattered data points]

    As the figure shows, the red locally fitted linear pieces together reproduce the global shape of the data.

    But since this method never requires an explicit model, how do we predict results for future (new) data?
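
    One common answer, sketched below under the same weighting scheme (the query point x0 and the helper name lwlr_predict are illustrative, not from the original post): fit a fresh weighted linear model around each new query point and evaluate it there.

    function y0 = lwlr_predict(x, y, x0, sigma)
    % Locally weighted linear prediction at a new query point x0.
    w  = exp(-((x - x0).^2) / (2*sigma^2));   % weights of the training points relative to x0
    W  = diag(w);
    XX = [x ones(length(x),1)];
    c  = (XX'*W*XX) \ (XX'*W*y);              % weighted least-squares coefficients of the local line
    y0 = [x0 1]*c;                            % evaluate the local line at x0
    end

    For example, y0 = lwlr_predict(x, y, 10.5, 0.1) extrapolates just beyond the training range; the further x0 lies from the observed data, the less reliable such a local fit becomes.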

  • Another common reason to weight data is that each recorded observation is actually the mean of several measurements taken at the same x value. In the data here, suppose each of the first two values represents a single raw measurement, while each of the remaining four values is the mean computed from 5 raw measurements...
  • Weighted average and relative error algorithms in MATLAB

    1,000+ reads, 2018-12-13 12:42:06
    y: the data; w: w1,w2,w3,w4 = 1,2,3,4, and 1+2+3+4=10.  y=[676 825 774 716 940 1159 1384 1524 1668 1688 1958 2031 2234 2566 2820 3006 3093 3277 3514 3770 4107]; w=[1/10;2/10;3/10;...for i=1:...
  • MATLAB development: weighted linear fitting. Weighted fitting of data points with different standard deviations.
  • Bayliss weighting MATLAB program

    2020-11-09 21:47:13
    A MATLAB program for Bayliss weighting.
  • MATLAB development: weighted data bins

    2019-08-26 09:12:35
    MATLAB development: weighted data bins. Weighted binning of data.
  • Weighted K-nearest neighbors in MATLAB

    1,000+ reads, liked by many, 2019-05-21 15:02:57
    Weighted KNN is a refinement of KNN; once you understand KNN, weighted KNN is easy to follow. Without further ado, here is the code: function label1=WKNN(training,testing,k) [row, column]=size(training); [row1, column1]=size(testing); % compute the test set...
  • Locally weighted linear fitting in MATLAB

    1,000+ reads, 2019-11-10 13:20:21
    Compared with polynomial fitting, locally weighted linear fitting introduces a weighting function when computing the linear model: it gives nearby points higher weights and distant points lower weights, which usually yields better accuracy. x=(1:0.1:10)'; y=x.^2+3.*x+...
  • 1. MATLAB: introduction to the geographically weighted regression commands. In MATLAB, gwr.m can be called to estimate the parameters of a geographically weighted regression model. The following introduces how GWR is implemented in MATLAB; the gwr.m function is called as shown below: #%% geographically weighted regression MATLAB program...
  • Locally weighted regression in MATLAB

    1,000+ reads, 2013-03-24 23:30:45
    Likelihood function: each term of the likelihood carries a weight, and a regularization parameter is added. Let lamda=0.0001; the value of tau should be tuned in practice. Newton's method is used to approach the maximum likelihood: ...
  • MATLAB development: weighted support vector machine. Imbalanced classification of abnormal control-chart trend patterns based on support vector machines.
  • Dynamic weighted comprehensive evaluation methods: (3) the ideal-point (TOPSIS) method; 4. how to build mathematical models for comprehensive evaluation; II. general approaches to comprehensive evaluation ...
  • MATLAB development: weighted normalized cross-correlation. Performs template matching in an image via normalized cross-correlation, but with a weighted template.
  • Density-weighting MATLAB code for a circular phased-array antenna. Density weighting implemented in MATLAB.
  • MATLAB development: location and standard deviation of weighted data. Computes the interpolated weighted median together with its standard deviation and distribution.
  • Gradient-inverse weighted filtering in MATLAB

    Popular discussion, 2011-12-02 15:47:59
    MATLAB code implementing the gradient-inverse weighted smoothing filter.
  • A new evolutionary search algorithm, the weighted differential evolution algorithm (WDE), is proposed. In this paper, WDE is put forward to solve real-valued optimization problems. In fact, when all of WDE's parameters are determined randomly, WDE has no control parameters other than the pattern size. WDE can...
  • Weighted Voronoi algorithm, MATLAB

    2017-05-08 16:27:23
    An algorithm for generating weighted Voronoi diagrams.
  • Language: MATLAB. Input: the matrix A of a directed, weighted graph. Output: an undirected, unweighted graph B. Code: A=[1,0,1,1;0,1,0,0;0,1,1,0;0,1,1,1]; (tril(A,-1)+triu(A',0))|(tril(A,-1)+triu(A',0)) Input: 1 0 1 1 0 1 0 0 0 1 1 0 0 1 1 1 ...
  • This code is an inverse-distance-weighting function (global variant) written for MATLAB 2019b. The power exponent set in the code is -1, unlike ArcGIS, which uses an exponent of -2 together with the 12 surrounding points for inverse distance weighting; a sketch of the basic idea appears after this list.
  • MATLAB development: weighted color and texture sample selection for image matting. E. Shahrian, D. Rajan, "Weighted Color and Texture Sample Selection for Image Matting", CVPR 2012.
  • MATLAB development: weighted 2-D polynomial fitting and evaluation. Two scripts: polyFitWeighted2 fits 2-D data with weights, and polyVal2 evaluates the 2-D polynomial.
  • Computation of the weighted grey relational degree, MATLAB program.
  • This code implements geographically weighted regression in MATLAB and includes a worked example.
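
    For the inverse-distance-weighting entry above, a minimal sketch of a basic global IDW interpolator (the function name idw_interp, the power p, and all variable names are illustrative; this is not the downloadable code itself):

    function zq = idw_interp(x, y, z, xq, yq, p)
    % Global inverse distance weighting: estimate zq at (xq,yq) from samples (x,y,z).
    d = sqrt((x - xq).^2 + (y - yq).^2);   % distances from all samples to the query point
    if any(d == 0)
        zq = mean(z(d == 0));              % query coincides with a sample point
        return
    end
    w  = d.^(-p);                          % p = 1 matches the post's exponent of -1; ArcGIS defaults to 2
    zq = sum(w.*z) / sum(w);               % weighted average over all samples (global variant)
    end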
