
sdxl structure analysis



text encoder
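SDXL conditions the UNet on two text encoders: OpenAI CLIP ViT-L/14 (per-token hidden size 768) and OpenCLIP ViT-bigG/14 (per-token hidden size 1280). Their per-token hidden states are concatenated to 768 + 1280 = 2048 dimensions, which is exactly the in_features=2048 of the to_k/to_v projections in every attn2 (cross-attention) layer in the unet dump below; the pooled 1280-dim output of the second encoder additionally feeds add_embedding. A minimal sketch for inspecting the two encoders, assuming the transformers package and the stabilityai/stable-diffusion-xl-base-1.0 checkpoint:

from transformers import CLIPTextModel, CLIPTextModelWithProjection

repo = "stabilityai/stable-diffusion-xl-base-1.0"
text_encoder = CLIPTextModel.from_pretrained(repo, subfolder="text_encoder")                    # CLIP ViT-L/14
text_encoder_2 = CLIPTextModelWithProjection.from_pretrained(repo, subfolder="text_encoder_2")  # OpenCLIP ViT-bigG/14
print(text_encoder.config.hidden_size)    # 768
print(text_encoder_2.config.hidden_size)  # 1280; 768 + 1280 = 2048, the cross-attention context width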

unet
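The dumps below read like print() output of the individual submodules of diffusers' UNet2DConditionModel (the LoRACompatibleLinear/LoRACompatibleConv wrappers in the reprs date the printout to roughly the diffusers 0.2x releases). A minimal sketch to reproduce them, assuming the same checkpoint as above:

from diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", subfolder="unet"
)
for name in ("conv_in", "time_proj", "time_embedding", "add_time_proj", "add_embedding",
             "down_blocks", "up_blocks", "mid_block", "conv_norm_out", "conv_act", "conv_out"):
    print(name)
    print(getattr(unet, name))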

conv_in
Conv2d(4, 320, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
time_proj
Timesteps()
time_embedding
TimestepEmbedding(
  (linear_1): LoRACompatibleLinear(in_features=320, out_features=1280, bias=True)
  (act): SiLU()
  (linear_2): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
)
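time_proj turns the scalar diffusion timestep into a 320-dim sinusoidal feature vector, and time_embedding is a two-layer MLP that lifts it to 1280 dims; every ResnetBlock2D below then projects this shared 1280-dim vector down to its own channel count through its time_emb_proj layer. A sketch of the shapes, assuming the unet loaded above:

import torch

t_emb = unet.time_proj(torch.tensor([999]))  # (1, 320) sinusoidal timestep features
emb = unet.time_embedding(t_emb)             # (1, 1280), consumed by every time_emb_proj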
add_time_proj
Timesteps()
add_embedding
TimestepEmbedding(
  (linear_1): LoRACompatibleLinear(in_features=2816, out_features=1280, bias=True)
  (act): SiLU()
  (linear_2): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
)
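The in_features=2816 of add_embedding decomposes as 1280 + 6 x 256: the 1280-dim pooled text embedding from the second text encoder, concatenated with six micro-conditioning values (original height/width, crop top/left, target height/width), each embedded to 256 sinusoidal dims by add_time_proj. A sketch of the arithmetic, assuming the unet loaded above:

import torch

text_embeds = torch.randn(1, 1280)                               # pooled output of text_encoder_2
time_ids = torch.tensor([[1024., 1024., 0., 0., 1024., 1024.]])  # (orig_h, orig_w, crop_top, crop_left, target_h, target_w)
ids_emb = unet.add_time_proj(time_ids.flatten()).reshape(1, -1)           # (1, 6 * 256) = (1, 1536)
aug_emb = unet.add_embedding(torch.cat([text_embeds, ids_emb], dim=-1))   # (1, 2816) -> (1, 1280)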
down_blocks
ModuleList(
  (0): DownBlock2D(
    (resnets): ModuleList(
      (0-1): 2 x ResnetBlock2D(
        (norm1): GroupNorm(32, 320, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(320, 320, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=320, bias=True)
        (norm2): GroupNorm(32, 320, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(320, 320, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
      )
    )
    (downsamplers): ModuleList(
      (0): Downsample2D(
        (conv): LoRACompatibleConv(320, 320, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      )
    )
  )
  (1): CrossAttnDownBlock2D(
    (attentions): ModuleList(
      (0-1): 2 x Transformer2DModel(
        (norm): GroupNorm(32, 640, eps=1e-06, affine=True)
        (proj_in): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
        (transformer_blocks): ModuleList(
          (0-1): 2 x BasicTransformerBlock(
            (norm1): LayerNorm((640,), eps=1e-05, elementwise_affine=True)
            (attn1): Attention(
              (to_q): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_k): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_v): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm2): LayerNorm((640,), eps=1e-05, elementwise_affine=True)
            (attn2): Attention(
              (to_q): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_k): LoRACompatibleLinear(in_features=2048, out_features=640, bias=False)
              (to_v): LoRACompatibleLinear(in_features=2048, out_features=640, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm3): LayerNorm((640,), eps=1e-05, elementwise_affine=True)
            (ff): FeedForward(
              (net): ModuleList(
                (0): GEGLU(
                  (proj): LoRACompatibleLinear(in_features=640, out_features=5120, bias=True)
                )
                (1): Dropout(p=0.0, inplace=False)
                (2): LoRACompatibleLinear(in_features=2560, out_features=640, bias=True)
              )
            )
          )
        )
        (proj_out): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
      )
    )
    (resnets): ModuleList(
      (0): ResnetBlock2D(
        (norm1): GroupNorm(32, 320, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(320, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=640, bias=True)
        (norm2): GroupNorm(32, 640, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(320, 640, kernel_size=(1, 1), stride=(1, 1))
      )
      (1): ResnetBlock2D(
        (norm1): GroupNorm(32, 640, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=640, bias=True)
        (norm2): GroupNorm(32, 640, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
      )
    )
    (downsamplers): ModuleList(
      (0): Downsample2D(
        (conv): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      )
    )
  )
  (2): CrossAttnDownBlock2D(
    (attentions): ModuleList(
      (0-1): 2 x Transformer2DModel(
        (norm): GroupNorm(32, 1280, eps=1e-06, affine=True)
        (proj_in): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
        (transformer_blocks): ModuleList(
          (0-9): 10 x BasicTransformerBlock(
            (norm1): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (attn1): Attention(
              (to_q): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_k): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_v): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm2): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (attn2): Attention(
              (to_q): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_k): LoRACompatibleLinear(in_features=2048, out_features=1280, bias=False)
              (to_v): LoRACompatibleLinear(in_features=2048, out_features=1280, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm3): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (ff): FeedForward(
              (net): ModuleList(
                (0): GEGLU(
                  (proj): LoRACompatibleLinear(in_features=1280, out_features=10240, bias=True)
                )
                (1): Dropout(p=0.0, inplace=False)
                (2): LoRACompatibleLinear(in_features=5120, out_features=1280, bias=True)
              )
            )
          )
        )
        (proj_out): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
      )
    )
    (resnets): ModuleList(
      (0): ResnetBlock2D(
        (norm1): GroupNorm(32, 640, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(640, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
        (norm2): GroupNorm(32, 1280, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(640, 1280, kernel_size=(1, 1), stride=(1, 1))
      )
      (1): ResnetBlock2D(
        (norm1): GroupNorm(32, 1280, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
        (norm2): GroupNorm(32, 1280, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
      )
    )
  )
)
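Reading the down path as a whole: DownBlock2D keeps the highest-resolution features attention-free at 320 channels, the two CrossAttnDownBlock2D stages widen to 640 and 1280 channels with 2 and 10 transformer blocks per attention layer respectively, and only the first two blocks end in a stride-2 Downsample2D, so the latent is reduced 4x in total. A shape trace for a 1024x1024 image (128x128 latent):

# conv_in:        (1,   4, 128, 128) -> (1,  320, 128, 128)
# down_blocks[0]: (1, 320, 128, 128) -> (1,  320,  64,  64)   DownBlock2D, no attention
# down_blocks[1]: (1, 320,  64,  64) -> (1,  640,  32,  32)   2 transformer blocks per attention
# down_blocks[2]: (1, 640,  32,  32) -> (1, 1280,  32,  32)   10 transformer blocks per attention, no downsampler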
up_blocks
ModuleList(
  (0): CrossAttnUpBlock2D(
    (attentions): ModuleList(
      (0-2): 3 x Transformer2DModel(
        (norm): GroupNorm(32, 1280, eps=1e-06, affine=True)
        (proj_in): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
        (transformer_blocks): ModuleList(
          (0-9): 10 x BasicTransformerBlock(
            (norm1): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (attn1): Attention(
              (to_q): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_k): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_v): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm2): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (attn2): Attention(
              (to_q): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
              (to_k): LoRACompatibleLinear(in_features=2048, out_features=1280, bias=False)
              (to_v): LoRACompatibleLinear(in_features=2048, out_features=1280, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm3): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (ff): FeedForward(
              (net): ModuleList(
                (0): GEGLU(
                  (proj): LoRACompatibleLinear(in_features=1280, out_features=10240, bias=True)
                )
                (1): Dropout(p=0.0, inplace=False)
                (2): LoRACompatibleLinear(in_features=5120, out_features=1280, bias=True)
              )
            )
          )
        )
        (proj_out): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
      )
    )
    (resnets): ModuleList(
      (0-1): 2 x ResnetBlock2D(
        (norm1): GroupNorm(32, 2560, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(2560, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
        (norm2): GroupNorm(32, 1280, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(2560, 1280, kernel_size=(1, 1), stride=(1, 1))
      )
      (2): ResnetBlock2D(
        (norm1): GroupNorm(32, 1920, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(1920, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
        (norm2): GroupNorm(32, 1280, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(1920, 1280, kernel_size=(1, 1), stride=(1, 1))
      )
    )
    (upsamplers): ModuleList(
      (0): Upsample2D(
        (conv): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      )
    )
  )
  (1): CrossAttnUpBlock2D(
    (attentions): ModuleList(
      (0-2): 3 x Transformer2DModel(
        (norm): GroupNorm(32, 640, eps=1e-06, affine=True)
        (proj_in): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
        (transformer_blocks): ModuleList(
          (0-1): 2 x BasicTransformerBlock(
            (norm1): LayerNorm((640,), eps=1e-05, elementwise_affine=True)
            (attn1): Attention(
              (to_q): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_k): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_v): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm2): LayerNorm((640,), eps=1e-05, elementwise_affine=True)
            (attn2): Attention(
              (to_q): LoRACompatibleLinear(in_features=640, out_features=640, bias=False)
              (to_k): LoRACompatibleLinear(in_features=2048, out_features=640, bias=False)
              (to_v): LoRACompatibleLinear(in_features=2048, out_features=640, bias=False)
              (to_out): ModuleList(
                (0): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
                (1): Dropout(p=0.0, inplace=False)
              )
            )
            (norm3): LayerNorm((640,), eps=1e-05, elementwise_affine=True)
            (ff): FeedForward(
              (net): ModuleList(
                (0): GEGLU(
                  (proj): LoRACompatibleLinear(in_features=640, out_features=5120, bias=True)
                )
                (1): Dropout(p=0.0, inplace=False)
                (2): LoRACompatibleLinear(in_features=2560, out_features=640, bias=True)
              )
            )
          )
        )
        (proj_out): LoRACompatibleLinear(in_features=640, out_features=640, bias=True)
      )
    )
    (resnets): ModuleList(
      (0): ResnetBlock2D(
        (norm1): GroupNorm(32, 1920, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(1920, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=640, bias=True)
        (norm2): GroupNorm(32, 640, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(1920, 640, kernel_size=(1, 1), stride=(1, 1))
      )
      (1): ResnetBlock2D(
        (norm1): GroupNorm(32, 1280, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(1280, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=640, bias=True)
        (norm2): GroupNorm(32, 640, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(1280, 640, kernel_size=(1, 1), stride=(1, 1))
      )
      (2): ResnetBlock2D(
        (norm1): GroupNorm(32, 960, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(960, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=640, bias=True)
        (norm2): GroupNorm(32, 640, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(960, 640, kernel_size=(1, 1), stride=(1, 1))
      )
    )
    (upsamplers): ModuleList(
      (0): Upsample2D(
        (conv): LoRACompatibleConv(640, 640, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      )
    )
  )
  (2): UpBlock2D(
    (resnets): ModuleList(
      (0): ResnetBlock2D(
        (norm1): GroupNorm(32, 960, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(960, 320, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=320, bias=True)
        (norm2): GroupNorm(32, 320, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(320, 320, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(960, 320, kernel_size=(1, 1), stride=(1, 1))
      )
      (1-2): 2 x ResnetBlock2D(
        (norm1): GroupNorm(32, 640, eps=1e-05, affine=True)
        (conv1): LoRACompatibleConv(640, 320, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=320, bias=True)
        (norm2): GroupNorm(32, 320, eps=1e-05, affine=True)
        (dropout): Dropout(p=0.0, inplace=False)
        (conv2): LoRACompatibleConv(320, 320, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
        (nonlinearity): SiLU()
        (conv_shortcut): LoRACompatibleConv(640, 320, kernel_size=(1, 1), stride=(1, 1))
      )
    )
  )
)
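The mixed input widths of the up-path resnets (2560, 1920, 1280, 960, 640) come from the U-Net skip connections: each resnet concatenates the previous feature map with one stored activation from the down path before conv1 (and the 1x1 conv_shortcut) reduces it back. The channel arithmetic, previous features + skip:

# up_blocks[0]: 1280 + 1280 = 2560 (x2),  1280 + 640 = 1920
# up_blocks[1]: 1280 +  640 = 1920,  640 + 640 = 1280,  640 + 320 = 960
# up_blocks[2]:  640 +  320 =  960,  320 + 320 =  640 (x2)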
mid_block
UNetMidBlock2DCrossAttn(
  (attentions): ModuleList(
    (0): Transformer2DModel(
      (norm): GroupNorm(32, 1280, eps=1e-06, affine=True)
      (proj_in): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
      (transformer_blocks): ModuleList(
        (0-9): 10 x BasicTransformerBlock(
          (norm1): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
          (attn1): Attention(
            (to_q): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
            (to_k): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
            (to_v): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
            (to_out): ModuleList(
              (0): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
              (1): Dropout(p=0.0, inplace=False)
            )
          )
          (norm2): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
          (attn2): Attention(
            (to_q): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=False)
            (to_k): LoRACompatibleLinear(in_features=2048, out_features=1280, bias=False)
            (to_v): LoRACompatibleLinear(in_features=2048, out_features=1280, bias=False)
            (to_out): ModuleList(
              (0): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
              (1): Dropout(p=0.0, inplace=False)
            )
          )
          (norm3): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
          (ff): FeedForward(
            (net): ModuleList(
              (0): GEGLU(
                (proj): LoRACompatibleLinear(in_features=1280, out_features=10240, bias=True)
              )
              (1): Dropout(p=0.0, inplace=False)
              (2): LoRACompatibleLinear(in_features=5120, out_features=1280, bias=True)
            )
          )
        )
      )
      (proj_out): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
    )
  )
  (resnets): ModuleList(
    (0-1): 2 x ResnetBlock2D(
      (norm1): GroupNorm(32, 1280, eps=1e-05, affine=True)
      (conv1): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (time_emb_proj): LoRACompatibleLinear(in_features=1280, out_features=1280, bias=True)
      (norm2): GroupNorm(32, 1280, eps=1e-05, affine=True)
      (dropout): Dropout(p=0.0, inplace=False)
      (conv2): LoRACompatibleConv(1280, 1280, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (nonlinearity): SiLU()
    )
  )
)
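The mid block runs at the lowest resolution and is simply resnets[0], the single 10-block Transformer2DModel, then resnets[1], all at 1280 channels:

# mid_block: (1, 1280, 32, 32) -> (1, 1280, 32, 32) for a 1024x1024 image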
conv_norm_out
GroupNorm(32, 320, eps=1e-05, affine=True)
conv_act
SiLU()
conv_out
Conv2d(320, 4, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
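Putting the pieces together, a minimal end-to-end sketch of one denoising step with dummy inputs, assuming the unet loaded above (the argument names are the standard diffusers forward arguments; shapes are for a 1024x1024 image):

import torch

sample = torch.randn(1, 4, 128, 128)              # VAE latent, matching conv_in's 4 input channels
encoder_hidden_states = torch.randn(1, 77, 2048)  # concatenated per-token text embeddings
added_cond_kwargs = {
    "text_embeds": torch.randn(1, 1280),          # pooled embedding from text_encoder_2
    "time_ids": torch.tensor([[1024., 1024., 0., 0., 1024., 1024.]]),
}
with torch.no_grad():
    noise_pred = unet(
        sample,
        timestep=999,
        encoder_hidden_states=encoder_hidden_states,
        added_cond_kwargs=added_cond_kwargs,
    ).sample
print(noise_pred.shape)  # torch.Size([1, 4, 128, 128]), matching conv_out's 4 output channels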

vae
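The SDXL VAE keeps the original Stable Diffusion autoencoder architecture (retrained with a larger batch size and EMA weights): it downsamples images 8x into 4 latent channels, which is what conv_in(4, 320) and conv_out(320, 4) above consume and produce. A minimal sketch for dumping its structure the same way, assuming the same checkpoint:

from diffusers import AutoencoderKL

vae = AutoencoderKL.from_pretrained("stabilityai/stable-diffusion-xl-base-1.0", subfolder="vae")
print(vae.encoder)
print(vae.decoder)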

From: https://www.cnblogs.com/FrostyForest/p/17860155.html
