Commit Graph

248 Commits (fc240e3fc5791c572402b0857948da7b1e68d77f)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Pekka Enberg | 2121db74ba | kmemtrace: trace kfree() calls with NULL or zero-length objects | 16 years ago |
| Eduard - Gabriel Munteanu | ca2b84cb3c | kmemtrace: use tracepoints | 16 years ago |
| Akinobu Mita | 1a00df4a2c | slub: use get_track() | 16 years ago |
| David Rientjes | c0bdb232b2 | slub: rename calculate_min_partial() to set_min_partial() | 16 years ago |
| David Rientjes | 73d342b169 | slub: add min_partial sysfs tunable | 16 years ago |
| David Rientjes | 3b89d7d881 | slub: move min_partial to struct kmem_cache | 16 years ago |
| Christoph Lameter | fe1200b63d | SLUB: Introduce and use SLUB_MAX_SIZE and SLUB_PAGE_SHIFT constants | 16 years ago |
| Zhang Yanmin | e8120ff1ff | SLUB: Fix default slab order for big object sizes | 16 years ago |
| Christoph Lameter | ffadd4d0fe | SLUB: Introduce and use SLUB_MAX_SIZE and SLUB_PAGE_SHIFT constants | 16 years ago |
| Nick Piggin | cf40bd16fd | lockdep: annotate reclaim context (__GFP_NOFS) | 16 years ago |
| Kirill A. Shutemov | b1aabecd55 | mm: Export symbol ksize() | 16 years ago |
| David Rientjes | 3718909448 | slub: fix per cpu kmem_cache_cpu array memory leak | 16 years ago |
| Pekka Enberg | 6047a007d0 | SLUB: Use ->objsize from struct kmem_cache_cpu in slab_free() | 16 years ago |
| Frederik Schwarzer | 0211a9c850 | trivial: fix an -> a typos in documentation and comments | 16 years ago |
| Rusty Russell | 174596a0b9 | cpumask: convert mm/ | 16 years ago |
| Frederic Weisbecker | 36994e58a4 | tracing/kmemtrace: normalize the raw tracer event to the unified tracing API | 16 years ago |
| Ingo Molnar | 2a38b1c4f1 | kmemtrace: move #include lines | 16 years ago |
| Pekka Enberg | 2e67624c22 | kmemtrace: remove unnecessary casts | 16 years ago |
| Eduard - Gabriel Munteanu | 94b528d056 | kmemtrace: SLUB hooks for caller-tracking functions. | 16 years ago |
| Eduard - Gabriel Munteanu | 5b882be4e0 | kmemtrace: SLUB hooks. | 16 years ago |
| Eduard - Gabriel Munteanu | 35995a4d81 | SLUB: Replace __builtin_return_address(0) with _RET_IP_. | 16 years ago |
| David Rientjes | 7b8f3b66d9 | slub: avoid leaking caches or refcounts on sysfs error | 16 years ago |
| OGAWA Hirofumi | 89124d706d | slub: Add might_sleep_if() to slab_alloc() | 16 years ago |
| Akinobu Mita | 773ff60e84 | SLUB: failslab support | 16 years ago |
| Rusty Russell | 29c0177e6a | cpumask: change cpumask_scnprintf, cpumask_parse_user, cpulist_parse, and cpulist_scnprintf to take pointers. | 16 years ago |
| Hugh Dickins | 9c24624727 | KSYM_SYMBOL_LEN fixes | 16 years ago |
| Nick Andrew | 9f6c708e5c | slub: Fix incorrect use of loose | 16 years ago |
| KAMEZAWA Hiroyuki | dc19f9db38 | memcg: memory hotplug fix for notifier callback | 16 years ago |
| David Rientjes | 0094de92a4 | slub: make early_kmem_cache_node_alloc void | 16 years ago |
| Cyrill Gorcunov | e9beef1815 | slub - fix get_object_page comment | 16 years ago |
| Eduard - Gabriel Munteanu | ce71e27c6f | SLUB: Replace __builtin_return_address(0) with _RET_IP_. | 16 years ago |
| Cyrill Gorcunov | 210b5c0613 | SLUB: cleanup - define macros instead of hardcoded numbers | 16 years ago |
| Alexey Dobriyan | 7b3c3a50a3 | proc: move /proc/slabinfo boilerplate to mm/slub.c, mm/slab.c | 17 years ago |
| Salman Qazi | 02b71b7012 | slub: fixed uninitialized counter in struct kmem_cache_node | 17 years ago |
| Christoph Lameter | e2cb96b7ec | slub: Disable NUMA remote node defragmentation by default | 17 years ago |
| Pekka Enberg | 5595cffc82 | SLUB: dynamic per-cache MIN_PARTIAL | 17 years ago |
| Adrian Bunk | 231367fd9b | mm: unexport ksize | 17 years ago |
| Alexey Dobriyan | 51cc50685a | SL*B: drop kmem cache argument from constructor | 17 years ago |
| Andy Whitcroft | 8a38082d21 | slub: record page flag overlays explicitly | 17 years ago |
| Pekka Enberg | 0ebd652b35 | slub: dump more data on slab corruption | 17 years ago |
| Alexey Dobriyan | 41ab8592ca | SLUB: simplify re on_each_cpu() | 17 years ago |
| Alexey Dobriyan | 88e4ccf294 | slub: current is always valid | 17 years ago |
| Christoph Lameter | 0937502af7 | slub: Add check for kfree() of non slab objects. | 17 years ago |
| Linus Torvalds | 7daf705f36 | Start using the new '%pS' infrastructure to print symbols | 17 years ago |
| Dmitry Adamushko | bdb2192851 | slub: Fix use-after-preempt of per-CPU data structure | 17 years ago |
| Christoph Lameter | cde5353599 | Christoph has moved | 17 years ago |
| Christoph Lameter | 41d54d3bf8 | slub: Do not use 192 byte sized cache if minimum alignment is 128 byte | 17 years ago |
| Jens Axboe | 15c8b6c1aa | on_each_cpu(): kill unused 'retry' parameter | 17 years ago |
| Pekka Enberg | 76994412f8 | slub: ksize() abuse checks | 17 years ago |
| Benjamin Herrenschmidt | 4ea33e2dc2 | slub: fix atomic usage in any_slab_objects() | 17 years ago |