Commit Graph

455 Commits (a55aac79de0ea6fc52d35f535867b6573a5ff0f8)

Author SHA1 Message Date
Heiko Carstens 2565409fc0 mm,x86,um: move CMPXCHG_DOUBLE config option 13 years ago
Heiko Carstens 43570fd2f4 mm,slub,x86: decouple size of struct page from CONFIG_CMPXCHG_LOCAL 13 years ago
Stanislaw Gruszka fc8d8620d3 slub: min order when debug_guardpage_minorder > 0 13 years ago
David Rientjes 74ee4ef1f9 slub: disallow changing cpu_partial from userspace for debug caches 13 years ago
Jan Beulich cdcd629869 x86: Fix and improve cmpxchg_double{,_local}() 13 years ago
Christoph Lameter 933393f58f percpu: Remove irqsafe_cpu_xxx variants 13 years ago
Shaohua Li b13683d1cc slub: add missed accounting 13 years ago
Christoph Lameter 213eeb9fd9 slub: Extract get_freelist from __slab_alloc 13 years ago
Christoph Lameter 8f1e33daed slub: Switch per cpu partial page support off for debugging 13 years ago
Eric Dumazet 73736e0387 slub: fix a possible memleak in __slab_alloc() 13 years ago
Eric Dumazet bc6697d8a5 slub: avoid potential NULL dereference or corruption 13 years ago
Christoph Lameter 42d623a8cd slub: use irqsafe_cpu_cmpxchg for put_cpu_partial 13 years ago
Dave Jones 265d47e711 slub: add taint flag outputting to debug paths 13 years ago
Shaohua Li 9ada19342b slub: move discard_slab out of node lock 13 years ago
Shaohua Li f64ae042d9 slub: use correct parameter to add a page to partial list tail 13 years ago
Akinobu Mita 798248206b lib/string.c: introduce memchr_inv() 13 years ago
Alex Shi dcc3be6a54 slub: Discard slab page when node partial > minimum partial number 14 years ago
Alex Shi 9f26490412 slub: correct comments error for per cpu partial 14 years ago
Vasiliy Kulikov ab067e99d2 mm: restrict access to slab files under procfs and sysfs 14 years ago
Alex,Shi 12d79634f8 slub: Code optimization in get_partial_node() 14 years ago
Shaohua Li 136333d104 slub: explicitly document position of inserting slab to partial list 14 years ago
Shaohua Li 130655ef09 slub: add slab with one free object to partial list tail 14 years ago
Christoph Lameter 49e2258586 slub: per cpu cache for partial pages 14 years ago
Christoph Lameter 497b66f2ec slub: return object pointer from get_partial() / new_slab(). 14 years ago
Christoph Lameter acd19fd1a7 slub: pass kmem_cache_cpu pointer to get_partial() 14 years ago
Christoph Lameter e6e82ea112 slub: Prepare inuse field in new_slab() 14 years ago
Christoph Lameter 7db0d70540 slub: Remove useless statements in __slab_alloc 14 years ago
Christoph Lameter 69cb8e6b7c slub: free slabs without holding locks 14 years ago
Christoph Lameter 81107188f1 slub: Fix partial count comparison confusion 14 years ago
Akinobu Mita ef62fb32b7 slub: fix check_bytes() for slub debugging 14 years ago
Christoph Lameter 6fbabb20fa slub: Fix full list corruption if debugging is on 14 years ago
Sebastian Andrzej Siewior ffc79d2880 slub: use print_hex_dump 14 years ago
Christoph Lameter 9e577e8b46 slub: When allocating a new slab also prep the first object 14 years ago
Phil Carmody 497888cf69 treewide: fix potentially dangerous trailing ';' in #defined values/expressions 14 years ago
Christoph Lameter 1d07171c5e slub: disable interrupts in cmpxchg_double_slab when falling back to pagelock 14 years ago
Pekka Enberg bfa71457a0 SLUB: Fix missing <linux/stacktrace.h> include 14 years ago
Marcin Slusarz c4089f98e9 slub: reduce overhead of slub_debug 14 years ago
Ben Greear d18a90dd85 slub: Add method to verify memory is not freed 14 years ago
Ben Greear d6543e3935 slub: Enable backtrace for create/delete points 14 years ago
Christoph Lameter 4eade540fc slub: Not necessary to check for empty slab on load_freelist 14 years ago
Christoph Lameter 03e404af26 slub: fast release on full slab 14 years ago
Christoph Lameter e36a2652d7 slub: Add statistics for the case that the current slab does not match the node 14 years ago
Christoph Lameter fc59c05306 slub: Get rid of the another_slab label 14 years ago
Christoph Lameter 80f08c191f slub: Avoid disabling interrupts in free slowpath 14 years ago
Christoph Lameter 5c2e4bbbd6 slub: Disable interrupts in free_debug processing 14 years ago
Christoph Lameter 881db7fb03 slub: Invert locking and avoid slab lock 14 years ago
Christoph Lameter 2cfb7455d2 slub: Rework allocator fastpaths 14 years ago
Christoph Lameter 61728d1efc slub: Pass kmem_cache struct to lock and freeze slab 14 years ago
Christoph Lameter 5cc6eee8a8 slub: explicit list_lock taking 14 years ago