bionic/libc/arch-arm
Christopher Ferris fbefb252b0 Modify prefetch for krait memcpy.
I originally modified the krait main-loop prefetch from cacheline * 8 to * 2.
This causes a perf degradation for copies larger than will fit in the cache.
Fix this back to the original * 8. I tried other multiples, but * 8 is the
sweet spot on krait.

Bug: 11221806

(cherry picked from commit c3c58fb560)

Change-Id: I369f81d91ba97a3fcecac84ac57dec921b4758c8
2013-10-15 15:44:00 -07:00
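
The commit above is about prefetch distance in the main copy loop: fetching the source 8 cache lines ahead instead of 2 so the loop stays fed on copies larger than the cache. The real krait memcpy is hand-written ARM assembly in this directory; the C sketch below is only an illustration of that idea, not the bionic code. The 64-byte CACHE_LINE value and the function/macro names are assumptions for the example.

/*
 * Illustrative sketch only -- not the bionic krait memcpy.S.
 * Shows a copy loop that prefetches the source CACHE_LINE * 8 bytes
 * ahead of the current position, the "sweet spot" the commit refers to.
 */
#include <stddef.h>
#include <string.h>

#define CACHE_LINE      64                 /* assumed line size for the example */
#define PREFETCH_AHEAD  (CACHE_LINE * 8)   /* distance restored by the commit */

void *copy_with_prefetch(void *dst, const void *src, size_t n) {
    char *d = dst;
    const char *s = src;

    /* Main loop: copy one cache line per iteration, prefetching well ahead
     * so the data is already in flight when the loop reaches it. */
    while (n >= CACHE_LINE) {
        __builtin_prefetch(s + PREFETCH_AHEAD, /* rw = read */ 0, /* locality */ 0);
        memcpy(d, s, CACHE_LINE);
        d += CACHE_LINE;
        s += CACHE_LINE;
        n -= CACHE_LINE;
    }

    /* Copy any remaining tail bytes. */
    memcpy(d, s, n);
    return dst;
}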
bionic Merge "Optimize strcat/strcpy, small tweaks to strlen." 2013-08-05 23:32:06 +00:00
cortex-a7 libc/arm: add cortex-a7 cpu variant 2013-03-23 01:38:22 -07:00
cortex-a8 libc/arm: add cortex-a8 cpu variant 2013-05-15 20:13:28 -07:00
cortex-a9 __memcpy_chk: Fix signed cmp of unsigned values. 2013-09-10 17:34:03 -07:00
cortex-a15 Remove new aligned memcpy path for cortex-a15. 2013-10-15 14:54:02 -07:00
generic Create optimized __strcpy_chk/__strcat_chk. 2013-08-14 07:46:00 +00:00
include/machine Upgrade libm. 2013-02-01 14:51:19 -08:00
krait Modify prefetch for krait memcpy. 2013-10-15 15:44:00 -07:00
syscalls libc: add swapon and swapoff syscalls 2013-06-25 13:18:03 -07:00
arm.mk Create optimized __strcpy_chk/__strcat_chk. 2013-08-14 07:46:00 +00:00
syscalls.mk libc: add swapon and swapoff syscalls 2013-06-25 13:18:03 -07:00