
Stop-and-copy garbage collection in a Lisp architecture:[1] Memory is divided into working and free memory; new objects are allocated in the former. When it is full (depicted), garbage collection is performed: all data structures still in use are located by pointer tracing and copied into consecutive locations in free memory.
After that, the working memory contents are discarded in favor of the compressed copy, and the roles of working and free memory are exchanged (depicted).

In computer science, garbage collection (GC) is a form of automatic memory management.[2] The garbage collector attempts to reclaim memory that was allocated by the program, but is no longer referenced; such memory is called garbage. Garbage collection was invented by American computer scientist John McCarthy around 1959 to simplify manual memory management in Lisp.[3]

Garbage collection relieves the programmer of manual memory management, in which the programmer specifies which objects to de-allocate and return to the memory system, and when to do so.[2] Other, similar techniques include stack allocation, region inference, and memory ownership, and combinations thereof. Garbage collection may take a significant proportion of a program's total processing time and can affect performance as a result.

Resources other than memory, such as network sockets, database handles, windows, file descriptors, and device descriptors, are not typically handled by garbage collection, but rather by other methods (e.g. destructors). Some such methods also de-allocate memory.

Overview

Many programming languages require garbage collection, either as part of the language specification (e.g., RPL, Java, C#, D,[4] Go, and most scripting languages) or effectively for practical implementation (e.g., formal languages like lambda calculus).[5] These are said to be garbage-collected languages. Other languages, such as C and C++, were designed for use with manual memory management, but have garbage-collected implementations available. Some languages, like Ada, Modula-3, and C++/CLI, allow both garbage collection and manual memory management to co-exist in the same application by using separate heaps for collected and manually managed objects. Still others, like D, are garbage-collected but allow the user to manually delete objects or even disable garbage collection entirely when speed is required.[6]

Although many languages integrate GC into their compiler and runtime system, post-hoc GC systems also exist, such as Automatic Reference Counting (ARC). Some of these post-hoc GC systems do not require recompilation.[7]

Advantages

GC frees the programmer from manually de-allocating memory. This helps avoid some kinds of errors, illustrated in the sketch after the list below:[8]

  • Dangling pointers, which occur when a piece of memory is freed while there are still pointers to it, and one of those pointers is dereferenced. By then the memory may have been reassigned to another use, with unpredictable results.[9]
  • Double free bugs, which occur when the program tries to free a region of memory that has already been freed, and perhaps already been allocated again.
  • Certain kinds of memory leaks, in which a program fails to free memory occupied by objects that have become unreachable, which can lead to memory exhaustion.[10]
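
As an illustration, the following is a minimal C++ sketch of how these error classes arise under manual deallocation; the unsafe statements are left commented out so the sketch itself stays well defined. A garbage collector removes the need for the explicit free calls entirely.

    #include <cstdlib>

    // Hypothetical sketch of the error classes listed above, using manual
    // allocation and deallocation (malloc/free).
    int main() {
        int* p = static_cast<int*>(std::malloc(sizeof(int)));
        int* q = p;          // a second pointer (alias) to the same block
        std::free(p);
        // *q = 7;           // dangling pointer: q still refers to freed memory
        // std::free(q);     // double free: the block has already been released
        q = static_cast<int*>(std::malloc(sizeof(int)));
        return 0;            // memory leak: the second block is never freed
    }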

Disadvantages

GC uses computing resources to decide which memory to free. Therefore, the penalty for the convenience of not annotating object lifetimes manually in the source code is overhead, which can impair program performance.[11] A peer-reviewed paper from 2005 concluded that GC needs five times the memory to compensate for this overhead and to perform as fast as the same program using idealized explicit memory management. However, the comparison is made against a program whose deallocation calls were inserted by an oracle, implemented by collecting traces from runs under a profiler, so the resulting program is only correct for that particular execution.[12] Interaction with memory hierarchy effects can make this overhead intolerable in circumstances that are hard to predict or to detect in routine testing. The impact on performance was given by Apple as a reason for not adopting garbage collection in iOS, despite it being the most desired feature.[13]

The moment when the garbage is actually collected can be unpredictable, resulting in stalls (pauses to shift/free memory) scattered throughout a session. Unpredictable stalls can be unacceptable in real-time environments, in transaction processing, or in interactive programs. Incremental, concurrent, and real-time garbage collectors address these problems, with varying trade-offs.

Strategies

Tracing

Tracing garbage collection is the most common type of garbage collection, so much so that "garbage collection" often refers to tracing garbage collection, rather than other methods such as reference counting. The overall strategy consists of determining which objects should be garbage collected by tracing which objects are reachable by a chain of references from certain root objects, and considering the rest as garbage and collecting them. However, there are a large number of algorithms used in implementation, with widely varying complexity and performance characteristics.
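
As a concrete sketch of the idea, the following toy mark-and-sweep collector in C++ traces reachability from an explicit root set and frees everything else. The Object and Heap types and all member names are invented for illustration; a real collector would additionally deal with finalization, interior pointers, pause times, and concurrency.

    #include <algorithm>
    #include <memory>
    #include <vector>

    // A toy heap object that can reference other heap objects.
    struct Object {
        std::vector<Object*> references;  // outgoing edges in the object graph
        bool marked = false;              // set during the mark phase
    };

    // A toy heap that owns every allocated object and knows the root set.
    class Heap {
    public:
        Object* allocate() {
            objects_.push_back(std::make_unique<Object>());
            return objects_.back().get();
        }
        void add_root(Object* obj) { roots_.push_back(obj); }

        void collect() {
            // Mark phase: everything reachable from a root is marked.
            std::vector<Object*> worklist(roots_.begin(), roots_.end());
            while (!worklist.empty()) {
                Object* obj = worklist.back();
                worklist.pop_back();
                if (obj == nullptr || obj->marked) continue;
                obj->marked = true;
                for (Object* child : obj->references) worklist.push_back(child);
            }
            // Sweep phase: unmarked objects are garbage and are destroyed.
            objects_.erase(
                std::remove_if(objects_.begin(), objects_.end(),
                               [](const std::unique_ptr<Object>& o) { return !o->marked; }),
                objects_.end());
            for (auto& o : objects_) o->marked = false;  // reset for the next cycle
        }

    private:
        std::vector<std::unique_ptr<Object>> objects_;  // all allocated objects
        std::vector<Object*> roots_;                    // the root set
    };

For example, an object that is allocated but never linked from a root, or from any object reachable from a root, is reclaimed by the next call to collect().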

Reference counting

Reference counting garbage collection is where each object has a count of the number of references to it. Garbage is identified by having a reference count of zero. An object's reference count is incremented when a reference to it is created and decremented when a reference is destroyed. When the count reaches zero, the object's memory is reclaimed.[14]
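
A minimal sketch of this scheme in C++ follows, assuming a hypothetical intrusive RefCounted base class and a Ref<T> handle; in practice one would normally use std::shared_ptr, which implements the same idea with a separately allocated control block.

    #include <cstddef>
    #include <utility>

    // Hypothetical intrusive base class: each object stores its own count.
    class RefCounted {
    public:
        void retain() { ++count_; }
        void release() {
            if (--count_ == 0) delete this;  // count reached zero: reclaim now
        }
    protected:
        virtual ~RefCounted() = default;
    private:
        std::size_t count_ = 0;
    };

    // Hypothetical handle: creating a reference increments the count,
    // destroying or overwriting one decrements it.
    template <typename T>
    class Ref {
    public:
        explicit Ref(T* object = nullptr) : object_(object) {
            if (object_) object_->retain();
        }
        Ref(const Ref& other) : object_(other.object_) {
            if (object_) object_->retain();
        }
        Ref& operator=(Ref other) {       // copy-and-swap keeps the counts balanced
            std::swap(object_, other.object_);
            return *this;
        }
        ~Ref() {
            if (object_) object_->release();
        }
        T* operator->() const { return object_; }
    private:
        T* object_;
    };

    // Usage sketch:
    //   struct Node : RefCounted {};
    //   Ref<Node> a(new Node);   // count becomes 1
    //   { Ref<Node> b = a; }     // count 2 while b lives, back to 1 afterwards
    //   a = Ref<Node>();         // count reaches 0; the Node is deleted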

As with manual memory management, and unlike tracing garbage collection, reference counting guarantees that objects are destroyed as soon as their last reference is destroyed. It usually accesses only memory that is either in CPU caches, in objects to be freed, or directly pointed to by those, and thus tends not to have significant negative side effects on CPU cache and virtual memory operation.

There are a number of disadvantages to reference counting; these can generally be solved or mitigated by more sophisticated algorithms:

Cycles
If two or more objects refer to each other, they can create a cycle whereby neither will be collected as their mutual references never let their reference counts become zero. Some garbage collection systems using reference counting (like the one in CPython) use specific cycle-detecting algorithms to deal with this issue.[15] Another strategy is to use weak references for the "backpointers" which create cycles. Under reference counting, a weak reference is similar to a weak reference under a tracing garbage collector. It is a special reference object whose existence does not increment the reference count of the referent object. Furthermore, a weak reference is safe in that when the referent object becomes garbage, any weak reference to it lapses, rather than being permitted to remain dangling, meaning that it turns into a predictable value, such as a null reference. A minimal illustration of this strategy appears in the sketch after this list.
Space overhead (reference count)
Reference counting requires space to be allocated for each object to store its reference count. The count may be stored adjacent to the object's memory or in a side table somewhere else, but in either case, every single reference-counted object requires additional storage for its reference count. Memory space with the size of an unsigned pointer is commonly used for this task, meaning that 32 or 64 bits of reference count storage must be allocated for each object. On some systems, it may be possible to mitigate this overhead by using a tagged pointer to store the reference count in unused areas of the object's memory. Often, an architecture does not actually allow programs to access the full range of memory addresses that could be stored in its native pointer size; a certain number of high bits in the address is either ignored or required to be zero. If an object reliably has a pointer at a certain location, the reference count can be stored in the unused bits of the pointer. For example, each object in Objective-C has a pointer to its class at the beginning of its memory; on the ARM64 architecture using iOS 7, 19 unused bits of this class pointer are used to store the object's reference count.[16][17]
Speed overhead (increment/decrement)
In naive implementations, each assignment of a reference and each reference falling out of scope often require modifications of one or more reference counters. However, in a common case when a reference is copied from an outer scope variable into an inner scope variable, such that the lifetime of the inner variable is bounded by the lifetime of the outer one, the reference incrementing can be eliminated. The outer variable "owns" the reference. In the programming language C++, this technique is readily implemented and demonstrated with the use of const references (see the sketch after this list). Reference counting in C++ is usually implemented using "smart pointers"[18] whose constructors, destructors, and assignment operators manage the references. A smart pointer can be passed by reference to a function, which avoids the need to copy-construct a new smart pointer (which would increase the reference count on entry into the function and decrease it on exit). Instead, the function receives a reference to the smart pointer which is produced inexpensively. The Deutsch-Bobrow method of reference counting capitalizes on the fact that most reference count updates are in fact generated by references stored in local variables. It ignores these references, only counting references in the heap, but before an object with reference count zero can be deleted, the system must verify with a scan of the stack and registers that no other reference to it still exists. A further substantial decrease in the overhead on counter updates can be obtained by update coalescing introduced by Levanoni and Petrank.[19][20] Consider a pointer that in a given interval of the execution is updated several times. It first points to an object O1, then to an object O2, and so forth until at the end of the interval it points to some object On. A reference counting algorithm would typically execute rc(O1)--, rc(O2)++, rc(O2)--, rc(O3)++, rc(O3)--, ..., rc(On)++. But most of these updates are redundant. In order to have the reference count properly evaluated at the end of the interval it is enough to perform rc(O1)-- and rc(On)++. Levanoni and Petrank measured an elimination of more than 99% of the counter updates in typical Java benchmarks.
Requires atomicity
When used in a multithreaded environment, these modifications (increment and decrement) may need to be atomic operations such as compare-and-swap, at least for any objects which are shared, or potentially shared among multiple threads. Atomic operations are expensive on a multiprocessor, and even more expensive if they have to be emulated with software algorithms. It is possible to avoid this issue by adding per-thread or per-CPU reference counts and only accessing the global reference count when the local reference counts become or are no longer zero (or, alternatively, using a binary tree of reference counts, or even giving up deterministic destruction in exchange for not having a global reference count at all), but this adds significant memory overhead and thus tends to be only useful in special cases (it is used, for example, in the reference counting of Linux kernel modules). Update coalescing by Levanoni and Petrank[19][20] can be used to eliminate all atomic operations from the write-barrier. Counters are never updated by the program threads in the course of program execution. They are only modified by the collector which executes as a single additional thread with no synchronization. This method can be used as a stop-the-world mechanism for parallel programs, and also with a concurrent reference counting collector.
Not real-time
Naive implementations of reference counting do not generally provide real-time behavior, because any pointer assignment can potentially cause a number of objects bounded only by total allocated memory size to be recursively freed while the thread is unable to perform other work. It is possible to avoid this issue by delegating the freeing of unreferenced objects to other threads, at the cost of extra overhead.
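
Both the cycle problem with its weak-reference fix and the const-reference passing idiom mentioned above can be sketched with the C++ standard library's reference-counted std::shared_ptr and its companion weak handle std::weak_ptr, used here as a stand-in for reference-counting systems in general:

    #include <memory>

    struct Node {
        std::shared_ptr<Node> child;   // owning ("strong") reference: counted
        std::weak_ptr<Node>   parent;  // back-pointer as a weak reference: not counted
    };

    // Passing the smart pointer by const reference avoids the count increment
    // on entry and the decrement on exit that a by-value parameter would incur.
    bool has_child(const std::shared_ptr<Node>& node) {
        return node && node->child != nullptr;
    }

    int main() {
        auto root = std::make_shared<Node>();
        root->child = std::make_shared<Node>();
        root->child->parent = root;    // weak back-pointer: no cycle of counted references
        has_child(root);
        // When root goes out of scope, both nodes are destroyed. Had Node::parent
        // been a shared_ptr instead, the two strong references would keep each
        // other alive and neither node would ever be freed.
    }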

Escape analysis

Escape analysis is a compile-time technique that can convert heap allocations to stack allocations, thereby reducing the amount of garbage collection to be done. This analysis determines whether an object allocated inside a function is accessible outside of it. If a function-local allocation is found to be accessible to another function or thread, the allocation is said to "escape" and cannot be done on the stack. Otherwise, the object may be allocated directly on the stack and released when the function returns, bypassing the heap and associated memory management costs.[21]
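
A sketch of the distinction in C++ terms (the functions are invented for illustration, and whether any particular compiler actually performs such a heap-to-stack conversion is an implementation detail):

    #include <memory>

    struct Point { double x, y; };

    // The Point does not escape: it is created, used, and discarded inside the
    // function, so an escape-analysis pass may place it on the stack instead of
    // the heap, and no garbage is produced.
    double squared_norm_local() {
        auto p = std::make_unique<Point>(Point{3.0, 4.0});
        return p->x * p->x + p->y * p->y;
    }

    // The Point escapes: ownership is handed back to the caller, so the object
    // must outlive the function and has to remain a heap allocation.
    std::unique_ptr<Point> make_point() {
        auto p = std::make_unique<Point>(Point{3.0, 4.0});
        return p;
    }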

Availability

Generally speaking, higher-level programming languages are more likely to have garbage collection as a standard feature. In some languages lacking built-in garbage collection, it can be added through a library, as with the Boehm garbage collector for C and C++.

Most functional programming languages, such as ML, Haskell, and APL, have garbage collection built in. Lisp is especially notable as both the first functional programming language and the first language to introduce garbage collection.[22]

Other dynamic languages, such as Ruby and Julia (but not Perl 5 or PHP before version 5.3,[23] which both use reference counting), as well as JavaScript and ECMAScript, also tend to use GC. Object-oriented programming languages such as Smalltalk, ooRexx, RPL, and Java usually provide integrated garbage collection. Notable exceptions are C++ and Delphi, which have destructors.

BASIC

BASIC and Logo have often used garbage collection for variable-length data types, such as strings and lists, so as not to burden programmers with memory management details. On the Altair 8800, programs with many string variables and little string space could cause long pauses due to garbage collection.[24] Similarly, the Applesoft BASIC interpreter's garbage collection algorithm repeatedly scans the string descriptors for the string having the highest address in order to compact it toward high memory, resulting in poor performance[25] and pauses anywhere from a few seconds to a few minutes.[26] A replacement garbage collector for Applesoft BASIC by Randy Wigginton identifies a group of strings in every pass over the heap, reducing collection time dramatically.[27] BASIC.SYSTEM, released with ProDOS in 1983, provides a windowing garbage collector for BASIC that is many times faster.[28]

Objective-C

While Objective-C traditionally had no garbage collection, with the release of OS X 10.5 in 2007 Apple introduced garbage collection for Objective-C 2.0, using an in-house developed runtime collector.[29] However, with the 2012 release of OS X 10.8, garbage collection was deprecated in favor of LLVM's automatic reference counting (ARC), which had been introduced with OS X 10.7.[30] Furthermore, since May 2015 Apple has even forbidden the use of garbage collection for new OS X applications in the App Store.[31][32] For iOS, garbage collection has never been introduced due to problems in application responsiveness and performance;[13][33] instead, iOS uses ARC.[34][35]

Limited environments

Garbage collection is rarely used on embedded or real-time systems because of the usual need for very tight control over the use of limited resources. However, garbage collectors compatible with many limited environments have been developed.[36] The Microsoft .NET Micro Framework, .NET nanoFramework[37] and Java Platform, Micro Edition are embedded software platforms that, like their larger cousins, include garbage collection.

Java

Garbage collectors available in OpenJDK's Java virtual machine (JVM) include:

  • Serial
  • Parallel
  • CMS (Concurrent Mark Sweep)
  • G1 (Garbage-First)
  • ZGC (Z Garbage Collector)
  • Epsilon
  • Shenandoah
  • GenZGC (Generational ZGC)
  • GenShen (Generational Shenandoah)
  • IBM Metronome (only in IBM OpenJDK)
  • SAP (only in SAP OpenJDK)
  • Azul C4 (Continuously Concurrent Compacting Collector)[38] (only in Azul Systems OpenJDK)

Compile-time use

Compile-time garbage collection is a form of static analysis allowing memory to be reused and reclaimed based on invariants known during compilation.

This form of garbage collection has been studied in the Mercury programming language,[39] and it saw greater usage with the introduction of LLVM's automatic reference counting (ARC) into Apple's ecosystem (iOS and OS X) in 2011.[34][35][31]

Real-time systems

Incremental, concurrent, and real-time garbage collectors have been developed, for example by Henry Baker and by Henry Lieberman.[40][41][42]

In Baker's algorithm, allocation is done in either half of a single region of memory. When it becomes half full, a garbage collection is performed which moves the live objects into the other half, and the remaining objects are implicitly deallocated. The running program (the 'mutator') has to check that any object it references is in the correct half, and if not, move it across, while a background task is finding all of the objects.[43]
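
As a rough sketch of the copying machinery involved, the following C++ toy performs the non-incremental ("stop-the-world") semi-space copy that Baker's scheme incrementalizes: live objects reachable from registered roots are evacuated to the other half via forwarding pointers, Cheney-style, and everything left behind is implicitly discarded. All type and member names here are invented for illustration, and the incremental read barrier that distinguishes Baker's algorithm is omitted.

    #include <cstddef>
    #include <vector>

    // A toy object: a fixed number of reference slots plus a forwarding pointer
    // that is set once the object has been copied to the other half.
    struct Obj {
        Obj* slots[2] = {nullptr, nullptr};
        Obj* forwarded = nullptr;
    };

    class SemiSpaceHeap {
    public:
        explicit SemiSpaceHeap(std::size_t capacity) : from_(capacity), to_(capacity) {}

        // Register the address of a root pointer so it can be updated on a flip.
        void add_root(Obj** root) { roots_.push_back(root); }

        // Allocate from the current half; collect when it is full. Pointers that
        // are not registered as roots (or reachable from them) do not survive.
        Obj* allocate() {
            if (allocated_ == from_.size()) collect();
            if (allocated_ == from_.size()) return nullptr;  // heap genuinely full
            from_[allocated_] = Obj{};                       // fresh object
            return &from_[allocated_++];
        }

    private:
        // Copy one object into to-space exactly once, leaving a forwarding pointer.
        Obj* copy(Obj* obj) {
            if (obj == nullptr) return nullptr;
            if (obj->forwarded == nullptr) {
                to_[next_] = *obj;                // shallow copy into to-space
                to_[next_].forwarded = nullptr;
                obj->forwarded = &to_[next_++];
            }
            return obj->forwarded;
        }

        void collect() {
            next_ = 0;
            for (Obj** root : roots_) *root = copy(*root);    // evacuate the roots
            for (std::size_t scan = 0; scan < next_; ++scan)  // Cheney scan of to-space
                for (Obj*& slot : to_[scan].slots) slot = copy(slot);
            from_.swap(to_);          // flip: the copy becomes the new working half
            allocated_ = next_;       // everything not copied is implicitly discarded
        }

        std::vector<Obj> from_, to_;
        std::vector<Obj**> roots_;
        std::size_t allocated_ = 0;
        std::size_t next_ = 0;
    };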

Generational garbage collection schemes are based on the empirical observation that most objects die young. In generational garbage collection, two or more allocation regions (generations) are kept, segregated by the age of the objects they contain. New objects are created in the "young" generation, which is collected regularly; when a generation is full, the objects that are still referenced from older regions are copied into the next-oldest generation. Occasionally a full scan is performed.
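
A highly simplified sketch of a minor (young-generation) collection in C++ follows. In a real collector the old-to-young references would be tracked with a write barrier and a remembered set rather than by scanning the whole old generation, and promotion would usually require surviving more than one collection; all names are invented for illustration.

    #include <memory>
    #include <vector>

    struct GenObject {
        std::vector<GenObject*> references;  // outgoing edges
        bool marked = false;
    };

    // Two generations held as separate regions. Young objects that survive a
    // minor collection are promoted to the old generation; a full collection
    // (not shown) would also scan and sweep the old generation.
    class GenerationalHeap {
    public:
        GenObject* allocate() {                  // new objects start out young
            young_.push_back(std::make_unique<GenObject>());
            return young_.back().get();
        }
        void add_root(GenObject* obj) { roots_.push_back(obj); }

        void minor_collect() {
            // Treat the roots and all old-to-young references as starting points.
            std::vector<GenObject*> worklist(roots_.begin(), roots_.end());
            for (const auto& old_obj : old_)
                for (GenObject* child : old_obj->references) worklist.push_back(child);
            // Mark everything reachable from those starting points.
            while (!worklist.empty()) {
                GenObject* obj = worklist.back();
                worklist.pop_back();
                if (obj == nullptr || obj->marked) continue;
                obj->marked = true;
                for (GenObject* child : obj->references) worklist.push_back(child);
            }
            // Promote surviving young objects; the rest are freed with young_.
            for (auto& obj : young_)
                if (obj->marked) old_.push_back(std::move(obj));
            young_.clear();
            for (auto& obj : old_) obj->marked = false;  // reset marks
        }

    private:
        std::vector<std::unique_ptr<GenObject>> young_, old_;
        std::vector<GenObject*> roots_;
    };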

Some high-level language computer architectures include hardware support for real-time garbage collection.

Most implementations of real-time garbage collectors use tracing.[citation needed] Such real-time garbage collectors meet hard real-time constraints when used with a real-time operating system.[44]

References

  1. ^ Abelson, Harold; Sussman, Gerald Jay; Sussman, Julie (2016). Structure and Interpretation of Computer Programs (PDF) (2nd ed.). Cambridge, Massachusetts, US: MIT Press. pp. 734–736.
  2. ^ a b "What is garbage collection (GC) in programming?". Storage. Retrieved 2025-08-07.
  3. ^ McCarthy, John (1960). "Recursive functions of symbolic expressions and their computation by machine, Part I". Communications of the ACM. 3 (4): 184–195. doi:10.1145/367177.367199. S2CID 1489409. Retrieved 2025-08-07.
  4. ^ "Overview – D Programming Language". dlang.org. Digital Mars. Retrieved 2025-08-07.
  5. ^ Heller, Martin (2025-08-07). "What is garbage collection? Automated memory management for your programs". InfoWorld. Retrieved 2025-08-07.
  6. ^ "A Guide to Garbage Collection in Programming". freeCodeCamp.org. 2025-08-07. Retrieved 2025-08-07.
  7. ^ "Garbage Collection - D Programming Language". dlang.org. Retrieved 2025-08-07.
  8. ^ "Garbage Collection". rebelsky.cs.grinnell.edu. Retrieved 2025-08-07.
  9. ^ Heller, Martin (2025-08-07). "What is garbage collection? Automated memory management for your programs". InfoWorld. Retrieved 2025-08-07.
  10. ^ Microsoft (2025-08-07). "Fundamentals of garbage collection | Microsoft Learn". Retrieved 2025-08-07.
  11. ^ Zorn, Benjamin (2025-08-07). "The Measured Cost of Conservative Garbage Collection". Software: Practice and Experience. 23 (7). Department of Computer Science, University of Colorado Boulder: 733–756. CiteSeerX 10.1.1.14.1816. doi:10.1002/spe.4380230704. S2CID 16182444.
  12. ^ Hertz, Matthew; Berger, Emery D. (2005). "Quantifying the Performance of Garbage Collection vs. Explicit Memory Management" (PDF). Proceedings of the 20th Annual ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications - OOPSLA '05. pp. 313–326. doi:10.1145/1094811.1094836. ISBN 1-59593031-0. S2CID 6570650. Archived (PDF) from the original on 2025-08-07. Retrieved 2025-08-07.
  13. ^ a b "Developer Tools Kickoff – session 300" (PDF). WWDC 2011. Apple, Inc. 2025-08-07. Archived from the original (PDF) on 2025-08-07. Retrieved 2025-08-07.
  14. ^ Microsoft (2025-08-07). "Reference Counting Garbage Collection". Retrieved 2025-08-07.
  15. ^ "Reference Counts". Extending and Embedding the Python Interpreter. 2025-08-07. Retrieved 2025-08-07.
  16. ^ Ash, Mike. "Friday Q&A 2025-08-07: ARM64 and You". mikeash.com. Retrieved 2025-08-07.
  17. ^ "Hamster Emporium: [objc explain]: Non-pointer isa". Sealiesoftware.com. 2025-08-07. Retrieved 2025-08-07.
  18. ^ Pibinger, Roland (2025-08-07) [2025-08-07]. "RAII, Dynamic Objects, and Factories in C++".
  19. ^ a b Levanoni, Yossi; Petrank, Erez (2001). "An on-the-fly reference-counting garbage collector for java". Proceedings of the 16th ACM SIGPLAN Conference on Object-Oriented Programming, Systems, Languages, and Applications. OOPSLA 2001. pp. 367–380. doi:10.1145/504282.504309.
  20. ^ a b Levanoni, Yossi; Petrank, Erez (2006). "An on-the-fly reference-counting garbage collector for java". ACM Trans. Program. Lang. Syst. 28: 31–69. CiteSeerX 10.1.1.15.9106. doi:10.1145/1111596.1111597. S2CID 14777709.
  21. ^ Salagnac, Guillaume; Yovine, Sergio; Garbervetsky, Diego (2025-08-07). "Fast Escape Analysis for Region-based Memory Management". Electronic Notes in Theoretical Computer Science. 131: 99–110. doi:10.1016/j.entcs.2005.01.026.
  22. ^ Chisnall, David (2025-08-07). Influential Programming Languages, Part 4: Lisp.
  23. ^ "PHP: Performance Considerations". php.net. Retrieved 2025-08-07.
  24. ^ "Altair 8800 Basic 4.1 Reference Manual" (PDF). The Vintage Technology Digital Archive. April 1977. p. 108. Archived (PDF) from the original on 2025-08-07. Retrieved 2025-08-07.
  25. ^ "I did some work to speed up string garbage collection under Applesoft..." Hacker News. Retrieved 2025-08-07.
  26. ^ Little, Gary B. (1985). Inside the Apple IIc. Bowie, Md.: Brady Communications Co. p. 82. ISBN 0-89303-564-5. Retrieved 2025-08-07.
  27. ^ "Fast Garbage Collection". Call-A.P.P.L.E.: 40–45. January 1981.
  28. ^ Worth, Don (1984). Beneath Apple Pro DOS (PDF) (March 1985 printing ed.). Chatsworth, California, US: Quality Software. pp. 2–6. ISBN 0-912985-05-4. Archived (PDF) from the original on 2025-08-07. Retrieved 2025-08-07.
  29. ^ "Objective-C 2.0 Overview". Archived from the original on 2025-08-07.
  30. ^ Siracusa, John (2025-08-07). "Mac OS X 10.7 Lion: the Ars Technica review".
  31. ^ a b "Apple says Mac app makers must transition to ARC memory management by May". AppleInsider. 2025-08-07.
  32. ^ Cichon, Waldemar (2025-08-07). "App Store: Apple entfernt Programme mit Garbage Collection". Heise.de. Retrieved 2025-08-07.
  33. ^ Silva, Precious (2025-08-07). "iOS 8 vs Android 5.0 Lollipop: Apple Kills Google with Memory Efficiency". International Business Times. Archived from the original on 2025-08-07. Retrieved 2025-08-07.
  34. ^ a b Napier, Rob; Kumar, Mugunth (2025-08-07). iOS 6 Programming Pushing the Limit. John Wiley & Sons. ISBN 978-1-11844997-4. Retrieved 2025-08-07.
  35. ^ a b Cruz, José R. C. (2025-08-07). "Automatic Reference Counting on iOS". Dr. Dobbs. Archived from the original on 2025-08-07. Retrieved 2025-08-07.
  36. ^ Fu, Wei; Hauser, Carl (2005). "A real-time garbage collection framework for embedded systems". Proceedings of the 2005 Workshop on Software and Compilers for Embedded Systems - SCOPES '05. pp. 20–26. doi:10.1145/1140389.1140392. ISBN 1-59593207-0. S2CID 8635481.
  37. ^ ".NET nanoFramework".
  38. ^ Tene, Gil; Iyengar, Balaji; Wolf, Michael (2011). "C4: the continuously concurrent compacting collector" (PDF). ISMM '11: Proceedings of the international symposium on Memory management. doi:10.1145/1993478. ISBN 978-1-45030263-0. Archived (PDF) from the original on 2025-08-07.
  39. ^ Mazur, Nancy (May 2004). Compile-time garbage collection for the declarative language Mercury (PDF) (Thesis). Katholieke Universiteit Leuven. Archived (PDF) from the original on 2025-08-07.
  40. ^ Huelsbergen, Lorenz; Winterbottom, Phil (1998). "Very concurrent mark-&-sweep garbage collection without fine-grain synchronization" (PDF). Proceedings of the First International Symposium on Memory Management - ISMM '98. pp. 166–175. doi:10.1145/286860.286878. ISBN 1-58113114-3. S2CID 14399427. Archived (PDF) from the original on 2025-08-07.
  41. ^ "GC FAQ".
  42. ^ Lieberman, Henry; Hewitt, Carl (1983). "A real-time garbage collector based on the lifetimes of objects". Communications of the ACM. 26 (6): 419–429. doi:10.1145/358141.358147. hdl:1721.1/6335. S2CID 14161480.
  43. ^ Baker, Henry G. (1978). "List processing in real time on a serial computer". Communications of the ACM. 21 (4): 280–294. doi:10.1145/359460.359470. hdl:1721.1/41976. S2CID 17661259. see also description
  44. ^ McCloskey; Bacon; Cheng; Grove (2008), Staccato: A Parallel and Concurrent Real-time Compacting Garbage Collector for Multiprocessors (PDF), archived (PDF) from the original on 2025-08-07
