Optimized Random Access File Downloading
Abstract
A significant amount of publicly available information is stored in files that can be downloaded via HTTP. To access it, users are forced to download the entire file, which is often an inefficient way to retrieve data. For example, when the file data is needed immediately, or when only a portion of the file needs to be accessed, conventional downloading results in undesirable delays and wasted network resources.
This article describes a method for optimizing the on-demand download of a file. It is based on downloading exactly those parts of the file that are currently being read by client software. To maintain a balance between the response time of the next read request and the total associated overhead, an efficient download method is proposed. It reduces the maximum waiting time for read request execution by an order of magnitude, with an average overhead of about 10%.
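To illustrate the general idea of downloading only the parts of a file that are actually read (this is a minimal sketch, not the authors' implementation; the URL, offsets, and chunk sizes are assumptions), a read request can be mapped to an HTTP Range request:

    # Sketch: serve a random-access read by fetching only the requested
    # byte range over HTTP (RFC 7233 Range requests), instead of the whole file.
    import urllib.request

    def read_range(url, offset, length):
        """Fetch bytes [offset, offset + length) of a remote file."""
        req = urllib.request.Request(url)
        req.add_header("Range", "bytes=%d-%d" % (offset, offset + length - 1))
        with urllib.request.urlopen(req) as resp:
            # A 206 (Partial Content) status means the server honoured the range.
            return resp.read()

    # Hypothetical usage: read 64 KiB starting at an offset of 1 MiB.
    # data = read_range("http://example.com/large.bin", 1024 * 1024, 64 * 1024)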
A mechanism for detecting sequential read requests and predicting future accesses to file fragments is also described; it reduces overhead by a factor of 2-10 for partial file reading. To this end, a probabilistic model is proposed for determining the size of the fragments that are most likely to be read sequentially. The model is based on the two-parameter Weibull distribution, and its parameters can be flexibly configured for effective use with different access patterns.
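By way of illustration only (the parameter names, threshold, and size limits below are assumptions, not values from the article), a prefetch fragment size can be derived from a two-parameter Weibull model of sequential-run lengths by solving the survival function for a target probability of sequential access:

    # Sketch: the Weibull survival function is P(run >= s) = exp(-(s / scale) ** shape),
    # so the largest fragment still read sequentially with probability >= p is
    # s = scale * (-ln(p)) ** (1 / shape).
    import math

    def prefetch_size(scale, shape, p=0.9,
                      min_size=64 * 1024, max_size=8 * 1024 * 1024):
        """Fragment size expected to be read sequentially with probability >= p."""
        s = scale * (-math.log(p)) ** (1.0 / shape)
        # Clamp to hypothetical lower and upper bounds on a single download.
        return max(min_size, min(max_size, int(s)))

    # Hypothetical parameters fitted to an observed access pattern:
    # print(prefetch_size(scale=512 * 1024, shape=1.3, p=0.9))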
Full Text: PDF (Russian)
ISSN: 2307-8162