Search results for: intelligent computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1678

298 Review of Theories and Applications of Genetic Programming in Sediment Yield Modeling

Authors: Adesoji Tunbosun Jaiyeola, Josiah Adeyemo

Abstract:

Sediment yield can be considered to be the total sediment load that leaves a drainage basin. Knowledge of the quantity of sediment present in a river at a particular time can lead to better management of flood capacity in reservoirs and consequently help to control over-bank flooding. Furthermore, as sediment accumulates in a reservoir, the reservoir gradually loses its ability to store water for the purposes for which it was built. The development of hydrological models to forecast the quantity of sediment present in a reservoir helps planners and managers of water resources systems to understand the system better in terms of its problems and the alternative ways to address them. The application of artificial intelligence models and techniques to such real-life situations has proven to be an effective approach to solving complex problems. This paper makes an extensive review of the literature relevant to the theories and applications of evolutionary algorithms, and most especially genetic programming. Successful applications of genetic programming as a soft computing technique are reviewed in sediment modelling and other branches of knowledge. Some fundamental issues, such as benchmarks, generalization ability, bloat and over-fitting, and other open issues relating to the working principles of GP that need to be addressed by the GP community, are also highlighted. This review aims to give GP theoreticians, researchers, and the wider GP community clear research directions and a valuable guide, and to keep all stakeholders abreast of the issues that need attention during the next decade for the advancement of GP.
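As an illustration of the evolutionary loop that the review surveys, the following minimal sketch evolves small expression trees for a toy single-input symbolic-regression task (e.g., rainfall versus sediment yield). The function set, tree depths, rates, and synthetic data are illustrative assumptions only, not a model taken from the reviewed literature.

```python
import copy
import operator
import random

FUNCS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def random_tree(depth=3):
    # Grow a random expression tree over the terminal set {x, constant}.
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else round(random.uniform(-1, 1), 2)
    return [random.choice(list(FUNCS)), random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return FUNCS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree, data):
    # Mean squared error over the training data; lower is better.
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data) / len(data)

def subtree_paths(tree, path=()):
    yield path
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtree_paths(child, path + (i,))

def get_subtree(tree, path):
    for i in path:
        tree = tree[i]
    return tree

def set_subtree(tree, path, new):
    if not path:
        return new
    node = tree
    for i in path[:-1]:
        node = node[i]
    node[path[-1]] = new
    return tree

def crossover(a, b):
    # Replace a random subtree of a copy of `a` with a random subtree of `b`.
    a = copy.deepcopy(a)
    pa = random.choice(list(subtree_paths(a)))
    pb = random.choice(list(subtree_paths(b)))
    return set_subtree(a, pa, copy.deepcopy(get_subtree(b, pb)))

def mutate(tree):
    # Replace a random subtree with a freshly grown one.
    tree = copy.deepcopy(tree)
    return set_subtree(tree, random.choice(list(subtree_paths(tree))), random_tree(2))

if __name__ == "__main__":
    random.seed(1)
    data = [(x / 10, 0.8 * (x / 10) ** 2 + 0.1) for x in range(20)]   # toy data set
    pop = [random_tree() for _ in range(60)]
    for _ in range(30):
        pop.sort(key=lambda t: fitness(t, data))
        parents = pop[:20]                                            # truncation selection
        pop = parents + [mutate(random.choice(parents)) if random.random() < 0.3
                         else crossover(*random.sample(parents, 2)) for _ in range(40)]
    best = min(pop, key=lambda t: fitness(t, data))
    print("best MSE:", round(fitness(best, data), 4), "tree:", best)
```

Bloat control and over-fitting checks (e.g., depth limits and a held-out validation set), highlighted as open issues above, would slot into the fitness and variation steps of such a loop.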

Keywords: benchmark, bloat, generalization, genetic programming, over-fitting, sediment yield

Procedia PDF Downloads 414
297 An Investigation of Performance Versus Security in Cognitive Radio Networks with Supporting Cloud Platforms

Authors: Kurniawan D. Irianto, Demetres D. Kouvatsos

Abstract:

The growth of wireless devices affects the availability of limited frequencies or spectrum bands, as spectrum is a natural resource that cannot be expanded. Many studies of the available spectrum have been done, and they show that licensed frequencies are idle most of the time. Cognitive radio is one of the solutions to this problem. Cognitive radio is a promising technology that allows unlicensed users, known as secondary users (SUs), to access licensed bands without causing interference to licensed users, or primary users (PUs). As cloud computing has become popular in recent years, cognitive radio networks (CRNs) can be integrated with cloud platforms. One of the important issues in CRNs is security. It becomes a problem since CRNs use radio frequencies as the transmission medium and therefore share the same issues as other wireless communication systems. Another critical issue in CRNs is performance. Security has an adverse effect on performance, and there are trade-offs between them. The goal of this paper is to investigate the performance-versus-security trade-off in CRNs with supporting cloud platforms. Furthermore, queuing network models with preemptive resume and preemptive repeat identical priority are applied in this project to measure the impact of security on performance in CRNs with or without a cloud platform. The generalized exponential (GE) type distribution is used to reflect the bursty inter-arrival and service times at the servers. The results show that the best performance is obtained when security is disabled and the cloud platform is enabled.
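As a small illustration of the GE-type traffic model mentioned above, the sketch below draws inter-arrival (or service) times from the common two-phase form of the generalized exponential distribution. The rate and squared coefficient of variation are arbitrary example values, and a full preemptive-priority queueing simulation is beyond this sketch.

```python
import random

def ge_sample(rate, scv, rng=random):
    """Draw one inter-arrival (or service) time from a generalized exponential (GE)
    distribution with mean 1/rate and squared coefficient of variation scv.
    Two-phase form: with probability 1 - tau the sample is 0 (a batch arrival),
    otherwise it is exponential with parameter tau * rate."""
    tau = 2.0 / (scv + 1.0)
    if rng.random() > tau:
        return 0.0
    return rng.expovariate(tau * rate)

# Quick sanity check of the first two moments (illustrative only).
samples = [ge_sample(rate=1.0, scv=4.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"mean ≈ {mean:.3f} (target 1.0), SCV ≈ {var / mean**2:.3f} (target 4.0)")
```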

Keywords: performance vs. security, cognitive radio networks, cloud platforms, GE-type distribution

Procedia PDF Downloads 325
296 Humanity's Still Sub-Quantum Core-Self Intelligence

Authors: Andrew Shugyo Daijo Bonnici

Abstract:

Core-Self Intelligence (CSI) is an absolutely still, non-verbal, non-cerebral intelligence. Our still core-self intelligence is felt at our body's center point of gravity, just an inch below our navel, deep within our lower abdomen. The still sub-quantum depth of core-Self remains untouched by the conditioning influences of family, society, culture, religion, and spiritual views that shape our personalities and ego-self identities. As core-Self intelligence is inborn and unconditioned, it exists within all human beings regardless of age, race, color, creed, mental acuity, or national origin. Our core-self intelligence functions as a wise and compassionate guide that advances our health and well-being, our mental clarity and emotional resiliency, our fearless peace and behavioral wisdom, and our ever-deepening compassion for self and others. Although our core-Self, with its absolutely still non-judgmental intelligence, operates far beneath the functioning of our ego-self identity and our thinking mind, it effectively coexists with our passing thoughts, all of our figuring and thinking, our logical and rational way of knowing, the ebb and flow of our feelings, and the natural or triggered emergence of our emotions. When we allow our whole inner somatic awareness to gently sink into the intelligent center point of gravity within our lower abdomen, the felt arising of our core-Self’s inborn stillness has a serene and relaxing effect on our ego-self and thinking mind. It naturally slows down the speedy passage of our involuntary thoughts, diminishes our ego-self's defensive and reactive functioning, and decreases narcissistic reflections on I, me, and mine. All of these healthy cognitive benefits advance our innate wisdom and compassion, facilitate our personal and interpersonal growth, and liberate the ever-fresh wonder and curiosity of our beginner's heartmind. In conclusion, by studying, exploring, and researching our core-Self intelligence, psychologists and psychotherapists can unlock new avenues for advancing the farther reaches of our mental, emotional, and spiritual health and well-being, our innate behavioral wisdom and boundless empathy, our lucid compassion for self and others, and our unwavering confidence in the still guiding light of our core-Self that exists at the abdominal center point of all human beings.

Keywords: intelligence, transpersonal, beginner’s heartmind, compassionate wisdom

Procedia PDF Downloads 37
295 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields. These artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management. Furthermore, it acquires historical disaster information using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from occurrence to progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc., is mainly managed in the form of structured and unstructured reports, which exist as handouts or hard copies. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data as disaster information. In this paper, an Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images, and reports, printed or generated by scanners, into electronic documents. The converted disaster data are then organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information based on OCR for unstructured data is an important element of smart disaster management. In this work, a character recognition rate of over 90% was achieved for Korean characters by using an upgraded OCR. The recognition rate depends on the fonts, size, and special symbols of the characters, and it was improved through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system. The disaster code system ensures that the structured information can be stored and retrieved over the entire disaster cycle, including historical disaster progress, damages, response, and recovery. The expected outcome of this research is that it can be applied to smart disaster management and decision making by combining artificial intelligence technologies and historical big data.
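A minimal sketch of the pipeline described above, using the open-source Tesseract engine (via pytesseract) as a stand-in for the upgraded OCR and a made-up keyword-to-disaster-code table; the file name, keywords, and codes are hypothetical.

```python
from PIL import Image
import pytesseract

DISASTER_CODES = {            # hypothetical disaster code system
    "flood": "NAT-01",
    "typhoon": "NAT-02",
    "earthquake": "NAT-03",
    "fire": "SOC-01",
}

def scan_report(image_path: str) -> dict:
    # 1. Convert the scanned hard copy into plain text (Korean + English).
    text = pytesseract.image_to_string(Image.open(image_path), lang="kor+eng")
    # 2. Assign disaster codes by simple keyword matching on the extracted text.
    lowered = text.lower()
    codes = [code for kw, code in DISASTER_CODES.items() if kw in lowered]
    # 3. Return a structured record ready to be stored in the disaster database.
    return {"source": image_path, "codes": codes or ["UNCLASSIFIED"], "text": text}

if __name__ == "__main__":
    record = scan_report("scanned_report.png")   # placeholder file name
    print(record["codes"])
```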

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 101
294 Extended Kalman Filter and Markov Chain Monte Carlo Method for Uncertainty Estimation: Application to X-Ray Fluorescence Machine Calibration and Metal Testing

Authors: S. Bouhouche, R. Drai, J. Bast

Abstract:

This paper is concerned with a method for uncertainty evaluation of steel sample content using the X-ray fluorescence method. The considered method of analysis is a comparative technique based on X-ray fluorescence; the calibration step assumes an adequate chemical composition of the analyzed metallic sample. A new combined approach using the Kalman filter and Markov Chain Monte Carlo (MCMC) is proposed in this work for the uncertainty estimation of steel content analysis. The Kalman filter algorithm is extended to the model identification of the chemical analysis process using the main factors affecting the analysis results; in this case, the estimated states are reduced to the model parameters. MCMC is a stochastic method that computes the statistical properties of the considered states, such as the probability distribution function (PDF), according to the initial state and the target distribution, using a Monte Carlo simulation algorithm. The conventional approach is based on linear correlation; the uncertainty budget is established for the steel Mn (wt%), Cr (wt%), Ni (wt%), and Mo (wt%) contents, respectively. A comparative study between the conventional procedure and the proposed method is given. This kind of approach is applied to construct an accurate computing procedure for uncertainty measurement.
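The following sketch illustrates only the MCMC step on a toy calibration problem: a random-walk Metropolis sampler estimating the posterior distribution of a single calibration slope (e.g., for Mn wt%) from synthetic intensity/content pairs. The Gaussian likelihood, flat prior, and data are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
intensity = np.linspace(0.2, 2.0, 15)                     # XRF line intensities (a.u.)
content = 0.85 * intensity + rng.normal(0, 0.02, 15)      # synthetic Mn wt% with noise

def log_post(slope, sigma=0.02):
    # Gaussian likelihood of the residuals, flat prior on the slope.
    resid = content - slope * intensity
    return -0.5 * np.sum((resid / sigma) ** 2)

chain, slope = [], 1.0
for _ in range(20_000):
    prop = slope + rng.normal(0, 0.01)                    # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(slope):
        slope = prop                                      # Metropolis accept/reject
    chain.append(slope)

post = np.array(chain[5000:])                             # drop burn-in
print(f"slope = {post.mean():.3f} ± {post.std():.3f} "
      f"(95% interval {np.percentile(post, 2.5):.3f}–{np.percentile(post, 97.5):.3f})")
```

The spread of the sampled chain is what feeds the uncertainty budget: the posterior standard deviation and percentiles play the role of the standard uncertainty and coverage interval for the calibrated content.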

Keywords: Kalman filter, Markov chain Monte Carlo, x-ray fluorescence calibration and testing, steel content measurement, uncertainty measurement

Procedia PDF Downloads 260
293 Renewable Energy and Environment: Design of a Decision Aided Tool for Sustainable Development

Authors: Mustapha Ouardouz, Mina Amharref, Abdessamed Bernoussi

Abstract:

For countries with limited energy resources, the energy future lies in renewable energies (solar, wind, etc.). Renewable energies constitute a major component of the energy strategy to cover a substantial part of the growing needs and to contribute to environmental protection by replacing fossil fuels. Indeed, sustainable development involves the promotion of renewable energy and the preservation of the environment through the use of clean energy technologies to limit emissions of greenhouse gases and reduce the pressure exerted on the forest cover. Studies of the impact of energy use on the environment and of farm-related risks are therefore necessary. For that, a global approach integrating all the various sectors involved in such a project seems to be the best one. In this paper, we present an approach based on multi-criteria analysis and the realization of a pilot to develop an innovative geo-intelligent environmental platform. An implementation of this platform will collect, process, analyze, and manage environmental data in connection with the nature of the energy used in the studied region. As an application, we consider a region in the north of Morocco characterized by intense agricultural and industrial activities and using diverse renewable energies. The strategic goals of this platform are: decision support for better governance; improving the responsiveness of the public and private companies connected by providing them with reliable data in real time; modeling and simulation of energy scenarios; the identification of socio-technical solutions for introducing renewable energies and the estimation of the technical and implementable potential through socio-economic analyses and the assessment of infrastructure for the region and the communities; and the preservation and enhancement of natural resources for better citizen governance through the democratization of access to environmental information. The tool will also perform simulations integrating the environmental impacts of natural disasters, particularly those linked to climate change. Indeed, extreme events such as floods, droughts, and storms will no longer be rare and should therefore be integrated into such projects.

Keywords: renewable energies, decision aided tool, environment, simulation

Procedia PDF Downloads 437
292 Preliminary Experience in Multiple Green Health Hospital Construction

Authors: Ming-Jyh Chen, Wen-Ming Huang, Yi-Chu Liu, Li-Hui Yang

Abstract:

Introduction: Social responsibility is the key to sustainable organizational development. Under the Green Health Hospital Declaration signed by our superintendent, we have launched comprehensive energy conservation management in medical services, in the community, and in staff life. To carry out environment-friendly promotion with robust strategies, we are building a low-carbon medical system and community by promoting smart green public construction and by intensifying energy conservation education and communication. Purpose/Methods: With the support of the board and the superintendent, we established an energy management team, commencing with an environment-friendly system, management, education, and the ISO 50001 energy management system; we have improved energy performance and energy efficiency and continue to do so. Results: In 2021, we achieved multiple goals. The energy management system efficiently controls diesel, natural gas, and electricity usage; about 5% of consumption was saved when comparing the figures from 2018 and 2021. Our organization develops intelligent services and promotes various paperless electronic operations to provide people with a vibrant and environmentally friendly lifestyle. The goal is to save 68.6% on printing and photocopying by reducing paper use by 35.15 million sheets yearly. We strengthen the concept of environmental protection classification among colleagues. In the past two years, the amount of material recycled has reached more than 650 tons, and the resource recycling rate has reached 70%; waste recycling grows by about 28 metric tons annually. Conclusions: To build a green medical system with “high efficacy, high value, low carbon, low reliance,” energy stewardship, economic prosperity, and social responsibility are our principles when formulating energy conservation management strategies, converting limited resources to efficient usage, developing clean energy, and continuing with sustainable energy.

Keywords: energy efficiency, environmental education, green hospital, sustainable development

Procedia PDF Downloads 54
291 Towards Learning Query Expansion

Authors: Ahlem Bouziri, Chiraz Latiri, Eric Gaussier

Abstract:

The steady growth in the size of textual document collections is a key progress driver for modern information retrieval techniques, whose effectiveness and efficiency are constantly challenged. Given a user query, the number of retrieved documents can be overwhelmingly large, hampering their efficient exploitation by the user. In addition, retaining only relevant documents in a query answer is of paramount importance for effectively meeting the user's needs. In this situation, the query expansion technique offers an interesting solution for obtaining a complete answer while preserving the quality of the retained documents. This mainly relies on an accurate choice of the terms added to an initial query. Interestingly enough, query expansion takes advantage of large text volumes by extracting statistical information about index term co-occurrences and using it to make user queries better fit the real information needs. In this respect, a promising track consists in the application of data mining methods to extract dependencies between terms, namely a generic basis of association rules between terms. The key feature of our approach is a better trade-off between the size of the mining result and the conveyed knowledge. Thus, faced with the huge number of derived association rules, and in order to select the optimal combination of query terms from the generic basis, we propose to model the problem as a classification problem and solve it using a supervised learning algorithm such as SVM or k-means. For this purpose, we first generate a training set using a genetic algorithm based approach that explores the association rules space in order to find an optimal set of expansion terms that improves the MAP of the search results. The experiments were performed on the SDA 95 collection, a data collection for information retrieval. It was found that the results were better in terms of both MAP and NDCG. The main observation is that hybridizing text mining techniques and query expansion in an intelligent way allows us to incorporate the good features of all of them. As this is a preliminary attempt in this direction, there is large scope for enhancing the proposed method.
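A minimal sketch of rule-based expansion term selection, assuming a toy generic basis of association rules (antecedent term, consequent term, confidence); in the approach described above, a trained classifier would replace the simple confidence ranking used here, and the rules and query are illustrative only.

```python
from collections import defaultdict

RULES = [  # (antecedent, consequent, confidence) mined from the collection
    ("car", "automobile", 0.82),
    ("car", "vehicle", 0.74),
    ("engine", "motor", 0.69),
    ("car", "traffic", 0.41),
]

def expand(query_terms, rules, k=2, min_conf=0.5):
    # Score candidate expansion terms by the best rule confidence linking them
    # to a query term, then keep the k strongest candidates.
    scores = defaultdict(float)
    for ante, cons, conf in rules:
        if ante in query_terms and cons not in query_terms and conf >= min_conf:
            scores[cons] = max(scores[cons], conf)
    extra = sorted(scores, key=scores.get, reverse=True)[:k]
    return list(query_terms) + extra

print(expand(["car", "engine"], RULES))   # -> ['car', 'engine', 'automobile', 'vehicle']
```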

Keywords: supervised learning, classification, query expansion, association rules

Procedia PDF Downloads 302
290 An LED Warning Vest as Safety Smart Textile and Active Cooperation in a Working Group for Building a Normative Standard

Authors: Werner Grommes

Abstract:

The Institute of Occupational Safety and Health participates in a working group that is developing a normative standard for illuminated warning vests and has carried out many experiments and measurements as groundwork for this cooperation. Intelligent car headlamps are able to suppress conventional warning vests with retro-reflective stripes, treating them as disturbing light. Illuminated warning vests are therefore required for occupational safety. However, they must not pose any danger to the wearer or other persons. Here, the risks of the batteries (lithium types), the maximum brightness (glare), and possible interference radiation from the electronics acting on implant wearers must be taken into account. All-around visibility, as well as the required range, also plays an important role. For the study, many luminance measurements of commercially available LED and electroluminescent warning vests were made, and their electromagnetic interference fields and aspects of electrical safety were measured. The results of this study showed that the LED lighting was generally far too bright and caused strong glare. The integrated controls with pulse modulation and switching regulators cause electromagnetic interference fields. Rechargeable lithium batteries can explode depending on the temperature range. Electroluminescence brings even more hazards. A test method was developed for the evaluation of visibility at distances of 50, 100, and 150 m, including interviews with test persons. A measuring method was developed for the detection of glare effects at close range, with the assignment of the maximum permissible luminance. The electromagnetic interference fields were tested in the time and frequency domains. A risk and hazard analysis was prepared for the use of lithium batteries. The range of values for luminance and the risk analysis for lithium batteries were discussed in the standards working group and will be integrated into the standard. This paper gives a brief overview of the topic of illuminated warning vests, taking into account the risks and hazards for the vest wearer and others.

Keywords: illuminated warning vest, optical tests and measurements, risks, hazards, optical glare effects, LED, E-light, electric luminescent

Procedia PDF Downloads 92
289 Building a Blockchain-based Internet of Things

Authors: Rob van den Dam

Abstract:

Today’s Internet of Things (IoT) comprises more than a billion intelligent devices, connected via wired/wireless communications. The expected proliferation of hundreds of billions more places us at the threshold of a transformation sweeping across the communications industry. Yet, we found that the IoT architecture and solutions that currently work for billions of devices won’t necessarily scale to tomorrow’s hundreds of billions of devices because of high costs, lack of privacy, lack of future-proofing, lack of functional value, and broken business models. As the IoT scales exponentially, decentralized networks have the potential to reduce infrastructure and maintenance costs for manufacturers. Decentralization also promises increased robustness by removing single points of failure that could exist in traditional centralized networks. By shifting the power in the network from the center to the edges, devices gain greater autonomy and can become points of transactions and economic value creation for owners and users. To validate the underlying technology vision, IBM jointly developed with Samsung Electronics an autonomous, decentralized, peer-to-peer proof-of-concept (PoC). The primary objective of this PoC was to establish a foundation on which to demonstrate several capabilities that are fundamental to building a decentralized IoT. Though many commercial systems in the future will exist as hybrid centralized-decentralized models, the PoC demonstrated a fully distributed proof. The PoC (a) validated the future vision for decentralized systems to extensively augment today’s centralized solutions, (b) demonstrated foundational IoT tasks without the use of centralized control, and (c) proved that empowered devices can engage autonomously in marketplace transactions. The PoC opens the door for the communications and electronics industry to further explore the challenges and opportunities of potential hybrid models that can address the complexity and variety of requirements posed by an internet that continues to scale. Contents: (a) the new approach for an IoT that will be secure and scalable, (b) the three foundational technologies that are key for the future IoT, (c) the related business models and user experiences, (d) how such an IoT will create an 'Economy of Things', (e) the role of users, devices, and industries in the IoT future, and (f) the winners in the IoT economy.

Keywords: IoT, internet, wired, wireless

Procedia PDF Downloads 314
288 Bi-Criteria Vehicle Routing Problem for Possibility Environment

Authors: Bezhan Ghvaberidze

Abstract:

A multiple-criteria optimization approach for the solution of the Fuzzy Vehicle Routing Problem (FVRP) is proposed. For the possibility environment, the levels of movement between customers are calculated by the constructed interactive simulation algorithm. The first criterion of the bi-criteria optimization problem, minimization of the expectation of the total fuzzy travel time on closed routes, is constructed for the FVRP. A new, second criterion, maximization of the feasibility of movement on the closed routes, is constructed by the Choquet finite averaging operator. The FVRP is reduced to a bi-criteria partitioning problem over the so-called “promising” routes, which are selected from all admissible closed routes. The convenient selection of the “promising” routes allows us to solve the reduced problem in real-time computing. For the numerical solution of the bi-criteria partitioning problem, the ε-constraint approach is used. An exact algorithm is implemented based on D. Knuth’s Dancing Links technique and the algorithm DLX. The main objective was to present a new approach for the FVRP when there are difficulties while moving on the roads; this approach is called FVRP for extreme conditions (FVRP-EC) on the roads. A further aim of this paper was to construct the solution model of the constructed FVRP. Results are illustrated on a numerical example in which all Pareto-optimal solutions are found. Also, an approach for the more complex FVRP model with time windows was developed, and a numerical example is presented in which optimal routes are constructed for extreme conditions on the roads.
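The brute-force sketch below illustrates the bi-criteria partitioning step over a handful of pre-selected “promising” routes: it enumerates route subsets that cover every customer exactly once, evaluates both criteria, and keeps the Pareto-optimal partitions. The toy routes, travel times, and feasibility values are illustrative, the minimum is used as a simple stand-in for the Choquet averaging operator, and the real method uses the ε-constraint approach and DLX rather than enumeration.

```python
from itertools import combinations

CUSTOMERS = {1, 2, 3, 4, 5}
ROUTES = [  # (customers served, expected fuzzy travel time, feasibility)
    ({1, 2},    5.0, 0.90),
    ({3, 4, 5}, 9.0, 0.60),
    ({1, 2, 3}, 8.0, 0.80),
    ({4, 5},    6.0, 0.70),
    ({3},       2.0, 0.95),
]

def partitions(routes, customers):
    # Yield subsets of routes that serve every customer exactly once.
    for r in range(1, len(routes) + 1):
        for combo in combinations(routes, r):
            served = [c for s, _, _ in combo for c in s]
            if len(served) == len(customers) and set(served) == customers:
                yield combo

solutions = []
for combo in partitions(ROUTES, CUSTOMERS):
    time = sum(t for _, t, _ in combo)      # criterion 1: minimize total travel time
    feas = min(f for _, _, f in combo)      # criterion 2: maximize movement feasibility
    solutions.append((time, feas, combo))

# Keep the non-dominated (Pareto-optimal) partitions.
pareto = [s for s in solutions
          if not any(o[0] <= s[0] and o[1] >= s[1] and (o[0] < s[0] or o[1] > s[1])
                     for o in solutions)]
for time, feas, combo in pareto:
    print(f"time={time:.1f}, feasibility={feas:.2f}, "
          f"routes={[sorted(s) for s, _, _ in combo]}")
```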

Keywords: combinatorial optimization, Fuzzy Vehicle routing problem, multiple objective programming, possibility theory

Procedia PDF Downloads 450
287 4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea

Authors: Rodrigo Pacheco-Ruiz, Jonathan Adams, Felix Pedrotti

Abstract:

This paper introduces the applicability of underwater photogrammetric survey within challenging conditions as the main tool to enhance and enrich the process of documenting archaeological excavation through the creation of 4D models. Photogrammetry has been attempted on underwater archaeological sites at least since the 1970s, and today the production of traditional 3D models is becoming a common practice within the discipline. Underwater photogrammetry is more often implemented to record exposed underwater archaeological remains and less so as a dynamic interpretative tool. Therefore, it tends to be applied in bright environments and when underwater visibility is > 1 m, which limits its use on most submerged archaeological sites in more turbid conditions. Recent years have seen significant development of better digital photographic sensors and the improvement of optical technology ideal for darker environments. Such developments, in tandem with powerful computing systems for processing, have allowed underwater photogrammetry to be used by this research as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro HERO5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It proves that underwater photogrammetry can and should be used as one of the main recording methods, even in low light and poor underwater conditions, as a way to better understand the complexity of the underwater archaeological record.

Keywords: 4D modelling, Black Sea Maritime Archaeology Project, multi-source photogrammetry, low visibility underwater survey

Procedia PDF Downloads 211
286 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon

Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn

Abstract:

The dynamic Land Use and Land Cover (LULC) changes in Yangon have generally resulted in improvements in human welfare and economic development over the last twenty years. Mapping LULC is crucially important for the sustainable development of the environment. However, it is difficult to obtain exact data on how environmental factors influence the LULC situation at various scales, because the natural environment is composed of non-homogeneous surface features, so the satellite data also contain mixed pixels. The main objective of this study is the calculation of accuracy based on change detection of LULC changes by Support Vector Machines (SVMs). For this research work, the main data were satellite images from 1996, 2006, and 2015. Change detection statistics were computed to compile a detailed tabulation of the changes between two classification images, and the SVM process was applied with a soft approach at the allocation as well as the testing stage to achieve higher accuracy. The results of this paper showed that vegetation and cultivated areas decreased (by an average total of 29% from 1996 to 2015) because of conversion to built-up area, which more than doubled (by an average total of 30% from 1996 to 2015). The error matrix and confidence limits led to the validation of the results for LULC mapping.
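A compact sketch of the classification-and-tabulation idea described above, using synthetic per-pixel features in place of the satellite imagery; the class list, band values, and the simulated "conversion" shift are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
CLASSES = ["vegetation", "cultivated", "built-up", "water"]

def synthetic_scene(shift=0.0, n=400):
    """Fake 3-band reflectance per pixel; `shift` mimics land-cover conversion."""
    y = rng.integers(0, len(CLASSES), n)
    x = rng.normal(loc=y[:, None] + shift, scale=0.3, size=(n, 3))
    return x, y

# Train one classifier on labelled samples, then classify both dates.
x_train, y_train = synthetic_scene()
clf = SVC(kernel="rbf", probability=True).fit(x_train, y_train)   # probabilistic ("soft") SVM

x_1996, _ = synthetic_scene()
x_2015, _ = synthetic_scene(shift=0.5)          # simulated drift toward other classes
c_1996 = clf.predict(x_1996)
c_2015 = clf.predict(x_2015)

# Change-detection statistics: a from->to cross-tabulation between the two dates.
change = np.zeros((len(CLASSES), len(CLASSES)), dtype=int)
for a, b in zip(c_1996, c_2015):
    change[a, b] += 1
print(change)   # rows: class at date 1, columns: class at date 2
```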

Keywords: land use and land cover change, change detection, image processing, support vector machines

Procedia PDF Downloads 100
285 Methodical Approach for the Integration of a Digital Factory Twin into the Industry 4.0 Processes

Authors: R. Hellmuth

Abstract:

The orientation of flexibility and adaptability with regard to factory planning is at machine and process level. Factory buildings are not the focus of current research. Factory planning has the task of designing products, plants, processes, organization, areas and the construction of a factory. The adaptability of a factory can be divided into three types: spatial, organizational and technical adaptability. Spatial adaptability indicates the ability to expand and reduce the size of a factory. Here, the area-related breathing capacity plays the essential role. It mainly concerns the factory site, the plant layout and the production layout. The organizational ability to change enables the change and adaptation of organizational structures and processes. This includes structural and process organization as well as logistical processes and principles. New and reconfigurable operating resources, processes and factory buildings are referred to as technical adaptability. These three types of adaptability can be regarded independently of each other as undirected potentials of different characteristics. If there is a need for change, the types of changeability in the change process are combined to form a directed, complementary variable that makes change possible. When planning adaptability, importance must be attached to a balance between the types of adaptability. The vision of the intelligent factory building and the 'Internet of Things' presupposes the comprehensive digitalization of the spatial and technical environment. Through connectivity, the factory building must be empowered to support a company's value creation process by providing media such as light, electricity, heat, refrigeration, etc. In the future, communication with the surrounding factory building will take place on a digital or automated basis. In the area of industry 4.0, the function of the building envelope belongs to secondary or even tertiary processes, but these processes must also be included in the communication cycle. An integrative view of a continuous communication of primary, secondary and tertiary processes is currently not yet available and is being developed with the aid of methods in this research work. A comparison of the digital twin from the point of view of production and the factory building will be developed. Subsequently, a tool will be elaborated to classify digital twins from the perspective of data, degree of visualization, and the trades. Thus a contribution is made to better integrate the secondary and tertiary processes in a factory into the added value.

Keywords: adaptability, digital factory twin, factory planning, industry 4.0

Procedia PDF Downloads 133
284 Gulfnet: The Advent of Computer Networking in Saudi Arabia and Its Social Impact

Authors: Abdullah Almowanes

Abstract:

The speed of adoption of new information and communication technologies is often seen as an indicator of the growth of knowledge- and technological-innovation-based regional economies. Indeed, technological progress and scientific inquiry in any society have undergone a particularly profound transformation with the introduction of computer networks. In the spring of 1981, the Bitnet network was launched to link thousands of nodes all over the world. In 1985, as one of the first adopters of Bitnet, Saudi Arabia launched a Bitnet-based network named Gulfnet that linked computer centers, universities, and libraries of Saudi Arabia and other Gulf countries through high-speed communication lines. In this paper, the origins and the deployment of Gulfnet are discussed, as well as the social, economic, political, and cultural ramifications of the new information reality created by the network. Despite its significance, the social and cultural aspects of Gulfnet have not previously been investigated to a satisfactory degree in the history of science and technology literature. The presented research is based on extensive archival work aimed at seeking out and analyzing primary evidence from archival sources and records. During its decade-and-a-half-long existence, Gulfnet demonstrated that the scope and functionality of public computer networks in Saudi Arabia had to be fine-tuned for compliance with the Islamic culture and political system of the country. It also helped lay the groundwork for the subsequent introduction of the Internet. Since the 1980s, in just a few decades, the proliferation of computer networks has transformed communications worldwide.

Keywords: Bitnet, computer networks, computing and culture, Gulfnet, Saudi Arabia

Procedia PDF Downloads 228
283 Field-Free Orbital Hall Current-Induced Deterministic Switching in the Mo/Co₇₁Gd₂₉/Ru Structure

Authors: Zelalem Abebe Bekele, Kun Lei, Xiukai Lan, Xiangyu Liu, Hui Wen, Kaiyou Wang

Abstract:

Spin-polarized currents offer an efficient means of manipulating the magnetization of a ferromagnetic layer for big data and neuromorphic computing. Research has shown that the orbital Hall effect (OHE) can produce orbital currents, potentially surpassing the counterpart spin currents induced by the spin Hall effect. However, it is essential to note that orbital currents alone cannot exert torque directly on a ferromagnetic layer, necessitating a conversion process from orbital to spin currents. Here, we present an efficient method for achieving perpendicularly magnetized spin-orbit torque (SOT) switching by harnessing the localized orbital Hall current generated in the Mo layer of a Mo/CoGd device. Our investigation reveals a remarkable enhancement of the interface-induced planar Hall effect (PHE) in the Mo/CoGd bilayer, resulting in the generation of a z-polarized planar current for manipulating the magnetization of the CoGd layer without the need for an in-plane magnetic field. Furthermore, the Mo layer induces an out-of-plane orbital current, boosting the in-plane and out-of-plane spin polarization by converting the orbital current into spin current within the dual-property CoGd layer. At the optimal Mo layer thickness, a low critical magnetization switching current density of 2.51×10⁶ A cm⁻² is achieved. This breakthrough opens avenues for all-electrical, energy-efficient control of magnetization switching through orbital currents, advancing the field of spin-orbitronics.

Keywords: spin-orbit torque, orbital hall effect, spin hall current, orbital hall current, interface-generated planar hall current, anisotropic magnetoresistance

Procedia PDF Downloads 20
282 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved. Due to its key design principles, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, trust, etc. Users are more interested in content than in the communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not suitable for these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient as it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented rather than a sender-oriented approach, and it introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, there is a lot of criticism of it, mainly concerning how ICN will manage the most relevant content. Here, Web Content Mining (WCM) approaches can help with the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 446
281 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (fast Fourier transform (FFT), inverse fast Fourier transform (iFFT), and Bessel functions) is widely applied to obtain information with high precision and accuracy. Signal processing is most commonly implemented on general-purpose processors. Our interest was focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of a single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements, since general-purpose processors are not efficient for signal processing. We implemented the acoustic backscattered signal processing model on the Altera DE1-SoC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than in the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms is realized with an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in terms of real-time operation and energy efficiency.
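The NumPy sketch below reproduces the FFT stage of the processing chain in software (the paper implements the same transform in FPGA hardware); the synthetic echo, sampling rate, and window are illustrative assumptions.

```python
import numpy as np

fs = 1_000_000                        # sampling rate, Hz (assumed)
t = np.arange(0, 2e-3, 1 / fs)        # 2 ms record
# Synthetic backscattered echo: two decaying resonance-like tones plus noise.
echo = (np.sin(2 * np.pi * 150e3 * t) * np.exp(-t / 5e-4)
        + 0.4 * np.sin(2 * np.pi * 320e3 * t) * np.exp(-t / 3e-4)
        + 0.05 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.fft.rfft(echo * np.hanning(t.size))      # windowed FFT
freqs = np.fft.rfftfreq(t.size, 1 / fs)
magnitude = np.abs(spectrum) / t.size                  # scaled magnitude spectrum

peak = freqs[np.argmax(magnitude)]
print(f"dominant backscatter component ≈ {peak / 1e3:.0f} kHz")
```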

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 53
280 Bioethanol Production from Wild Sorghum (Sorghum arundinaceum) and Spear Grass (Heteropogon contortus)

Authors: Adeyinka Adesanya, Isaac Bamgboye

Abstract:

There is a growing need to develop processes to produce renewable fuels and chemicals due to the economic, political, and environmental concerns associated with fossil fuels. Lignocellulosic biomass is an excellent renewable feedstock because it is both abundant and inexpensive. This project aims at producing bioethanol from lignocellulosic plants (Sorghum arundinaceum and Heteropogon contortus) by biochemical means, computing the energy audit of the process, and determining the fuel properties of the produced ethanol. Acid pretreatment (0.5% H2SO4 solution) and enzymatic hydrolysis (using malted barley as the enzyme source) were employed. The ethanol yield of wild sorghum was found to be 20%, while that of spear grass was 15%. The fuel properties of the bioethanol from wild sorghum are 1.227 centipoise for viscosity, 1.10 g/cm3 for density, 0.90 for specific gravity, and 78 °C for boiling point, and the cloud point was found to be below -30 °C. Those of spear grass are 1.206 centipoise for viscosity, 0.93 g/cm3 for density, 1.08 for specific gravity, and 78 °C for boiling point, and the cloud point was also found to be below -30 °C. The energy audit shows that about 64% of the total energy was used during pretreatment, while product recovery, which was done manually, demanded about 31% of the total energy. Enzymatic hydrolysis, fermentation, and distillation accounted for 1.95%, 1.49%, and 1.04% of the total energy input, respectively. The alcoholometric strength of the bioethanol from wild sorghum was found to be 47%, and that of the bioethanol from spear grass was 72%. Also, the energy efficiency of the bioethanol production for both grasses was 3.85%.

Keywords: lignocellulosic biomass, wild sorghum, spear grass, biochemical conversion

Procedia PDF Downloads 210
279 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy

Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay

Abstract:

The paper intends to highlight the significance of a Digital Manufacturing (DM) strategy in supporting and achieving the business strategy and goals of any manufacturing organization. Towards this end, DM initiatives have been given a process perspective, while not undermining their technological significance, with a view to linking their benefits directly to the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model has been proposed to categorize digitally enabled organizational processes with a view to creating synergistic groups, which adopt and use digital tools having similar characteristics and functionalities. This will open up future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort has been made to apply the “what” and “how” features of the Quality Function Deployment (QFD) framework to establish the relationship between customers’ needs, both for external and internal customers, and the features of the various digital processes that support the achievement of these customer expectations. The paper finally concludes that in the present highly competitive environment, business organizations cannot thrive or sustain themselves unless they understand the significance of digital strategy and integrate it with their business strategy with a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders appreciate its value propositions and its direct link to an organization’s competitiveness.

Keywords: knowledge management, cloud computing, knowledge management approaches, cloud-based knowledge management

Procedia PDF Downloads 291
278 Computerized Analysis of Phonological Structure of 10,400 Brazilian Sign Language Signs

Authors: Wanessa G. Oliveira, Fernando C. Capovilla

Abstract:

Capovilla and Raphael’s Libras Dictionary documents a corpus of 4,200 Brazilian Sign Language (Libras) signs. Duduchi and Capovilla’s software SignTracking permits users to retrieve signs even when they do not know the corresponding gloss and to discover the meaning of any of the 4,200 signs simply by clicking on graphic menus of sign characteristics (phonemes). Duduchi and Capovilla discovered that the ease with which any given sign can be retrieved is an inverse function of the average popularity of its component phonemes. Thus, signs composed of rare (distinct) phonemes are easier to retrieve than those composed of common phonemes. SignTracking offers a means of computing the average popularity of the phonemes that make up each one of the 4,200 signs. It provides a precise measure of the degree of ease with which signs can be retrieved and sign meanings can be discovered. Duduchi and Capovilla’s logarithmic model proved valid: the degree to which any given sign can be retrieved is an inverse function of the arithmetic mean of the logarithm of the popularity of each component phoneme. Capovilla, Raphael, and Mauricio’s New Libras Dictionary documents a corpus of 10,400 Libras signs. The present analysis revealed the Libras DNA structure by mapping the incidence of 501 sign phonemes resulting from the layered distribution of five parameters: 163 handshape phonemes (CherEmes-ManusIculi); 34 finger shape phonemes (DactilEmes-DigitumIculi); 55 hand placement phonemes (ArtrotoToposEmes-ArticulatiLocusIculi); 173 movement dimension phonemes (CinesEmes-MotusIculi) pertaining to direction, frequency, and type; and 76 facial expression phonemes (MascarEmes-PersonalIculi).
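A small sketch of the logarithmic model stated above: retrieval ease modelled as inversely related to the mean log-popularity of a sign's component phonemes. The phoneme counts and the simple 1/x form of the "inverse function" are illustrative assumptions.

```python
import math

PHONEME_POPULARITY = {        # how many corpus signs use each phoneme (toy values)
    "handshape_B": 900, "handshape_bent5": 40,
    "location_chest": 1200, "movement_circular": 300,
}

def retrieval_ease(sign_phonemes):
    # Arithmetic mean of the log-popularity of the sign's phonemes, inverted.
    mean_log_pop = sum(math.log(PHONEME_POPULARITY[p]) for p in sign_phonemes) / len(sign_phonemes)
    return 1.0 / mean_log_pop     # rarer (more distinct) phonemes -> higher ease

common = ["handshape_B", "location_chest", "movement_circular"]
rare = ["handshape_bent5", "location_chest", "movement_circular"]
print(retrieval_ease(common) < retrieval_ease(rare))   # True: rare phonemes retrieve more easily
```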

Keywords: Brazilian sign language, lexical retrieval, libras sign, sign phonology

Procedia PDF Downloads 312
277 Systems Intelligence in Management (High Performing Organizations and People Score High in Systems Intelligence)

Authors: Raimo P. Hämäläinen, Juha Törmänen, Esa Saarinen

Abstract:

Systems thinking has been acknowledged as an important approach in the strategy and management literature ever since the seminal works of Ackoff in the 1970s and Senge in the 1990s. The early literature was very much focused on structures and organizational dynamics. Understanding systems is important, but making improvements also needs ways to understand human behavior in systems. Peter Senge’s book The Fifth Discipline gave the inspiration for the development of the concept of Systems Intelligence. The concept integrates the concepts of personal mastery and systems thinking. SI refers to intelligent behavior in the context of complex systems involving interaction and feedback. It is a competence related to the skills needed in strategy and in the environment of modern industrial engineering and management, where people skills and systems play an increasingly important role. The eight factors of Systems Intelligence have been identified from extensive surveys, and the factors relate to perceiving, attitude, thinking, and acting. The personal self-evaluation test developed consists of 32 items, which can also be applied in a peer-evaluation mode. The concept and test extend to organizations too: one can talk about organizational systems intelligence. This paper reports the results of an extensive survey based on peer evaluation. The results show that systems intelligence correlates positively with professional performance. People in a managerial role score higher in SI than others. Age improves the SI score, but there is no gender difference. Top organizations score higher in all SI factors than lower-ranked ones. The SI tests can also be used as leadership and management development tools supporting self-reflection and learning. Finding ways of enhancing organizational learning and development is important. Today, gamification is a promising new approach. The items in the SI test have been used to develop an interactive card game following the Topaasia game approach. It is an easy way of engaging people in a process that helps participants see and approach problems in their organization. It also helps individuals identify challenges in their own behavior and improve their SI.

Keywords: gamification, management competence, organizational learning, systems thinking

Procedia PDF Downloads 71
276 Rapid and Long-term Alien Language Analysis - Forming Frameworks for the Interpretation of Alien Communication for More Intelligent Life

Authors: Samiksha Raviraja, Junaid Arif

Abstract:

One of the most important abilities of a species is the ability to communicate. This paper proposes steps to take if and when aliens come into contact with humans, and how humans would communicate with them. The situation would be a time-sensitive scenario, meaning that communication would be of the utmost importance if such an event were to happen. First, humans would need to establish mutual peace by conveying that there is no threat to the alien race. Second, the aliens would need to acknowledge this understanding and reciprocate. This would be extremely difficult to do, regardless of their intelligence level, unless they are very human-like and have similarities to our way of communicating. The first step towards understanding their mind is to analyze their level of intelligence: Level 1, low intelligence; Level 2, human-like intelligence; or Level 3, advanced or high intelligence. These three levels go hand in hand with the Kardashev scale. Further, the Barrow scale will also be used to categorize alien species in the hope of developing a common universal language to communicate in. This paper will delve into how the level of intelligence can be used toward achieving communication with aliens by predicting various possible scenarios and outcomes and by proposing an intensive categorization system. This can be achieved by studying their emotional and intelligence quotient (along with technological and scientific knowledge/intelligence). The limitations and capabilities of their intelligence must also be studied. By observing how they respond and react (expressions and senses) to different kinds of scenarios, items, and people, the resulting data will enable good categorization. It can be hypothesized that the more human-like aliens are, or the more they can relate to humans, the more likely it is that communication is possible. Depending on the situation, either humans can teach aliens a human language, humans can learn an alien language, or both races can work together to develop a mutual understanding or mode of communication. There are three possible ways of contact: aliens visit Earth, humans discover aliens during space exploration, or contact is made through technology in the form of signals. A much rarer case would be humans and aliens running into each other during space expeditions of their own. The first two possibilities allow a more in-depth analysis of the alien life and enhanced results by comparison. Finding a method of talking with aliens is important not only to protect Earth and humans but also to advance science through the knowledge shared between the two species.

Keywords: intelligence, Kardashev scale, Barrow scale, alien civilizations, emotional and intelligence quotient

Procedia PDF Downloads 40
275 A Gene Selection Algorithm for Microarray Cancer Classification Using an Improved Particle Swarm Optimization

Authors: Arfan Ali Nagra, Tariq Shahzad, Meshal Alharbi, Khalid Masood Khan, Muhammad Mugees Asif, Taher M. Ghazal, Khmaies Ouahada

Abstract:

Gene selection is an essential step for the classification of microarray cancer data. Gene expression cancer data (DNA microarray) facilitate computing the robust and concurrent expression of various genes. Particle swarm optimization (PSO) requires simple operators and a small number of parameters for tuning the model in gene selection. The selection of a prognostic gene with small redundancy is a great challenge for the researcher, as there are a few complications in PSO-based selection methods. In this research, a new variant of PSO (self-inertia weight adaptive PSO) is proposed. In the proposed algorithm, SIW-APSO-ELM is explored to achieve high gene selection prediction accuracies. This new algorithm balances the exploration and exploitation capabilities of the improved inertia weight adaptive particle swarm optimization. The self-inertia weight adaptive particle swarm optimization (SIW-APSO) is used to search for the solution. SIW-APSO is updated with an evolutionary process in such a way that each particle iteratively improves its velocities and positions. An extreme learning machine (ELM) has been designed for the selection procedure. The proposed method has been used to identify a number of genes in the cancer dataset. The classification algorithm comprises ELM, K-centroid nearest neighbor (KCNN), and support vector machine (SVM) classifiers to attain high forecast accuracy as compared to the state-of-the-art methods on microarray cancer datasets, which shows the effectiveness of the proposed method.
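A compact sketch of the PSO-based gene selection idea on synthetic expression data: each particle encodes per-gene inclusion scores, and a per-particle adaptive inertia weight nudges the exploration/exploitation balance. The adaptation rule, fitness criterion, and data sizes are simplified illustrative assumptions, not the authors' exact SIW-APSO-ELM formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples = 50, 60
labels = rng.integers(0, 2, n_samples)
expr = rng.normal(size=(n_samples, n_genes))
expr[:, :5] += labels[:, None] * 2.0            # only the first 5 genes are informative

def fitness(position):
    # Genes with score > 0.5 are "selected"; reward class separation, penalize subset size.
    mask = position > 0.5
    if not mask.any():
        return -np.inf
    diff = np.abs(expr[labels == 0][:, mask].mean(0) - expr[labels == 1][:, mask].mean(0))
    return diff.mean() - 0.01 * mask.sum()

n_particles = 30
pos = rng.random((n_particles, n_genes))
vel = np.zeros_like(pos)
inertia = np.full(n_particles, 0.9)             # self-adaptive, per particle
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(100):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = inertia[:, None] * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    inertia = np.where(improved, np.minimum(inertia * 1.05, 0.9),   # keep exploring on success
                       np.maximum(inertia * 0.97, 0.4))             # exploit otherwise
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected genes:", np.flatnonzero(gbest > 0.5))
```

In the full method, the simple separation criterion above would be replaced by the accuracy of an ELM classifier trained on the selected genes.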

Keywords: microarray cancer, improved PSO, ELM, SVM, evolutionary algorithms

Procedia PDF Downloads 58
274 Transforming Healthcare with Immersive Visualization: An Analysis of Virtual and Holographic Health Information Platforms

Authors: Hossein Miri, Zhou YongQi, Chan Bormei-Suy

Abstract:

The development of advanced technologies and innovative solutions has opened up exciting new possibilities for revolutionizing healthcare systems. One such emerging concept is the use of virtual and holographic health information platforms that aim to provide interactive and personalized medical information to users. This paper provides a review of notable virtual and holographic health information platforms. It begins by highlighting the need for information visualization and 3D representation in healthcare. It then proceeds to provide background knowledge on information visualization and historical developments in 3D visualization technology. Additional domain knowledge concerning holography, holographic computing, and mixed reality is then introduced, followed by some of their common applications and use cases. After setting the scene and defining the context, the need for and importance of virtual and holographic visualization in medicine are discussed. Subsequently, some of the current research areas and applications of digital holography and holographic technology are explored, alongside the importance and role of virtual and holographic visualization in genetics and genomics. An analysis of the key principles and concepts underlying virtual and holographic health information systems is presented, and their potential implications for healthcare are pointed out. The paper concludes by examining the most notable existing mixed-reality applications and systems that help doctors visualize diagnostic and genetic data and assist in patient education and communication. This paper is intended to be a valuable resource for researchers, developers, and healthcare professionals who are interested in the use of virtual and holographic technologies to improve healthcare.

Keywords: virtual, holographic, health information platform, personalized interactive medical information

Procedia PDF Downloads 55
273 Development of an Integrated Route Information Management Software

Authors: Oluibukun G. Ajayi, Joseph O. Odumosu, Oladimeji T. Babafemi, Azeez Z. Opeyemi, Asaleye O. Samuel

Abstract:

The need for the complete automation of every surveying procedure, and most especially its engineering applications, cannot be overemphasized due to the many demerits of the conventional manual or analogue approach. This paper presents the summarized details of the development of a Route Information Management (RIM) software. The software, codenamed ‘AutoROUTE’, was developed using the Microsoft Visual Studio Visual Basic package, and it offers complete automation of the computational procedures and plan production involved in route surveying. It was tested using route survey data (longitudinal profile and cross sections) of a 2.7 km road that stretches from Dama to Lunko village in Minna, Niger State, acquired with the aid of a Hi-Target DGPS receiver. The developed software (AutoROUTE) is capable of computing the various simple curve parameters, horizontal curves, and vertical curves, and it can also plot the road alignment, longitudinal profile, and cross sections, with the capability of storing these in the SQL database incorporated into the Microsoft Visual Basic software. The plans plotted with AutoROUTE were compared with the plans produced with the conventional AutoCAD Civil 3D software, and AutoROUTE proved to be more user-friendly and accurate because it plots to three decimal places, whereas AutoCAD plots to two decimal places. Also, it was discovered that the AutoROUTE software is faster in plotting and the stages involved are less cumbersome compared to the AutoCAD Civil 3D software.
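By way of illustration, the sketch below computes the standard simple (circular) horizontal-curve elements that such a package evaluates; the radius and deflection angle are arbitrary example values, and this is not AutoROUTE's own code.

```python
import math

def simple_curve(radius, deflection_deg):
    # Standard simple circular curve elements for radius R and deflection angle Δ.
    d = math.radians(deflection_deg)
    return {
        "tangent_length":    radius * math.tan(d / 2),            # T = R tan(Δ/2)
        "curve_length":      radius * d,                          # L = R Δ (Δ in radians)
        "long_chord":        2 * radius * math.sin(d / 2),        # C = 2R sin(Δ/2)
        "external_distance": radius * (1 / math.cos(d / 2) - 1),  # E = R(sec(Δ/2) − 1)
        "middle_ordinate":   radius * (1 - math.cos(d / 2)),      # M = R(1 − cos(Δ/2))
    }

for name, value in simple_curve(radius=300.0, deflection_deg=24.0).items():
    print(f"{name:>18}: {value:8.3f} m")
```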

Keywords: automated systems, cross sections, curves, engineering construction, longitudinal profile, route surveying

Procedia PDF Downloads 108
272 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method

Authors: Angel G. De Leon Hernandez

Abstract:

A structure is defined as a physical system or, in certain cases, an arrangement of connected elements capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles, and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic, and maintainable system, considering the constraints imposed in every case. With the advances in technology during the last decades, the capability to solve engineering problems has increased enormously. Nowadays, computers play a critical role in structural analysis; unfortunately, for university students the vast majority of this software is inaccessible due to the high complexity and cost it represents, even when the software manufacturers offer student versions. This is exactly the motivation for developing a more accessible and easy-to-use computing tool. The program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument to achieve a better understanding of this area and to avoid the tedious calculations. The program can also be useful for graduate engineers in the field of structural design and analysis. A graphical user interface is included in the program to make it even simpler to operate and to understand the information requested and the obtained results. The present document includes the theoretical basis on which the program solves the structural analysis, the logical path followed in order to develop the program, the theoretical results, a discussion of the results, and the validation of those results.
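A minimal sketch of the direct stiffness method the program is based on, assembling and solving a two-bar plane truss with NumPy (the program described above is written in MATLAB); the geometry, section properties, and load are arbitrary example values.

```python
import numpy as np

E, A = 200e9, 1e-3                     # Young's modulus (Pa), cross-section area (m^2)
nodes = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])   # node coordinates (m)
elements = [(0, 2), (1, 2)]            # bars: node i -> node j
fixed_dofs = [0, 1, 2, 3]              # nodes 0 and 1 are pinned supports
loads = np.zeros(2 * len(nodes))
loads[5] = -10e3                       # 10 kN downward at node 2

K = np.zeros((2 * len(nodes), 2 * len(nodes)))
for i, j in elements:
    dx, dy = nodes[j] - nodes[i]
    L = np.hypot(dx, dy)
    c, s = dx / L, dy / L
    # Truss element stiffness matrix in global coordinates.
    k_elem = (E * A / L) * np.array([[ c*c,  c*s, -c*c, -c*s],
                                     [ c*s,  s*s, -c*s, -s*s],
                                     [-c*c, -c*s,  c*c,  c*s],
                                     [-c*s, -s*s,  c*s,  s*s]])
    dofs = [2*i, 2*i + 1, 2*j, 2*j + 1]
    K[np.ix_(dofs, dofs)] += k_elem    # scatter the element matrix into the global K

free = [d for d in range(2 * len(nodes)) if d not in fixed_dofs]
u = np.zeros(2 * len(nodes))
u[free] = np.linalg.solve(K[np.ix_(free, free)], loads[free])   # K_ff u_f = F_f
print("node 2 displacement (m):", u[4], u[5])
```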

Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming

Procedia PDF Downloads 98
271 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically, using different texture analysis approaches, for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III, or IV, was 14. Approximately 45% of the patients had adenocarcinoma (ADC) and approximately 55% had squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed to extract 51 features using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used for automatic classification with k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with the SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with the one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and to automatically classify tumor stage and subtype.
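
The authors' 51-feature MATLAB pipeline (FOS, GLCM, GLRLM, and Laws' filters) is not published in the abstract. The Python sketch below only illustrates the general workflow of one part of it, GLCM texture features followed by sequential forward selection and k-NN/SVM classification, using scikit-image and scikit-learn on synthetic data; the feature set, parameters, and data are assumptions and do not reproduce the reported results.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops      # scikit-image >= 0.19
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def glcm_features(roi_8bit):
    """A few second-order (GLCM) texture features from an 8-bit tumor ROI."""
    glcm = graycomatrix(roi_8bit, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Synthetic stand-ins for 42 patient ROIs and their stage labels.
rng = np.random.default_rng(0)
X = np.vstack([glcm_features(rng.integers(0, 256, (32, 32), dtype=np.uint8))
               for _ in range(42)])
y = rng.integers(0, 3, size=42)

# Sequential forward selection of 5 features, then k-NN (k=3) and a one-vs-one SVM.
sfs = SequentialFeatureSelector(KNeighborsClassifier(n_neighbors=3),
                                n_features_to_select=5, direction="forward")
X_sel = sfs.fit_transform(X, y)
for clf in (KNeighborsClassifier(n_neighbors=3),
            SVC(kernel="rbf", decision_function_shape="ovo")):
    print(type(clf).__name__, cross_val_score(clf, X_sel, y, cv=5).mean())
```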

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 301
270 Continuous FAQ Updating for Service Incident Ticket Resolution

Authors: Kohtaroh Miyamoto

Abstract:

As enterprise computing becomes more and more complex, the costs and technical challenges of IT system maintenance and support are increasing rapidly. One popular approach to managing IT system maintenance is to prepare and use an FAQ (Frequently Asked Questions) system to manage and reuse systems knowledge. Such an FAQ system can help reduce the resolution time for each service incident ticket. However, a major problem is that, over time, the knowledge in such FAQs tends to become outdated. Much of the knowledge captured in the FAQ requires periodic updates, in response to new insights or new trends in the problems addressed, in order to maintain its usefulness for problem resolution. These updates require a systematic approach to define exactly which portion of the FAQ to change and what its content should be. Therefore, we are working on a novel method to hierarchically structure the FAQ and to automate the updates of its structure and content. We use both the structured information and the unstructured text in the service incident tickets, together with their timelines. We cluster the tickets by structured category information and, for the unstructured text, by keywords and keyword modifiers. We also calculate an urgency score based on trends, resolution times, and priorities. We carefully studied the tickets of one of our projects over a 2.5-year period. After the first 6 months, we started to create FAQs and confirmed that they improved the resolution times. We continued observing over the next 2 years to assess the ongoing effectiveness of our method for automatic FAQ updates. We improved the ratio of tickets covered by the FAQ from 32.3% to 68.9% during this time. Also, the average reduction in ticket resolution time was between 31.6% and 43.9%. Subjective analysis showed that more than 75% of respondents reported the FAQ system useful in reducing ticket resolution times.
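
The abstract does not give the exact clustering procedure or the urgency-score formula. The following Python sketch is only a minimal illustration of the idea: tickets are grouped by the keywords in their unstructured text (here with TF-IDF and k-means from scikit-learn), and each cluster receives an urgency score that grows with its trend, its resolution time, and its priority; the Ticket fields and the scoring formula are assumptions, not the paper's method.

```python
from dataclasses import dataclass
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

@dataclass
class Ticket:
    category: str            # structured category information
    text: str                # unstructured description of the incident
    resolution_hours: float
    priority: int            # 1 = highest priority
    recent: bool             # opened within the latest reporting window

def cluster_tickets(tickets, n_clusters=5):
    """Group tickets by the keywords in their unstructured text."""
    tfidf = TfidfVectorizer(stop_words="english", max_features=2000)
    X = tfidf.fit_transform(t.text for t in tickets)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
    clusters = {}
    for ticket, label in zip(tickets, labels):
        clusters.setdefault(label, []).append(ticket)
    return clusters

def urgency_score(cluster):
    """Illustrative score: trending, slow-to-resolve, high-priority clusters
    should have their FAQ entries refreshed first."""
    trend = sum(t.recent for t in cluster) / len(cluster)
    avg_resolution = sum(t.resolution_hours for t in cluster) / len(cluster)
    avg_priority = sum(t.priority for t in cluster) / len(cluster)
    return trend * avg_resolution / avg_priority
```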

Keywords: FAQ system, resolution time, service incident tickets, IT system maintenance

Procedia PDF Downloads 311
269 DLtrace: Toward Understanding and Testing Deep Learning Information Flow in Deep Learning-Based Android Apps

Authors: Jie Zhang, Qianyu Guo, Tieyi Zhang, Zhiyong Feng, Xiaohong Li

Abstract:

With the widespread popularity of mobile devices and the development of artificial intelligence (AI), deep learning (DL) has been extensively applied in Android apps. Compared with traditional Android apps (traditional apps), deep learning-based Android apps (DL-based apps) need to use more third-party application programming interfaces (APIs) to complete complex DL inference tasks. However, existing methods (e.g., FlowDroid) for detecting sensitive information leakage in Android apps cannot be applied directly to DL-based apps because they have difficulty detecting third-party APIs. To solve this problem, we design DLtrace, a new static information-flow analysis tool that can effectively recognize third-party APIs. With our proposed trace and detection algorithms, DLtrace can also efficiently detect privacy leaks caused by sensitive APIs in DL-based apps. Moreover, using DLtrace, we summarize the non-sequential characteristics of DL inference tasks in DL-based apps and the specific functionalities that DL models provide for such apps. We propose two formal definitions to deal with the common polymorphism and anonymous inner-class problems in the Android static analyzer. We conducted an empirical assessment with DLtrace on 208 popular DL-based apps in the wild and found that 26.0% of the apps suffered from sensitive information leakage. Furthermore, DLtrace performs more robustly than FlowDroid in detecting and identifying third-party APIs. The experimental results demonstrate that DLtrace extends FlowDroid in understanding DL-based apps and detecting security issues therein.
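
DLtrace itself works on Android bytecode and handles polymorphism and anonymous inner classes, none of which is detailed in the abstract. The toy Python sketch below only illustrates the general idea behind source-to-sink information-flow tracing over a call graph; the API names, the graph, and the search itself are made-up illustrations and are not DLtrace's algorithm.

```python
from collections import deque

# Hypothetical source and sink APIs; a real analyzer derives such lists from
# configuration and from recognizing third-party DL framework APIs.
SOURCES = {"TelephonyManager.getDeviceId", "LocationManager.getLastKnownLocation"}
SINKS = {"TFLiteInterpreter.run", "HttpURLConnection.getOutputStream"}

def find_leak_paths(call_graph):
    """Breadth-first search for call paths from a sensitive source to a sink,
    as a toy stand-in for static information-flow analysis."""
    leaks = []
    for source in SOURCES & call_graph.keys():
        queue = deque([[source]])
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node in SINKS:
                leaks.append(path)
                continue
            for callee in call_graph.get(node, ()):
                if callee not in path:          # avoid cycles
                    queue.append(path + [callee])
    return leaks

# Toy call graph of a hypothetical DL-based app.
graph = {
    "TelephonyManager.getDeviceId": ["FeatureBuilder.encode"],
    "FeatureBuilder.encode": ["TFLiteInterpreter.run"],
}
print(find_leak_paths(graph))
```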

Keywords: mobile computing, deep learning apps, sensitive information, static analysis

Procedia PDF Downloads 133