Search results for: pervasive computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1147

217 Extended Kalman Filter and Markov Chain Monte Carlo Method for Uncertainty Estimation: Application to X-Ray Fluorescence Machine Calibration and Metal Testing

Authors: S. Bouhouche, R. Drai, J. Bast

Abstract:

This paper is concerned with a method for uncertainty evaluation of steel sample content using the X-Ray Fluorescence (XRF) method. The considered method of analysis is a comparative technique based on X-Ray Fluorescence; the calibration step assumes adequate chemical composition of the analyzed metallic sample. This work proposes a new combined approach using the Kalman filter and Markov Chain Monte Carlo (MCMC) for uncertainty estimation of steel content analysis. The Kalman filter algorithm is extended to the model identification of the chemical analysis process using the main factors affecting the analysis results; in this case, the estimated states are reduced to the model parameters. MCMC is a stochastic method that computes the statistical properties of the considered states, such as the probability distribution function (PDF), according to the initial state and the target distribution using a Monte Carlo simulation algorithm. The conventional approach is based on linear correlation; the uncertainty budget is established for steel Mn (wt%), Cr (wt%), Ni (wt%), and Mo (wt%) contents, respectively. A comparative study between the conventional procedure and the proposed method is given. This kind of approach is applied to construct an accurate computing procedure for uncertainty measurement.
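
As a rough illustration of the MCMC step described above, the sketch below (Python, with a hypothetical linear calibration model and made-up intensity/content data, none of which come from the paper) uses a simple Metropolis-Hastings sampler to approximate the posterior PDF of the calibration parameters and, from it, an uncertainty estimate for a predicted Mn content.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: XRF intensity x vs. certified Mn content y (wt%)
x = np.array([0.2, 0.5, 0.8, 1.1, 1.4])
y = np.array([0.21, 0.52, 0.79, 1.12, 1.38])
sigma = 0.02                      # assumed measurement noise (wt%)

def log_post(theta):
    """Log-posterior of a linear model y = a*x + b with flat priors."""
    a, b = theta
    resid = y - (a * x + b)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Metropolis-Hastings sampling of the parameter PDF
theta = np.array([1.0, 0.0])
chain = []
for _ in range(20000):
    prop = theta + rng.normal(scale=0.01, size=2)   # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
chain = np.array(chain[5000:])                      # discard burn-in

# Uncertainty of a predicted content at a new intensity x_new
x_new = 1.0
pred = chain[:, 0] * x_new + chain[:, 1]
print(f"Mn estimate: {pred.mean():.3f} wt%  (std. uncertainty {pred.std():.3f} wt%)")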

Keywords: Kalman filter, Markov chain Monte Carlo, x-ray fluorescence calibration and testing, steel content measurement, uncertainty measurement

Procedia PDF Downloads 285
216 Improving Cheon-Kim-Kim-Song (CKKS) Performance with Vector Computation and GPU Acceleration

Authors: Smaran Manchala

Abstract:

Homomorphic Encryption (HE) enables computations on encrypted data without requiring decryption, mitigating data vulnerability during processing. Usable Fully Homomorphic Encryption (FHE) could revolutionize secure data operations across cloud computing, AI training, and healthcare, providing both privacy and functionality; however, the computational inefficiency of schemes like Cheon-Kim-Kim-Song (CKKS) hinders their widespread practical use. This study focuses on optimizing CKKS for faster matrix operations through the implementation of vector computation parallelization and GPU acceleration. The variable effects of vector parallelization on GPUs were explored, recognizing that while parallelization typically accelerates operations, it can introduce overhead that results in slower runtimes, especially in smaller, less computationally demanding operations. To assess performance, two neural network models, an MLPN and a CNN, were tested on the MNIST dataset using both ARM and x86-64 architectures, with the CNN chosen for its higher computational demands. Each test was repeated 1,000 times, and outliers were removed via Z-score analysis to measure the effect of vector parallelization on CKKS performance. Model accuracy was also evaluated under CKKS encryption to ensure optimizations did not compromise results. According to the results of the trial runs, applying vector parallelization yielded a 2.63x efficiency increase overall, with a 1.83x performance increase for x86-64 over the ARM architecture. Overall, these results suggest that the application of vector parallelization in tandem with GPU acceleration significantly improves the efficiency of CKKS even while accounting for vector parallelization overhead, with potential impact on future zero-trust operations.
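
The Z-score outlier filtering and speedup computation mentioned above could look roughly like the following Python sketch; the timing arrays and the |z| < 3 cut-off are illustrative assumptions, not values from the study.

import numpy as np

def filter_outliers(times, z_max=3.0):
    """Drop runs whose Z-score exceeds z_max (assumed threshold)."""
    z = (times - times.mean()) / times.std()
    return times[np.abs(z) < z_max]

# Hypothetical runtimes (ms) for 1,000 repeated CKKS matrix operations
baseline = np.random.normal(100.0, 5.0, 1000)      # scalar implementation
parallel = np.random.normal(38.0, 3.0, 1000)       # vectorized + GPU-accelerated

baseline, parallel = filter_outliers(baseline), filter_outliers(parallel)
print(f"speedup: {baseline.mean() / parallel.mean():.2f}x")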

Keywords: CKKS scheme, runtime efficiency, fully homomorphic encryption (FHE), GPU acceleration, vector parallelization

Procedia PDF Downloads 27
215 Design, Implementation and Evaluation of Health and Social Justice Trainings in Nigeria

Authors: Juliet Sorensen, Anna Maitland

Abstract:

Introduction: Characterized by lack of water and sanitation, food insecurity, and low access to hospitals and clinics, informal urban settlements in Lagos, Nigeria, have very poor health outcomes. With little education and a general inability to demand basic rights, these communities are often disempowered and isolated from understanding, claiming, or owning their health needs. Utilizing community-based participatory research characterized by interdisciplinary, cross-cultural partnerships, evidence-based assessments, and both primary and secondary source research, a holistic health education and advocacy program was developed in Lagos to address health barriers for targeted communities. This includes a first-of-its-kind guide formulated to teach community-based health educators how to transmit health information to low-literacy Nigerian audiences while supporting behavior change models and social support mechanisms. This paper discusses the interdisciplinary contributions to developing a health education program while also looking at the need for greater beneficiary ownership and implementation of health justice and access. Methods: In March 2016, an interdisciplinary group of medical, legal, and business graduate students and faculty from Northwestern University conducted a Health Needs Assessment (HNA) in Lagos with a partner and a local non-governmental organization. The HNA revealed that members of informal urban communities in Lagos were lacking basic health literacy, but desired to remedy this lacuna. Further, the HNA revealed that even where the government mandates specific services, many vulnerable populations are unable to access these services. The HNA concluded that a program focused on education, advocacy, and organizing around anatomy, maternal and sexual health, infectious disease and malaria, HIV/AIDS, emergency care, and water and sanitation would respond to stated needs while also building capacity in communities to address health barriers. Results: Based on the HNA, including both primary and secondary source research on integrated health education approaches and behavior change models and responsive, adaptive material development, a holistic program was developed for the Lagos partners and first implemented in November 2016. This program trained community-nominated health educators in adult, low-literacy, knowledge exchange approaches, utilizing information identified by communities as a priority. After a second training in March 2017, these educators will teach community-based groups and will support and facilitate behavior change models and peer-support methods around basic issues like hand washing and disease transmission. They will be supported by community paralegals who will help ensure that newly trained community groups can act on education around access, such as receiving free vaccinations, maternal health care, and HIV/AIDS medicines. Materials will continue to be updated as needs and issues arise, with a focus on identifying best practices around health improvements that can be shared across these partner communities. Conclusion: These materials are the first of their kind, and address a void of health information and understanding pervasive in informal-urban Lagos communities. Initial feedback indicates high levels of commitment and interest, as well as investment by communities in these materials, largely because they are responsive, targeted, and build community capacity. 
This methodology is an important step in dignity-based health justice solutions, albeit in the process of refinement.

Keywords: community health educators, interdisciplinary and cross cultural partnerships, health justice and access, Nigeria

Procedia PDF Downloads 248
214 Bi-Criteria Vehicle Routing Problem for Possibility Environment

Authors: Bezhan Ghvaberidze

Abstract:

A multiple criteria optimization approach for the solution of the Fuzzy Vehicle Routing Problem (FVRP) is proposed. For the possibility environment, the levels of movements between customers are calculated by the constructed interactive simulation algorithm. The first criterion of the bi-criteria optimization problem, minimization of the expectation of total fuzzy travel time on closed routes, is constructed for the FVRP. A new, second criterion, maximization of the feasibility of movement on the closed routes, is constructed by the Choquet finite averaging operator. The FVRP is reduced to the bi-criteria partitioning problem for the so-called “promising” routes, which were selected from all admissible closed routes. The convenient selection of the “promising” routes allows us to solve the reduced problem in real-time computing. For the numerical solution of the bi-criteria partitioning problem, the ε-constraint approach is used. An exact algorithm is implemented based on D. Knuth’s Dancing Links technique and the DLX algorithm. The main objective was to present a new approach for the FVRP when there are difficulties while moving on the roads. This approach is called FVRP for extreme conditions (FVRP-EC) on the roads. A further aim of this paper was to construct a solution model for the formulated FVRP. Results are illustrated on a numerical example where all Pareto-optimal solutions are found. An approach for the more complex model, FVRP with time windows, was also developed. A numerical example is presented in which optimal routes are constructed for extreme conditions on the roads.
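
A minimal sketch of the ε-constraint idea over a handful of “promising” routes is given below (Python, brute force rather than DLX, with invented routes, travel times, and feasibility levels; the paper's Choquet-based criterion is simplified here to the worst-link feasibility of a route set).

from itertools import combinations

# Hypothetical "promising" routes: (customers covered, expected travel time, feasibility)
routes = [
    ({1, 2},    5.0, 0.9),
    ({3, 4},    6.0, 0.7),
    ({1, 3},    4.0, 0.6),
    ({2, 4},    7.0, 0.95),
    ({1, 2, 3}, 9.0, 0.8),
    ({4},       3.0, 0.85),
]
customers = {1, 2, 3, 4}

def partitions():
    """Enumerate route subsets that partition the customer set (brute force)."""
    for r in range(1, len(routes) + 1):
        for combo in combinations(routes, r):
            covered = [c for s, _, _ in combo for c in s]
            if len(covered) == len(customers) and set(covered) == customers:
                yield combo

pareto = {}
for eps in (0.6, 0.7, 0.8, 0.9):                      # epsilon levels on feasibility
    feasible = [p for p in partitions()
                if min(f for _, _, f in p) >= eps]
    if feasible:
        best = min(feasible, key=lambda p: sum(t for _, t, _ in p))
        total_time = sum(t for _, t, _ in best)
        worst_feas = min(f for _, _, f in best)
        pareto[(total_time, worst_feas)] = best

for (total_time, worst_feas) in sorted(pareto):
    print(f"total time {total_time:.1f}, worst-link feasibility {worst_feas:.2f}")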

Keywords: combinatorial optimization, Fuzzy Vehicle routing problem, multiple objective programming, possibility theory

Procedia PDF Downloads 488
213 4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea

Authors: Rodrigo Pacheco-Ruiz, Jonathan Adams, Felix Pedrotti

Abstract:

This paper introduces the applicability of underwater photogrammetric survey within challenging conditions as the main tool to enhance and enrich the process of documenting archaeological excavation through the creation of 4D models. Photogrammetry has been attempted on underwater archaeological sites since at least the 1970s, and today the production of traditional 3D models is becoming a common practice within the discipline. Photogrammetry underwater is more often implemented to record exposed underwater archaeological remains and less so as a dynamic interpretative tool. Therefore, it tends to be applied in bright environments and when underwater visibility is > 1 m, reducing its implementation on most submerged archaeological sites in more turbid conditions. Recent years have seen significant development of better digital photographic sensors and the improvement of optical technology, ideal for darker environments. Such developments, in tandem with powerful computing systems, have allowed underwater photogrammetry to be used by this research as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro Hero5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th Century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It proves that underwater photogrammetry can and should be used as one of the main recording methods, even in low light and poor underwater conditions, as a way to better understand the complexity of the underwater archaeological record.

Keywords: 4D modelling, Black Sea Maritime Archaeology Project, multi-source photogrammetry, low visibility underwater survey

Procedia PDF Downloads 238
212 Female Masochism, Jouissance, and (Re)workings of Trauma: An Ethnographic Study of the Bondage, Discipline, Dominance, Submission, Sadism, and Masochism Scene in Post-WWII Japan

Authors: Maari Sugawara

Abstract:

This ethnographic research interrogates female masochism within contemporary Japan, focusing on fifteen female BDSM (Bondage, Discipline, Dominance, Submission, Sadism, and Masochism) practitioners who identify as masochists, bottoms, and/or submissives. The study employs semi-structured interviews with these practitioners, representing diverse backgrounds and ages, to explore the intersection of sexuality and individual and/or collective trauma. The study focuses on a specific group of sadomasochists who, as survivors of gender and sexual violence, reenact their trauma through BDSM practices. This exploration draws on feminist performance studies, postcolonial studies, psychoanalysis, and affect analysis to highlight the complexities of female masochism. In a cultural milieu that often reduces female masochism to mere compliance with heteropatriarchy, this study argues that specific masochistic practices transcend submission, serving as vital strategies for confronting trauma and dismantling entrenched cultural narratives. Engaging with Lacan’s concept of feminine jouissance and the notion of "creative masochism" in the context of Japan's proximity to the imperial US, the study facilitates a nuanced exploration of female masochistic enjoyment. The study shows that these practices can act as both a means of survival and a mode of resilience, challenging dominant narratives that portray masochism solely as a form of subjugation, drawing on feminist performance studies, postcolonial studies, psychoanalysis, and affect analysis. It interprets masochism as a complex terrain of affective engagement, where shared suffering and consensual pain foster transformative possibilities. By analyzing BDSM as a cultural site, this research reframes masochism not only as a personal negotiation of pain but also as a broader allegory for Japan’s ongoing geopolitical self-positioning. Central to this analysis is the concept of "creative masochism," which positions masochism as both a metaphor and a practice through which Japan addresses its historical subordination to the United States. This framework allows for a deeper understanding of how participants' lived desires intersect with national narratives, illuminating the relationship between personal experiences and larger socio-political dynamics. It incorporates sadomasochistic metaphors into Japan-U.S. interactions, reflecting underlying patterns of submission, resistance, and cultural negotiation. Additionally, this research examines the effects, affects, and limitations of masochism within the post-WWII Japanese context, providing insights into how masochism can reshape one's relationship with their surroundings. This study challenges the notion that female masochism is entirely subsumed by hegemonic structures, revealing instead that subjects can assert their autonomy within their experiences of pleasure and pain. The consensual enactment of violence within these encounters emerges as a complex and ambivalent process, wherein pain transforms into a generative force for reimagining alternative forms of sociality and belonging. Additionally, the research identifies contradictions and connections between the personal and political, examining how kink practices shape participants' daily lives and identities, and vice versa, highlighting the profound impact of these practices on their sense of self and community. 
Ultimately, it reaffirms agency in the face of pervasive heteronormative power dynamics, suggesting that masochism can serve as a site of both resistance and redefinition.

Keywords: female masochism, BDSM, Japan, masochism, trauma, sexual violence

Procedia PDF Downloads 23
211 PsyVBot: Chatbot for Accurate Depression Diagnosis using Long Short-Term Memory and NLP

Authors: Thaveesha Dheerasekera, Dileeka Sandamali Alwis

Abstract:

The escalating prevalence of mental health issues, such as depression and suicidal ideation, is a matter of significant global concern. It is plausible that a variety of factors, such as life events, social isolation, and preexisting physiological or psychological health conditions, could instigate or exacerbate these conditions. Traditional approaches to diagnosing depression entail a considerable amount of time and necessitate the involvement of adept practitioners. This underscores the necessity for automated systems capable of promptly detecting and diagnosing symptoms of depression. The PsyVBot system employs sophisticated natural language processing and machine learning methodologies, including the use of the NLTK toolkit for dataset preprocessing and the utilization of a Long Short-Term Memory (LSTM) model. The PsyVBot exhibits a remarkable ability to diagnose depression with a 94% accuracy rate through the analysis of user input. Consequently, this resource proves to be efficacious for individuals, particularly those enrolled in academic institutions, who may encounter challenges pertaining to their psychological well-being. The PsyVBot employs a Long Short-Term Memory (LSTM) model that comprises a total of three layers, namely an embedding layer, an LSTM layer, and a dense layer. The stratification of these layers facilitates a precise examination of linguistic patterns that are associated with the condition of depression. The PsyVBot has the capability to accurately assess an individual's level of depression through the identification of linguistic and contextual cues. The task is achieved via a rigorous training regimen, which is executed by utilizing a dataset comprising information sourced from the subreddit r/SuicideWatch. The diverse data present in the dataset ensures precise and delicate identification of symptoms linked with depression, thereby guaranteeing accuracy. PsyVBot not only possesses diagnostic capabilities but also enhances the user experience through the utilization of audio outputs. This feature enables users to engage in more captivating and interactive interactions. The PsyVBot platform offers individuals the opportunity to conveniently diagnose mental health challenges through a confidential and user-friendly interface. Regarding the advancement of PsyVBot, maintaining user confidentiality and upholding ethical principles are of paramount significance. It is imperative to note that diligent efforts are undertaken to adhere to ethical standards, thereby safeguarding the confidentiality of user information and ensuring its security. Moreover, the chatbot fosters a conducive atmosphere that is supportive and compassionate, thereby promoting psychological welfare. In brief, PsyVBot is an automated conversational agent that utilizes an LSTM model to assess the level of depression in accordance with the input provided by the user. The demonstrated accuracy rate of 94% serves as a promising indication of the potential efficacy of employing natural language processing and machine learning techniques in tackling challenges associated with mental health. The reliability of PsyVBot is further improved by the fact that it makes use of the Reddit dataset and incorporates Natural Language Toolkit (NLTK) for preprocessing. PsyVBot represents a pioneering and user-centric solution that furnishes an easily accessible and confidential medium for seeking assistance. 
The present platform is offered as a modality to tackle the pervasive issue of depression and the contemplation of suicide.
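
A minimal sketch of the three-layer architecture described above (embedding, LSTM, and dense layers) in tf.keras is shown below; the vocabulary size, layer widths, and training call are assumptions, not the authors' exact configuration.

import tensorflow as tf

VOCAB_SIZE = 20000   # assumed vocabulary size (not stated in the abstract)

# Embedding -> LSTM -> Dense, mirroring the three-layer structure described above
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # probability that the text indicates depression
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Training would use tokenized, padded posts (e.g., preprocessed with NLTK):
# model.fit(x_train, y_train, validation_split=0.1, epochs=5, batch_size=32)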

Keywords: chatbot, depression diagnosis, LSTM model, natural language processing

Procedia PDF Downloads 71
210 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon

Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn

Abstract:

The dynamic changes of Land Use and Land Cover (LULC) in Yangon have generally resulted in the improvement of human welfare and economic development over the last twenty years. Mapping LULC is crucially important for the sustainable development of the environment. However, it is difficult to establish exactly how environmental factors influence the LULC situation at various scales, because the natural environment is composed of non-homogeneous surface features, so the features in the satellite data also contain mixed pixels. The main objective of this study is the calculation of accuracy based on change detection of LULC changes by Support Vector Machines (SVMs). For this research work, the main data were satellite images from 1996, 2006, and 2015. Change detection statistics were computed to compile a detailed tabulation of changes between two classification images, and the SVM process was applied with a soft approach at the allocation stage as well as at the testing stage to achieve higher accuracy. The results of this paper showed that vegetation and cultivated areas decreased (by an average total of 29% from 1996 to 2015) because of conversion to built-up area, which more than doubled (an average total of 30% from 1996 to 2015). The error matrix and confidence limits led to the validation of the result for LULC mapping.
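
The classification-and-tabulation workflow described above can be sketched as follows (Python with scikit-learn; the band values, class labels, and two-pixel "images" are invented for illustration).

import numpy as np
from sklearn.svm import SVC

# Hypothetical training samples: band values per pixel and LULC labels
X_train = np.array([[0.12, 0.30, 0.45], [0.40, 0.35, 0.20],
                    [0.10, 0.28, 0.50], [0.45, 0.38, 0.22]])
y_train = np.array(["vegetation", "built-up", "vegetation", "built-up"])

clf = SVC(kernel="rbf")   # probability=True would give the soft (per-class) allocation
clf.fit(X_train, y_train)

# Classify two dates, then tabulate from-to changes between the classified images
pixels_1996 = np.array([[0.11, 0.29, 0.48], [0.42, 0.36, 0.21]])
pixels_2015 = np.array([[0.44, 0.37, 0.23], [0.43, 0.36, 0.20]])
c1996, c2015 = clf.predict(pixels_1996), clf.predict(pixels_2015)

change = {}
for a, b in zip(c1996, c2015):
    change[(a, b)] = change.get((a, b), 0) + 1
print(change)   # e.g. {('vegetation', 'built-up'): 1, ('built-up', 'built-up'): 1}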

Keywords: land use and land cover change, change detection, image processing, support vector machines

Procedia PDF Downloads 140
209 Gulfnet: The Advent of Computer Networking in Saudi Arabia and Its Social Impact

Authors: Abdullah Almowanes

Abstract:

The speed of adoption of new information and communication technologies is often seen as an indicator of the growth of knowledge- and technological innovation-based regional economies. Indeed, technological progress and scientific inquiry in any society have undergone a particularly profound transformation with the introduction of computer networks. In the spring of 1981, the Bitnet network was launched to link thousands of nodes all over the world. In 1985, as one of the first adopters of Bitnet, Saudi Arabia launched a Bitnet-based network named Gulfnet that linked computer centers, universities, and libraries of Saudi Arabia and other Gulf countries through high-speed communication lines. In this paper, the origins and the deployment of Gulfnet are discussed, as well as the social, economic, political, and cultural ramifications of the new information reality created by the network. Despite its significance, the social and cultural aspects of Gulfnet have not previously been investigated to a satisfactory degree in the history of science and technology literature. The presented research is based on extensive archival work aimed at seeking out and analyzing primary evidence from archival sources and records. During its decade-and-a-half-long existence, Gulfnet demonstrated that the scope and functionality of public computer networks in Saudi Arabia had to be fine-tuned for compliance with the Islamic culture and political system of the country. It also helped lay the groundwork for the subsequent introduction of the Internet. Since the 1980s, in just a few decades, the proliferation of computer networks has transformed communications worldwide.

Keywords: Bitnet, computer networks, computing and culture, Gulfnet, Saudi Arabia

Procedia PDF Downloads 247
208 Reimagining Kinships: Queering the Labor of Care and Motherhood in Japan’s Rental Family Services

Authors: Maari Sugawara

Abstract:

This study investigates the constructed notion of “motherhood” and queered forms of care in contemporary Japan, focusing on rental family services. In Japan, the concept of motherhood is often equated with womanhood, reflecting a pervasive ideology that views motherhood as an essential aspect of a woman's societal role, particularly amidst economic recovery and an aging population. This study interrogates these gendered expectations by linking rental family services, particularly the role of rental mothers, to traditional caregiving roles. It critiques the gendered construction of domestic labor and aims to expand conceptions of alternative family structures and caregiving roles beyond normative frameworks. Emerging in the 1980s to provide companionship for the elderly, rental family services have evolved to meet diverse social needs, with paid actors fulfilling familial roles at various social events. Despite their growing prevalence, academic exploration of this phenomenon remains limited. This research aims to fill that gap by investigating the cultural, social, and economic factors fueling the popularity of rental family services and analyzing their implications for contemporary understandings of family dynamics and care labor in Japan. Furthermore, this study underscores the disproportionate domestic labor burden women in Japan bear, often managing time-intensive household tasks, which creates a "double burden" for those in full-time employment. Care work, including elderly and disability support, is undervalued and typically compensated at near-minimum wage levels, with women predominantly filling these low-wage roles. This gender disparity in Japan's care industry contributes to labor shortages in caregiving and childcare, highlighting broader structural inequities in the labor market. Through semi-structured qualitative interviews with fifteen rental mothers, this study investigates their experiences, motivations, role dynamics, and emotional labor. It critically examines whether the labor performed by rental family actors constitutes a subversive practice deserving of appropriate compensation. Utilizing a role-playing method, the author engages with rental mothers as if they were her own, reflecting the dynamics of compensated labor. This interaction delves into the economic and emotional aspects of constructed motherhood, facilitating a broader inquiry into the value of both productive and reproductive labor in Japan. The study also investigates the relationship between sex work and rental family services within the socio-economic landscape, recognizing the links between the welfare sector and female employment in legal sex work. Although distinct, these sectors merit joint consideration due to the commonality of male clients in both industries. This research engages with theoretical perspectives framing mobile sex work as inherently queer, directly challenging the dominance of heteronormativity. The agency exercised by sex workers complicates narratives of conformity and deviance, underscoring the need to reevaluate caregiving labor in both paid and unpaid contexts. Ultimately, this research critiques the intersection of gender, care, and labor in contemporary Japan by examining the undervaluation of traditional caregiving roles alongside the labor involved in rental family services. 
It challenges Japanese policies that equate womanhood with motherhood and explores the potential of viewing outsourced care as queered maternal and non-reproductive labor, advocating for the recognition of alternative family structures and non-reproductive forms of motherhood.

Keywords: motherhood, alternative family structures, carework, Japan, queer studies

Procedia PDF Downloads 19
207 Field-Free Orbital Hall Current-Induced Deterministic Switching in the Mo/Co₇₁Gd₂₉/Ru Structure

Authors: Zelalem Abebe Bekele, Kun Lei, Xiukai Lan, Xiangyu Liu, Hui Wen, Kaiyou Wang

Abstract:

Spin-polarized currents offer an efficient means of manipulating the magnetization of a ferromagnetic layer for big data and neuromorphic computing. Research has shown that the orbital Hall effect (OHE) can produce orbital currents, potentially surpassing the counter spin currents induced by the spin Hall effect. However, it is essential to note that orbital currents alone cannot exert torque directly on a ferromagnetic layer, necessitating a conversion process from orbital to spin currents. Here, we present an efficient method for achieving perpendicularly magnetized spin-orbit torque (SOT) switching by harnessing the localized orbital Hall current generated from a Mo layer within a Mo/CoGd device. Our investigation reveals a remarkable enhancement in the interface-induced planar Hall effect (PHE) within the Mo/CoGd bilayer, resulting in the generation of a z-polarized planar current for manipulating the magnetization of the CoGd layer without the need for an in-plane magnetic field. Furthermore, the Mo layer induces an out-of-plane orbital current, boosting the in-plane and out-of-plane spin polarization by converting the orbital current into spin current within the dual-property CoGd layer. At the optimal Mo layer thickness, a low critical magnetization switching current density of 2.51×10⁶ A cm⁻² is achieved. This breakthrough opens avenues for all-electrical, energy-efficient control of magnetization switching through orbital currents, advancing the field of spin-orbitronics.

Keywords: spin-orbit torque, orbital hall effect, spin hall current, orbital hall current, interface-generated planar hall current, anisotropic magnetoresistance

Procedia PDF Downloads 57
206 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved. Due to its key design principle, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security, and trust. Users are more interested in contents than in their communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not suitable for these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient as it depends on the physical location; for this reason, Information Centric Networking (ICN) is considered the potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented approach rather than a sender-oriented one. It introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, there is much criticism of it, mainly concerning how ICN will manage the most relevant content. Web Content Mining (WCM) approaches can help with the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 475
205 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (Fast Fourier Transform (FFT), Inverse Fast Fourier Transform (iFFT), and Bessel functions) is widely applied to obtain information with high precision and accuracy. Signal processing has a wide implementation on general-purpose processors. Our interest was focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of the single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements; general-purpose processors are not efficient for signal processing. We implemented the acoustic backscattered signal processing model on the Altera DE-SoC board and compared it to the Odroid XU4. By comparison, the computing latency of the Odroid XU4 and the FPGA is 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than with the Odroid XU4 implementation. The FPGA-based system of processing algorithms is realized with an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing. It is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in real time and with energy efficiency.
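
As an illustrative host-side reference of the processing chain (not the FPGA implementation), the following Python sketch runs an FFT on a synthetic backscattered echo, applies a Bessel-function weighting, and returns to the time domain with the inverse FFT; the pulse, sampling rate, target radius, and sound speed are all assumptions.

import numpy as np
from scipy.special import jv   # Bessel function of the first kind

fs = 1.0e6                         # assumed sampling rate (Hz)
t = np.arange(4096) / fs

# Hypothetical backscattered echo: a damped tone plus noise
echo = np.exp(-2.0e3 * t) * np.sin(2 * np.pi * 150e3 * t) + 0.01 * np.random.randn(t.size)

# FFT of the echo and its magnitude spectrum (computed in hardware on the FPGA)
spectrum = np.fft.rfft(echo)
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# Example use of a Bessel function as a scattering-related weighting (illustrative only)
ka = 2 * np.pi * freqs * 5e-3 / 1500.0          # assumed radius 5 mm, sound speed 1500 m/s
form_like = np.abs(spectrum) * np.abs(jv(0, ka))

# Back to the time domain with the inverse FFT
reconstructed = np.fft.irfft(spectrum, n=t.size)
print(np.allclose(reconstructed, echo, atol=1e-10))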

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 79
204 Bioethanol Production from Wild Sorghum (Sorghum arundinacieum) and Spear Grass (Heteropogon contortus)

Authors: Adeyinka Adesanya, Isaac Bamgboye

Abstract:

There is a growing need to develop processes to produce renewable fuels and chemicals due to the economic, political, and environmental concerns associated with fossil fuels. Lignocellulosic biomass is an excellent renewable feedstock because it is both abundant and inexpensive. This project aims at producing bioethanol from lignocellulosic plants (Sorghum arundinacieum and Heteropogon contortus) by biochemical means, computing the energy audit of the process, and determining the fuel properties of the produced ethanol. Acid pretreatment (0.5% H2SO4 solution) and enzymatic hydrolysis (using malted barley as the enzyme source) were employed. The ethanol yield of wild sorghum was found to be 20%, while that of spear grass was 15%. The fuel properties of the bioethanol from wild sorghum are 1.227 centipoise for viscosity, 1.10 g/cm3 for density, 0.90 for specific gravity, 78 °C for boiling point, and the cloud point was found to be below -30 °C. Those of spear grass were 1.206 centipoise for viscosity, 0.93 g/cm3 for density, 1.08 for specific gravity, 78 °C for boiling point, and the cloud point was also found to be below -30 °C. The energy audit shows that about 64% of the total energy was used up during pretreatment, while product recovery, which was done manually, demanded about 31% of the total energy. The total energy inputs for enzymatic hydrolysis, fermentation, and distillation were 1.95%, 1.49%, and 1.04%, respectively. The alcoholometric strength of the bioethanol from wild sorghum was found to be 47%, and that of the bioethanol from spear grass was 72%. Also, the energy efficiency of the bioethanol production for both grasses was 3.85%.

Keywords: lignocellulosic biomass, wild sorghum, spear grass, biochemical conversion

Procedia PDF Downloads 236
203 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy

Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay

Abstract:

The paper intends to highlight the significance of a Digital Manufacturing (DM) strategy in supporting the achievement of the business strategy and goals of any manufacturing organization. Towards this end, DM initiatives have been given a process perspective, while not undermining their technological significance, with a view to linking their benefits directly with the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model has been proposed to categorize digitally enabled organizational processes with a view to creating synergistic groups, which adopt and use digital tools having similar characteristics and functionalities. This will open up future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort has been made to apply the “what” and “how” features of the Quality Function Deployment (QFD) framework to establish the relationship between customers’ needs, both for external and internal customers, and the features of the various digital processes which support the achievement of these customer expectations. The paper finally concludes that in the present highly competitive environment, business organizations cannot thrive or sustain themselves unless they understand the significance of digital strategy and integrate it with their business strategy with a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders to appreciate its value propositions and its direct link to the organization’s competitiveness.

Keywords: knowledge management, cloud computing, knowledge management approaches, cloud-based knowledge management

Procedia PDF Downloads 310
202 Computerized Analysis of Phonological Structure of 10,400 Brazilian Sign Language Signs

Authors: Wanessa G. Oliveira, Fernando C. Capovilla

Abstract:

Capovilla and Raphael’s Libras Dictionary documents a corpus of 4,200 Brazilian Sign Language (Libras) signs. Duduchi and Capovilla’s software SignTracking permits users to retrieve signs even when ignoring the corresponding gloss and to discover the meaning of all 4,200 signs simply by clicking on graphic menus of the sign characteristics (phonemes). Duduchi and Capovilla have discovered that the ease with which any given sign can be retrieved is an inverse function of the average popularity of its component phonemes. Thus, signs composed of rare (distinct) phonemes are easier to retrieve than those composed of common phonemes. SignTracking offers a means of computing the average popularity of the phonemes that make up each one of the 4,200 signs. It provides a precise measure of the degree of ease with which signs can be retrieved and sign meanings can be discovered. Duduchi and Capovilla’s logarithmic model proved valid: the degree to which any given sign can be retrieved is an inverse function of the arithmetic mean of the logarithm of the popularity of each component phoneme. Capovilla, Raphael and Mauricio’s New Libras Dictionary documents a corpus of 10,400 Libras signs. The present analysis revealed the Libras DNA structure by mapping the incidence of 501 sign phonemes resulting from the layered distribution of five parameters: 163 handshape phonemes (CherEmes-ManusIculi); 34 finger shape phonemes (DactilEmes-DigitumIculi); 55 hand placement phonemes (ArtrotoToposEmes-ArticulatiLocusIculi); 173 movement dimension phonemes (CinesEmes-MotusIculi) pertaining to direction, frequency, and type; and 76 facial expression phonemes (MascarEmes-PersonalIculi).
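
The logarithmic retrievability model stated above can be expressed directly; the sketch below (Python, with made-up phoneme popularity counts) computes the arithmetic mean of the log-popularity of a sign's component phonemes, the quantity the model relates inversely to ease of retrieval.

import math

# Hypothetical phoneme popularity counts (how many of the 10,400 signs use each phoneme)
popularity = {"handshape_B": 950, "location_neutral": 3200, "movement_straight": 1800}

def mean_log_popularity(sign_phonemes, popularity):
    """Arithmetic mean of the log-popularity of the sign's component phonemes.
    Per the logarithmic model above, retrievability decreases as this mean grows."""
    logs = [math.log10(popularity[p]) for p in sign_phonemes]
    return sum(logs) / len(logs)

sign = ["handshape_B", "location_neutral", "movement_straight"]
print(f"mean log10 popularity = {mean_log_popularity(sign, popularity):.3f}")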

Keywords: Brazilian sign language, lexical retrieval, libras sign, sign phonology

Procedia PDF Downloads 346
201 Cryptographic Resource Allocation Algorithm Based on Deep Reinforcement Learning

Authors: Xu Jie

Abstract:

As a key network security method, cryptographic services must fully cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workloads. Their complexity and dynamics also make it difficult for traditional static security policies to cope with the ever-changing landscape of cyber threats and environments. Traditional resource scheduling algorithms are inadequate when facing complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective job-flow scheduling problem and using a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path-planning length, system delay, and network load balancing, and effectively solves the problem of complex resource scheduling in cryptographic services.
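
For illustration only, the sketch below uses plain single-agent tabular Q-learning (a simplification of the multi-agent method described above) to learn where to place cryptographic jobs; the nodes, energy costs, and reward weights are invented.

import random

SERVERS = [0, 1, 2]                 # hypothetical cryptographic service nodes
ENERGY = [1.0, 0.6, 0.9]            # assumed per-job energy cost of each node
alpha, gamma, epsilon = 0.1, 0.9, 0.2
Q = {}                              # Q[(state, action)] -> estimated value

def q(s, a):
    return Q.get((s, a), 0.0)

for episode in range(2000):
    load = [0, 0, 0]
    for _ in range(30):                                 # 30 jobs per episode
        s = tuple(load)
        if random.random() < epsilon:                   # explore
            a = random.choice(SERVERS)
        else:                                           # exploit
            a = max(SERVERS, key=lambda x: q(s, x))
        load[a] += 1
        # Reward favours low energy use and balanced load (weights are assumptions)
        reward = -(0.7 * ENERGY[a] + 0.3 * (max(load) - min(load)))
        s2 = tuple(load)
        best_next = max(q(s2, x) for x in SERVERS)
        Q[(s, a)] = q(s, a) + alpha * (reward + gamma * best_next - q(s, a))

# Greedy allocation of the first job after training
print("first job goes to node", max(SERVERS, key=lambda x: q((0, 0, 0), x)))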

Keywords: cloud computing, cryptography on-demand service, reinforcement learning, workflow scheduling

Procedia PDF Downloads 18
200 Transforming Healthcare with Immersive Visualization: An Analysis of Virtual and Holographic Health Information Platforms

Authors: Hossein Miri, Zhou YongQi, Chan Bormei-Suy

Abstract:

The development of advanced technologies and innovative solutions has opened up exciting new possibilities for revolutionizing healthcare systems. One such emerging concept is the use of virtual and holographic health information platforms that aim to provide interactive and personalized medical information to users. This paper provides a review of notable virtual and holographic health information platforms. It begins by highlighting the need for information visualization and 3D representation in healthcare. It then proceeds to provide background knowledge on information visualization and historical developments in 3D visualization technology. Additional domain knowledge concerning holography, holographic computing, and mixed reality is then introduced, followed by highlighting some of their common applications and use cases. After setting the scene and defining the context, the need for and importance of virtual and holographic visualization in medicine are discussed. Subsequently, some of the current research areas and applications of digital holography and holographic technology are explored, alongside the importance and role of virtual and holographic visualization in genetics and genomics. An analysis of the key principles and concepts underlying virtual and holographic health information systems is presented, and their potential implications for healthcare are pointed out. The paper concludes by examining the most notable existing mixed-reality applications and systems that help doctors visualize diagnostic and genetic data and assist in patient education and communication. This paper is intended to be a valuable resource for researchers, developers, and healthcare professionals who are interested in the use of virtual and holographic technologies to improve healthcare.

Keywords: virtual, holographic, health information platform, personalized interactive medical information

Procedia PDF Downloads 89
199 Development of an Integrated Route Information Management Software

Authors: Oluibukun G. Ajayi, Joseph O. Odumosu, Oladimeji T. Babafemi, Azeez Z. Opeyemi, Asaleye O. Samuel

Abstract:

The need for the complete automation of every procedure of surveying, and most especially its engineering applications, cannot be overemphasized, due to the many demerits of the conventional manual or analogue approach. This paper presents the summarized details of the development of Route Information Management (RIM) software. The software, codenamed ‘AutoROUTE’, was coded using the Microsoft Visual Studio Visual Basic package, and it offers complete automation of the computational procedures and plan production involved in route surveying. It was tested using route survey data (longitudinal profile and cross sections) of a 2.7 km road which stretches from Dama to Lunko village in Minna, Niger State, acquired with the aid of a Hi-Target DGPS receiver. The developed software (AutoROUTE) is capable of computing the various simple curve parameters, the horizontal curve, and the vertical curve, and it can also plot the road alignment, longitudinal profile, and cross-sections, with the capability to store these in the SQL database incorporated into the Microsoft Visual Basic software. The plans plotted with AutoROUTE were compared with the plans produced with the conventional AutoCAD Civil 3D software, and AutoROUTE proved to be more user-friendly and accurate because it plots to three decimal places whereas AutoCAD plots to two decimal places. It was also discovered that the AutoROUTE software is faster in plotting and the stages involved are less cumbersome compared to AutoCAD Civil 3D.
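
The simple-curve computations that AutoROUTE automates follow the standard textbook formulas; a sketch in Python (not the Visual Basic source) is shown below, with an arbitrary radius and deflection angle.

import math

def simple_curve_elements(radius, deflection_deg):
    """Standard elements of a simple circular curve from radius R and
    deflection (intersection) angle, using the usual textbook formulas."""
    d = math.radians(deflection_deg)
    return {
        "tangent length T":    radius * math.tan(d / 2),
        "curve length L":      radius * d,
        "long chord C":        2 * radius * math.sin(d / 2),
        "external distance E": radius * (1 / math.cos(d / 2) - 1),
        "middle ordinate M":   radius * (1 - math.cos(d / 2)),
    }

for name, value in simple_curve_elements(radius=300.0, deflection_deg=40.0).items():
    print(f"{name}: {value:.3f} m")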

Keywords: automated systems, cross sections, curves, engineering construction, longitudinal profile, route surveying

Procedia PDF Downloads 149
198 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management

Authors: Kenneth Harper

Abstract:

The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks using Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
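
The cryptographic-hashing-for-lineage idea can be sketched in a few lines; the following Python example (illustrative only, not the platform's actual record format) chains SHA-256 content hashes so that each data version points to its predecessor.

import hashlib, json, time

def content_hash(payload):
    """SHA-256 digest used as a content identifier (in the spirit of IPFS content addressing)."""
    return hashlib.sha256(payload).hexdigest()

def lineage_record(payload, parent_hash=None):
    """Append-only lineage entry linking each data version to its predecessor."""
    rec = {
        "data_hash": content_hash(payload),
        "parent": parent_hash,
        "timestamp": time.time(),
    }
    rec["record_hash"] = content_hash(json.dumps(rec, sort_keys=True).encode())
    return rec

v1 = lineage_record(b'{"patient": "A-001", "hb": 13.2}')
v2 = lineage_record(b'{"patient": "A-001", "hb": 12.9}', parent_hash=v1["record_hash"])
print(v2["parent"] == v1["record_hash"])   # True: tamper-evident chain of versions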

Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups

Procedia PDF Downloads 28
197 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method

Authors: Angel G. De Leon Hernandez

Abstract:

A structure is defined as a physical system or, in certain cases, an arrangement of connected elements, capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles, and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic, and maintainable system, considering the constraints imposed in every case. With the advances in technology during the last decades, the capabilities for solving engineering problems have increased enormously. Nowadays, computers play a critical role in structural analysis; regrettably, for university students the vast majority of these software packages are inaccessible due to the high complexity and cost they represent, even when the software manufacturers offer student versions. This is exactly the reason for the idea of developing a more accessible and easy-to-use computing tool. This program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument to achieve a better understanding of this area and to avoid tedious calculations. The program can also be useful for graduated engineers in the field of structural design and analysis. A graphical user interface is included in the program to make it even simpler to operate and to understand the information requested and the results obtained. The present document includes the theoretical basis on which the program solves the structural analysis, the logical path followed in order to develop the program, the theoretical results, a discussion of the results, and the validation of those results.
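
The core of the stiffness matrix method the program implements can be sketched as follows; this is a Python/NumPy illustration (the program itself is written in MATLAB), with an invented three-node truss, modulus, and cross-sectional area.

import numpy as np

def element_stiffness(E, A, x1, y1, x2, y2):
    """Global-axis stiffness matrix (4x4) of a 2D truss bar with modulus E and area A."""
    L = np.hypot(x2 - x1, y2 - y1)
    c, s = (x2 - x1) / L, (y2 - y1) / L
    return (E * A / L) * np.array([[ c*c,  c*s, -c*c, -c*s],
                                   [ c*s,  s*s, -c*s, -s*s],
                                   [-c*c, -c*s,  c*c,  c*s],
                                   [-c*s, -s*s,  c*s,  s*s]])

# Assemble two bars meeting at node 2 of a 3-node truss (nodes 0, 1, 2 -> 6 DOFs)
nodes = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.5]])
elements = [(0, 2), (1, 2)]
K = np.zeros((6, 6))
for (i, j) in elements:
    (x1, y1), (x2, y2) = nodes[i], nodes[j]
    k = element_stiffness(200e9, 5e-4, x1, y1, x2, y2)   # assumed steel E and 5 cm^2 area
    dofs = [2*i, 2*i + 1, 2*j, 2*j + 1]
    K[np.ix_(dofs, dofs)] += k

# Fix nodes 0 and 1, apply a 10 kN downward load at node 2, solve for the free DOFs
free = [4, 5]
F = np.array([0.0, -10e3])
u = np.linalg.solve(K[np.ix_(free, free)], F)
print("displacements at node 2 (m):", u)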

Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming

Procedia PDF Downloads 122
196 Efficient DNN Training on Heterogeneous Clusters with Pipeline Parallelism

Authors: Lizhi Ma, Dan Liu

Abstract:

Pipeline parallelism has been widely used to accelerate distributed deep learning to alleviate GPU memory bottlenecks and to ensure that models can be trained and deployed smoothly under limited graphics memory conditions. However, in highly heterogeneous distributed clusters, traditional model partitioning methods are not able to achieve load balancing. The overlap of communication and computation is also a big challenge. In this paper, HePipe is proposed, an efficient pipeline parallel training method for highly heterogeneous clusters. According to the characteristics of the neural network model pipeline training task, oriented to the 2-level heterogeneous cluster computing topology, a training method based on the 2-level stage division of neural network modeling and partitioning is designed to improve the parallelism. Additionally, a multi-forward 1F1B scheduling strategy is designed to accelerate the training time of each stage by executing the computation units in advance to maximize the overlap between the forward propagation communication and backward propagation computation. Finally, a dynamic recomputation strategy based on task memory requirement prediction is proposed to improve the fitness ratio of task and memory, which improves the throughput of the cluster and solves the memory shortfall problem caused by memory differences in heterogeneous clusters. The empirical results show that HePipe improves the training speed by 1.6×−2.2× over the existing asynchronous pipeline baselines.

Keywords: pipeline parallelism, heterogeneous cluster, model training, 2-level stage partitioning

Procedia PDF Downloads 19
195 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, the Laws’ texture filter, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III or IV, was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features by using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws’ texture filters. The feature selection method employed was sequential forward selection (SFS). Selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (with the one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with SVM one vs. one. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in the automatic classification of tumor stage and subtype.
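
A rough Python sketch of the GLCM feature extraction and classification stage is given below (the study used MATLAB; the ROIs, labels, and the scikit-image/scikit-learn calls are illustrative assumptions, and the GLCM functions are named greycomatrix/greycoprops in older scikit-image releases).

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

def glcm_features(roi_8bit):
    """A few GLCM texture descriptors for one tumour ROI (quantized to 8-bit)."""
    glcm = graycomatrix(roi_8bit, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "homogeneity", "energy", "correlation")]

# Hypothetical ROIs and labels (1 = ADC, 0 = SqCC); real work would use FDG-PET ROIs
rng = np.random.default_rng(1)
rois = [rng.integers(0, 256, (16, 16), dtype=np.uint8) for _ in range(20)]
labels = rng.integers(0, 2, 20)

X = np.array([glcm_features(r) for r in rois])
svm = SVC(kernel="rbf").fit(X, labels)
knn = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print(svm.predict(X[:3]), knn.predict(X[:3]))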

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 326
194 Continuous FAQ Updating for Service Incident Ticket Resolution

Authors: Kohtaroh Miyamoto

Abstract:

As enterprise computing becomes more and more complex, the costs and technical challenges of IT system maintenance and support are increasing rapidly. One popular approach to managing IT system maintenance is to prepare and use an FAQ (Frequently Asked Questions) system to manage and reuse systems knowledge. Such an FAQ system can help reduce the resolution time for each service incident ticket. However, there is a major problem where over time the knowledge in such FAQs tends to become outdated. Much of the knowledge captured in the FAQ requires periodic updates in response to new insights or new trends in the problems addressed in order to maintain its usefulness for problem resolution. These updates require a systematic approach to define the exact portion of the FAQ and its content. Therefore, we are working on a novel method to hierarchically structure the FAQ and automate the updates of its structure and content. We use structured information and the unstructured text information with the timelines of the information in the service incident tickets. We cluster the tickets by structured category information, by keywords, and by keyword modifiers for the unstructured text information. We also calculate an urgency score based on trends, resolution times, and priorities. We carefully studied the tickets of one of our projects over a 2.5-year time period. After the first 6 months, we started to create FAQs and confirmed they improved the resolution times. We continued observing over the next 2 years to assess the ongoing effectiveness of our method for the automatic FAQ updates. We improved the ratio of tickets covered by the FAQ from 32.3% to 68.9% during this time. Also, the average time reduction of ticket resolution was between 31.6% and 43.9%. Subjective analysis showed more than 75% reported that the FAQ system was useful in reducing ticket resolution times.
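
One way the text-based clustering and urgency scoring described above might be sketched is shown below (Python, with invented tickets, resolution times, priorities, and scoring weights; the actual method also uses structured category fields and timelines).

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

tickets = [
    "batch job fails with out of memory error on node 12",
    "login page times out for external users",
    "out of memory error in nightly batch after last patch",
    "users cannot log in, session timeout too short",
]
resolution_hours = np.array([8.0, 2.0, 10.0, 3.0])
priority = np.array([1, 2, 1, 3])          # 1 = highest

# Cluster tickets by their unstructured text
X = TfidfVectorizer(stop_words="english").fit_transform(tickets)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Per-cluster urgency: combine ticket volume, resolution time and priority (weights assumed)
for c in set(clusters):
    idx = clusters == c
    urgency = 0.5 * idx.sum() + 0.3 * resolution_hours[idx].mean() + 0.2 / priority[idx].mean()
    print(f"cluster {c}: tickets={idx.sum()}, urgency score={urgency:.2f}")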

Keywords: FAQ system, resolution time, service incident tickets, IT system maintenance

Procedia PDF Downloads 340
193 DLtrace: Toward Understanding and Testing Deep Learning Information Flow in Deep Learning-Based Android Apps

Authors: Jie Zhang, Qianyu Guo, Tieyi Zhang, Zhiyong Feng, Xiaohong Li

Abstract:

With the widespread popularity of mobile devices and the development of artificial intelligence (AI), deep learning (DL) has been extensively applied in Android apps. Compared with traditional Android apps (traditional apps), deep learning-based Android apps (DL-based apps) need to use more third-party application programming interfaces (APIs) to complete complex DL inference tasks. However, existing methods (e.g., FlowDroid) for detecting sensitive information leakage in Android apps cannot be directly used to detect DL-based apps, as they have difficulty detecting third-party APIs. To solve this problem, we design DLtrace, a new static information flow analysis tool that can effectively recognize third-party APIs. With our proposed trace and detection algorithms, DLtrace can also efficiently detect privacy leaks caused by sensitive APIs in DL-based apps. Moreover, using DLtrace, we summarize the non-sequential characteristics of DL inference tasks in DL-based apps and the specific functionalities provided by DL models for such apps. We propose two formal definitions to deal with the common polymorphism and anonymous inner-class problems in the Android static analyzer. We conducted an empirical assessment with DLtrace on 208 popular DL-based apps in the wild and found that 26.0% of the apps suffered from sensitive information leakage. Furthermore, DLtrace has a more robust performance than FlowDroid in detecting and identifying third-party APIs. The experimental results demonstrate that DLtrace extends FlowDroid in understanding DL-based apps and detecting security issues therein.

Keywords: mobile computing, deep learning apps, sensitive information, static analysis

Procedia PDF Downloads 179
192 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangean relaxation of the integrality constraints. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 372
191 DWDM Network Implementation in the Honduran Telecommunications Company "Hondutel"

Authors: Tannia Vindel, Carlos Mejia, Damaris Araujo, Carlos Velasquez, Darlin Trejo

Abstract:

DWDM (Dense Wavelength Division Multiplexing) is growing steadily around the world, driven by consumer demand. This growth creates the need for a system capable of expanding the communications of an entire nation and improving the computing capabilities of its society according to its customs and geographical conditions. The Honduran Telecommunications Company (HONDUTEL) currently provides internet and data transport services over PDH and SDH, which, in the Republic of Honduras C.A., remain a viable option for consumers in terms of price and ease of acquisition; however, these technologies lag behind the state of the art, limiting long-term socio-economic development in comparison with other countries in the region and hindering competition among telecommunications operators. For that reason, we propose adopting DWDM, a technology already deployed in Europe, to provide broadband data transport in Honduras; in this way, the company can offer a stable, high-quality service that allows it to compete in a globalized world. Once implemented, DWDM builds upon existing resources, such as the installed equipment, and opens a new stage that strengthens the business image of the Republic of Honduras C.A. as a nation by ensuring reliable data transport and broadband internet. The benefits accrue first to existing customers and then to all public and private institutions that require such services.
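To make the capacity argument concrete, the back-of-the-envelope comparison below uses typical, assumed figures for an SDH line rate and a C-band DWDM system; these are not parameters of HONDUTEL's actual network.

```python
# Rough single-fibre capacity comparison under assumed, typical figures.
stm16_rate_gbps = 2.5          # a common SDH line rate (STM-16); assumed baseline
dwdm_channels = 40             # typical channel count of a C-band DWDM system (assumed)
per_channel_rate_gbps = 10.0   # common per-wavelength rate (assumed)

sdh_capacity = stm16_rate_gbps
dwdm_capacity = dwdm_channels * per_channel_rate_gbps
print(f"Single-fibre capacity: SDH STM-16 = {sdh_capacity} Gb/s, "
      f"DWDM = {dwdm_capacity} Gb/s ({dwdm_capacity / sdh_capacity:.0f}x)")
```

Even under conservative assumptions, multiplexing many wavelengths on one fibre yields two orders of magnitude more capacity than a single SDH line rate, which is the motivation for the migration described above.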

Keywords: demultiplexers, light detectors, multiplexers, optical amplifiers, optical fibers, PDH, SDH

Procedia PDF Downloads 265
190 Design and Implementation of Low-code Model-building Methods

Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu

Abstract:

This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. Through an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. Once training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency but also enhances the flexibility of algorithm integration and simplifies model deployment. The core strengths of the method are its ease of use and efficiency: users do not need a deep programming background and can design and implement complex models through simple drag-and-drop operations. This greatly broadens access to AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the models. In the experimental part, we ran several performance tests on the method. The results show that, compared with traditional model-construction approaches, this method uses computing resources more efficiently and greatly shortens model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability and can adapt to the needs of different application scenarios.
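A minimal sketch of the general pattern is shown below: a component graph produced by a visual editor is compiled into an executable training pipeline, and the trained model is wrapped behind a callable service function. The component registry, graph format, and scikit-learn building blocks are assumptions chosen for illustration; they are not the authors' system.

```python
# Minimal sketch (assumed design): canvas graph -> executable pipeline -> callable model service.
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Hypothetical output of the visual editor: component ids in connection order.
canvas_graph = ["standard_scaler", "logistic_regression"]

REGISTRY = {  # maps low-code component names to concrete implementations
    "standard_scaler": StandardScaler,
    "logistic_regression": LogisticRegression,
}

def compile_pipeline(graph):
    """Turn the connected components into a runnable training pipeline."""
    return Pipeline([(name, REGISTRY[name]()) for name in graph])

X, y = make_classification(n_samples=200, n_features=8, random_state=0)
model = compile_pipeline(canvas_graph).fit(X, y)

def model_service(features):
    """Auto-generated callable API: takes feature rows, returns predictions."""
    return model.predict(features).tolist()

print(model_service(X[:3]))
```

In a production setting, the generated service would typically be exposed over HTTP rather than as an in-process function, but the compile-then-serve flow is the same.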

Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment

Procedia PDF Downloads 31
189 Comparative Evaluation of Vanishing Interfacial Tension Approach for Minimum Miscibility Pressure Determination

Authors: Waqar Ahmad Butt, Gholamreza Vakili Nezhaad, Ali Soud Al Bemani, Yahya Al Wahaibi

Abstract:

Minimum miscibility pressure (MMP) plays a major role in determining the displacement efficiency of different gas injection processes. Experimental techniques for MMP determination include the industrially recommended slim tube, vanishing interfacial tension (VIT), and the rising bubble apparatus (RBA). In this paper, an MMP measurement study using the slim-tube and VIT experimental techniques for two different crude oil samples (M and N), both in live and stock-tank oil forms, is presented. VIT-measured MMP values for both 'M' and 'N' live crude oils were close to the slim-tube-determined values, with 6.4% and 5% deviation, respectively. For both oil samples in stock-tank form, however, the VIT-measured MMP showed an unacceptably large deviation from the slim-tube value. This larger difference appears to be related to the high heavier-ends fraction of the stabilized crude oil and the lack of multiple-contact miscibility. None of the nine crude oil-CO2 MMP correlations deployed produced a reliable MMP close to the slim-tube-determined value. Since the VIT-determined MMP values for both live crude oils closely match the slim-tube values, VIT is confirmed as a reliable, reproducible, rapid, and inexpensive alternative for live crude oil MMP determination. VIT-based MMP determination for the stock-tank oil case, on the other hand, requires further investigation of the stabilization/destabilization mechanism of the oil's heavier ends and of multiple-contact miscibility development.
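The core of the VIT technique is extrapolating measured oil-gas interfacial tension to zero as a function of pressure; the pressure at which the fitted trend vanishes is taken as the MMP estimate. The sketch below shows that extrapolation step with made-up data and a simple linear fit; real studies may use several measurements, gas enrichment levels, and non-linear trends.

```python
import numpy as np

# Hypothetical VIT data: interfacial tension (mN/m) measured at increasing pressures (MPa).
pressure = np.array([10.0, 14.0, 18.0, 22.0])
ift = np.array([4.8, 3.4, 2.1, 0.9])

# Extrapolate the IFT-vs-pressure trend to zero IFT to obtain the VIT estimate of MMP.
slope, intercept = np.polyfit(pressure, ift, 1)
mmp_estimate = -intercept / slope
print(f"VIT-extrapolated MMP ~ {mmp_estimate:.1f} MPa")
```

When heavier ends stabilize at the interface, as discussed above for stock-tank oil, the measured IFT trend no longer reflects multiple-contact miscibility, which is why the extrapolated value can deviate badly from the slim-tube MMP.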

Keywords: minimum miscibility pressure, interfacial tension, multiple contacts miscibility, heavier ends

Procedia PDF Downloads 269
188 A Survey of Field Programmable Gate Array-Based Convolutional Neural Network Accelerators

Authors: Wei Zhang

Abstract:

With the rapid development of deep learning, neural networks and deep learning algorithms play a significant role in various practical applications. Owing to their high accuracy and good performance, Convolutional Neural Networks (CNNs) in particular have become a research hot spot in the past few years. However, the size of these networks keeps growing to meet the demands of practical applications, which makes constructing high-performance implementations of deep neural networks a significant challenge. Meanwhile, many of these application scenarios also impose strict requirements on the performance and power consumption of hardware devices. It is therefore particularly important to choose a suitable computing platform for hardware acceleration of CNNs. This article surveys recent advances in Field Programmable Gate Array (FPGA)-based acceleration of CNNs. Various designs and implementations of FPGA-based accelerators across different devices and network models are reviewed, and alternatives based on Graphics Processing Units (GPUs), Application-Specific Integrated Circuits (ASICs), and Digital Signal Processors (DSPs) are compared, accompanied by our own critical analysis and comments. Finally, we discuss these acceleration and optimization methods on FPGA platforms from different perspectives, explore the opportunities and challenges for future research, and offer an outlook on the future development of FPGA-based accelerators.
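Because platform choice hinges on throughput and power trade-offs, a rough first-order peak-throughput estimate is often used when sizing an FPGA accelerator. The sketch below uses assumed, hypothetical device figures (DSP count, clock frequency, utilization), not results from any of the surveyed works, and counts each DSP slice as one multiply-accumulate (two operations) per cycle.

```python
# First-order peak-throughput estimate for an FPGA CNN accelerator (assumed figures only).
dsp_slices = 2520      # hypothetical DSP slice count of a mid-range device
clock_mhz = 200        # assumed achievable clock frequency
utilization = 0.8      # assumed fraction of DSPs kept busy by the accelerator

peak_gops = 2 * dsp_slices * clock_mhz * 1e6 * utilization / 1e9
print(f"Estimated peak throughput: {peak_gops:.0f} GOPS")

# A CNN requiring ~4 GOP per image would then run at roughly:
gop_per_image = 4.0
print(f"~{peak_gops / gop_per_image:.0f} images/s at full efficiency")
```

Real designs fall below this bound because of memory bandwidth limits and control overhead, which is precisely the gap the optimization methods surveyed here aim to close.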

Keywords: deep learning, field programmable gate array, FPGA, hardware accelerator, convolutional neural networks, CNN

Procedia PDF Downloads 129