Search results for: computational vision
1076 Development of Beeswax-Discharge Writing Material for Visually Impaired Persons
Authors: K. Doi, T. Nishimura, H. Fujimoto, T. Tanaka
Abstract:
It is well known that visually impaired persons have difficulty accessing visual information. Information accessibility for visually impaired persons is therefore very important in the current information society. Application software with read-aloud functions for personal computers and smartphones is becoming more and more popular among visually impaired persons around the world. On the other hand, it is also very important to be able to learn how to read and write characters, both Braille and visual characters. The Braille typewriter has been widely used in learning Braille, and raised-line drawing kits have been used as writing materials for decades, especially by persons with acquired visual impairment. However, these kits have drawbacks: the drawn lines cannot be erased, and the visibility of the drawn lines is not good for visually impaired persons with low vision. We received a significant number of requests to develop a new writing material, especially for persons with acquired visual impairment, to replace raised-line drawing kits. To conduct this development research, we received a research grant from the Ministry of Health, Labour and Welfare of the Japanese government. In this research, we developed pen- and pencil-type writing materials that discharge beeswax, as an alternative to conventional raised-line drawing kits. The writing material was equipped with a cartridge heater for melting the beeswax and a heat controller. When the user held down the pen tip on regular paper such as fine paper, the melted beeswax was discharged from a pen tip with a valve structure. The beeswax was discharged at a holding-down force of 100 gf, based on the results of our previous trial study. The pen tip was semispherical to reduce friction between the tip and the paper surface. We conducted a basic experiment to evaluate the influence of the curvature of the pen tip on ease of writing.
Concretely, the tip radii tested were 0.15, 0.35, 0.50, and 1.00 mm. Four interval scales were used as indexes of subjective assessment during writing: feeling of smooth pen motion, feeling of comfortable writing, sense of security, and feeling of writing fatigue. Ten subjects participated in this experiment. The results reveal that subjects could draw easily when the radius of the pen tip was 1.00 mm, and that lines drawn with the beeswax-discharge writing material were easy to perceive.
Keywords: beeswax-discharge writing material, raised-line drawing kits, visually impaired persons, pen tip
Procedia PDF Downloads 309
1075 Rethinking Urban Informality through the Lens of Inclusive Planning and Governance in Contemporary Cities: A Case Study of Johannesburg, South Africa
Authors: Blessings Masuku
Abstract:
Background: Considering that Africa is urbanizing faster than any other region globally, managing cities in the global South has become the centerpiece of the New Urban Agenda (i.e., a shared vision of how we rethink, rebuild, and manage our cities for a better and more sustainable future). This study centers on the governance and planning of urban informality practices, with particular reference to the relationship between the state, informal actors (e.g., informal traders and informal dwellers), and other city stakeholders who use public space (commuters, businesses, and environmental activists); how informal actors organize themselves to lobby the state and claim their rights in the city; and how they navigate their everyday livelihood strategies. Aim: The purpose of this study is to examine and interrogate contemporary approaches and the policy and regulatory frameworks for urban spatial planning and the management of informality in one of South Africa’s busiest major cities, Johannesburg. Setting: The study uses the metropolitan region of the City of Johannesburg, South Africa, to understand how this contemporary industrial city manages urban informality practices, including the use of public space, land zoning, and street life, taking a closer look at the progress made and the gaps in its inclusive urban policy frameworks. Methods: This study utilized a qualitative approach that includes surveys (open-ended questions), archival research (i.e., policy and other key document reviews), and key interviews, mainly with city officials and informal actors. A thematic analysis was used to analyze the data collected. Contribution: This study contributes to the large body of urban informality scholarship on global South cities by exploring how major cities, particularly in Africa, regulate and manage informality patterns and practices in their quest to build “utopian” smart cities.
This study also brings a different perspective on the hacking ways used by informal actors to resist harsh regulations and remain invisible in the city, something that previous literature has barely explored in depth.
Keywords: inclusive planning and governance, infrastructure systems, livelihood strategies, urban informality, urban space
Procedia PDF Downloads 72
1074 Application and Evaluation of Teaching-Learning Guides Based on Swebok for the Requirements Engineering Area
Authors: Mauro Callejas-Cuervo, Andrea Catherine Alarcon-Aldana, Lorena Paola Castillo-Guerra
Abstract:
The software industry requires highly trained professionals capable of performing the roles integrated in the software development cycle. That is why a large part of this task is the responsibility of higher education institutions, often through a curriculum established to orient the academic development of the students. Thus, nowadays there are different models that support proposals for improving curricula in the area of Software Engineering, such as ACM, IEEE, ABET, and Swebok, of which the last stands out, given that it manages and organises the knowledge of Software Engineering and offers a vision of its theoretical and practical aspects. Moreover, it has been applied by different universities seeking to achieve coverage of the different topics and to increase the professional quality of future graduates. This research presents the structure of teaching-learning guides built from training objectives and methodological strategies embedded in the learning levels of Bloom’s taxonomy, with which it is intended to improve the delivery of topics in the area of Requirements Engineering. These guides were implemented and validated in a Requirements Engineering course of the Systems and Computer Engineering programme at the Universidad Pedagógica y Tecnológica de Colombia (Pedagogical and Technological University of Colombia), using a four-stage methodology: definition of the evaluation model, implementation of the guides, guide evaluation, and analysis of the results. After the collection and analysis of the data, the results show that in six of the seven topics proposed in the Swebok guide, the percentage of students who obtained total marks within the 'High grade' level, that is, between 4.0 and 4.6 (on a scale of 0.0 to 5.0), was higher than the percentage of students who obtained marks within the 'Acceptable' range of 3.0 to 3.9.
In 86% of the topics and strategies proposed, the teaching-learning guides facilitated the students' comprehension, analysis, and articulation of concepts and processes; in particular, the guides strengthened the argumentative and interpretative competencies. The remaining 14% denotes the need to reinforce the strategies regarding the propositive competence, which presented the lowest average.
Keywords: pedagogic guide, pedagogic strategies, requirements engineering, Swebok, teaching-learning process
Procedia PDF Downloads 286
1073 Robust ResNets for Chemically Reacting Flows
Authors: Randy Price, Harbir Antil, Rainald Löhner, Fumiya Togashi
Abstract:
Chemically reacting flows are common in engineering applications such as hypersonic flow, combustion, explosions, manufacturing processes, and environmental assessments. The number of reactions in combustion simulations can exceed 100, putting a large number of flow and combustion problems beyond the capabilities of current supercomputers. Motivated by this, deep neural networks (DNNs) are introduced with the goal of eventually replacing the existing chemistry software packages. The DNNs used in this paper are motivated by the Residual Neural Network (ResNet) architecture. In the continuum limit, ResNets become an optimization problem constrained by an ODE, a feature that allows the use of ODE control techniques to enhance the DNNs. In this work, DNNs are constructed that update the species vector uⁿ at the nᵗʰ timestep to uⁿ⁺¹ at the (n+1)ᵗʰ timestep. Parallel DNNs are trained for each species, taking uⁿ as input and outputting one component of uⁿ⁺¹. These DNNs are applied to multiple species and reactions common in chemically reacting flows, such as H₂-O₂ reactions. Experimental results show that the DNNs are able to accurately replicate the dynamics in various situations and in the presence of errors.
Keywords: chemically reacting flows, computational fluid dynamics, ODEs, residual neural networks, ResNets
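The per-species update described in the abstract can be sketched as follows: one small residual network per species maps the full state uⁿ to one component of uⁿ⁺¹, with the skip connection h ← h + f(h) whose continuum limit is an ODE. Layer sizes and the random weights below are illustrative stand-ins, not the paper's trained networks.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

class SpeciesResNet:
    """Residual network mapping the full species state u^n to ONE
    component of u^{n+1}. Hidden width, depth, and weights are
    hypothetical; the paper's trained models are not reproduced."""

    def __init__(self, n_species, hidden=16, n_blocks=3, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, (hidden, n_species))
        self.blocks = [(rng.normal(0.0, 0.1, (hidden, hidden)),
                        rng.normal(0.0, 0.1, (hidden, hidden)))
                       for _ in range(n_blocks)]
        self.w_out = rng.normal(0.0, 0.1, hidden)

    def predict(self, u_n):
        h = relu(self.W_in @ u_n)
        for W1, W2 in self.blocks:
            # Residual update h <- h + f(h): the skip connection whose
            # continuum limit is the ODE constraint noted in the abstract.
            h = h + W2 @ relu(W1 @ h)
        return float(self.w_out @ h)

def step(nets, u_n):
    """Advance one timestep: each parallel network emits one component."""
    return np.array([net.predict(u_n) for net in nets])
```

A timestep is then `step([SpeciesResNet(n) for _ in range(n)], u_n)`, applied recursively to march the chemistry forward.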
Procedia PDF Downloads 120
1072 Oxygen Enriched Co-Combustion of Sub-Bituminous Coal/Biomass Waste Fuel Blends
Authors: Chaouki Ghenai
Abstract:
A Computational Fluid Dynamics analysis of the co-combustion of coal/biomass waste fuel blends is presented in this study. The main objective is to investigate the effects of the biomass portion (0%, 10%, 20%, 30% by weight) blended with coal and of the oxygen concentration (21% for air; 35%, 50%, 75%, and 100% for enrichment up to pure oxygen) on combustion performance and emissions. The goal is to reduce air emissions from coal combustion in power plants. Sub-bituminous Nigerian coal with a calorific value of 32.51 MJ/kg and sawdust (biomass) with a calorific value of 16.68 MJ/kg are used in this study. Co-combustion of the coal/biomass fuel blends is modeled using the mixture fraction/PDF approach for non-premixed combustion and Discrete Phase Modeling (DPM) to predict the trajectories and the heat/mass transfer of the fuel-blend particles. The results show the effects of oxygen concentration and biomass portion on the gas and particle temperatures, the flow field, the devolatilization and burnout rates inside the combustor, and the CO2 and NOx emissions at the combustor exit. The results obtained in the course of this study show the benefits of enriching combustion air with oxygen and of blending biomass waste with coal for reducing the harmful emissions from coal power plants.
Keywords: co-combustion, coal, biomass, fuel blends, CFD, air emissions
Procedia PDF Downloads 418
1071 A Method for Clinical Concept Extraction from Medical Text
Authors: Moshe Wasserblat, Jonathan Mamou, Oren Pereg
Abstract:
Natural Language Processing (NLP) has made a major leap in the last few years in its practical integration into medical solutions, for example, extracting clinical concepts such as medical conditions, medications, treatments, and symptoms from medical texts. However, training and deploying those models in real environments still demands a large amount of annotated data and NLP/Machine Learning (ML) expertise, which makes the process costly and time-consuming. We present a practical and efficient method for clinical concept extraction that requires neither costly labeled data nor ML expertise. The method includes three steps. Step 1: the user injects a large in-domain text corpus (e.g., PubMed); the system then builds, in an unsupervised manner, a contextual model containing vector representations of the concepts in the corpus (e.g., Phrase2Vec). Step 2: the user provides a seed set of terms representing a specific medical concept (e.g., for the concept of symptoms, the user may provide ‘dry mouth,’ ‘itchy skin,’ and ‘blurred vision’); the system then matches the seed set against the contextual model and extracts the most semantically similar terms (e.g., additional symptoms). The result is a complete set of terms related to the medical concept. Step 3: in production, medical concepts must be extracted from unseen medical text. The system extracts key phrases from the new text and matches them against the complete set of terms from step 2; the most semantically similar phrases are annotated with the same medical concept category. As an example, the seed symptom concepts would result in the following annotation: “The patient complains of fatigue [symptom], dry skin [symptom], and weight loss [symptom], which can be an early sign of diabetes.” Our evaluations show promising results for extracting concepts from medical corpora.
The method allows medical analysts to easily and efficiently build taxonomies (in step 2) representing their domain-specific concepts, and to automatically annotate a large number of texts (in step 3) for the classification/summarization of medical reports.
Keywords: clinical concepts, concept expansion, medical records annotation, medical records summarization
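Step 2 of the method, matching a seed set against a vector model by semantic similarity, can be sketched as below. The tiny 2-D toy vectors stand in for a Phrase2Vec model trained on a large in-domain corpus, and the term list is a hypothetical illustration, not data from the paper.

```python
import numpy as np

def expand_concept(seed_terms, vectors, top_k=1):
    """Return the terms most similar (by cosine) to the centroid of the
    seed set. `vectors` maps term -> embedding; here the embeddings are
    toy 2-D vectors chosen purely for illustration."""
    def unit(v):
        return v / np.linalg.norm(v)
    centroid = unit(sum(unit(vectors[t]) for t in seed_terms))
    scored = [(t, float(unit(v) @ centroid))
              for t, v in vectors.items() if t not in seed_terms]
    scored.sort(key=lambda pair: -pair[1])
    return [t for t, _ in scored[:top_k]]

# Toy model: symptom terms cluster in one direction, a medication elsewhere.
toy = {
    "dry mouth":  np.array([1.00, 0.10]),
    "itchy skin": np.array([0.90, 0.20]),
    "fatigue":    np.array([0.95, 0.15]),  # near the symptom cluster
    "metformin":  np.array([0.05, 1.00]),  # far from it
}
expanded = expand_concept(["dry mouth", "itchy skin"], toy)
```

With these toy vectors, 'fatigue' is returned as the nearest new symptom term, mirroring how the seed set grows into a complete concept taxonomy.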
Procedia PDF Downloads 135
1070 Methodical Approach for the Integration of a Digital Factory Twin into the Industry 4.0 Processes
Authors: R. Hellmuth
Abstract:
Current research on flexibility and adaptability in factory planning is oriented toward the machine and process level; factory buildings themselves are not the focus of current research. Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The adaptability of a factory can be divided into three types: spatial, organizational, and technical adaptability. Spatial adaptability indicates the ability to expand and reduce the size of a factory; here, the area-related 'breathing capacity' plays the essential role. It mainly concerns the factory site, the plant layout, and the production layout. The organizational ability to change enables the change and adaptation of organizational structures and processes. This includes the structural and process organization as well as logistical processes and principles. New and reconfigurable operating resources, processes, and factory buildings are referred to as technical adaptability. These three types of adaptability can be regarded independently of each other as undirected potentials of different characteristics. If there is a need for change, the types of changeability are combined in the change process to form a directed, complementary variable that makes change possible. When planning adaptability, importance must be attached to a balance between the types of adaptability. The vision of the intelligent factory building and the 'Internet of Things' presupposes the comprehensive digitalization of the spatial and technical environment. Through connectivity, the factory building must be empowered to support a company's value creation process by providing media such as light, electricity, heat, refrigeration, etc. In the future, communication with the surrounding factory building will take place on a digital or automated basis.
In Industry 4.0, the functions of the building envelope belong to secondary or even tertiary processes, but these processes must also be included in the communication cycle. An integrative view of continuous communication between primary, secondary, and tertiary processes is not yet available and is being developed with the aid of methods in this research work. A comparison of the digital twin from the point of view of production with that of the factory building will be developed. Subsequently, a tool will be elaborated to classify digital twins from the perspective of the data, the degree of visualization, and the trades. Thus, a contribution is made to better integrating the secondary and tertiary processes of a factory into its added value.
Keywords: adaptability, digital factory twin, factory planning, industry 4.0
Procedia PDF Downloads 156
1069 3D Interferometric Imaging Using Compressive Hardware Technique
Authors: Mor Diama L. O., Matthieu Davy, Laurent Ferro-Famil
Abstract:
In this article, inverse synthetic aperture radar (ISAR) is combined with compressive imaging techniques in order to perform 3D interferometric imaging. Interferometric ISAR (InISAR) imaging relies on a two-dimensional antenna array providing diversity in the elevation and azimuth directions. However, the signals measured over several antennas must be acquired by coherent receivers, resulting in costly and complex hardware. This paper proposes to use a chaotic cavity as a compressive device to encode the signals arising from several antennas into a single output port. These signals are then reconstructed by solving an inverse problem. Our approach is demonstrated experimentally with a 3-element L-shaped array connected to a metallic compressive enclosure. The interferometric phases estimated from a single broadband signal are used to jointly estimate the target’s effective rotation rate and the height of the dominant scattering centers of the target. Our experimental results show that the use of the compressive device does not adversely affect the performance of the imaging process. This study opens new perspectives for reducing the hardware complexity of high-resolution ISAR systems.
Keywords: interferometric imaging, inverse synthetic aperture radar, compressive device, computational imaging
Procedia PDF Downloads 160
1068 Synthesis, Characterization, Computational Study, Antimicrobial Evaluation, in Vivo Toxicity Study of Manganese (II) and Copper (II) Complexes with Derivative Sulfa-drug
Authors: Afaf Bouchoucha, Karima Si Larbi, Mohamed Amine Bourouaia, Salah Boulanouar, Safia Djabbar
Abstract:
The synthesis, characterization, and comparative biological study of manganese (II) and copper (II) complexes with a heterocyclic ligand used in the pharmaceutical field (Scheme 1) are reported. Two kinds of complexes were obtained with the sulfonamide derivative: [M(L)₂(H₂O)₂]·H₂O and [M(L)₂(Cl)₂]·3H₂O. These complexes were prepared and characterized by elemental analysis, FAB mass spectrometry, ESR, magnetic measurements, FTIR, UV-Visible spectra, and conductivity. Their stability constants were determined by potentiometric methods in a water-ethanol (90:10 v/v) mixture at an ionic strength of 0.2 mol L⁻¹ (NaCl) and at 25.0 ± 0.1 ºC using the Sirko program. DFT calculations were done using B3LYP/6-31G(d) and B3LYP/LanL2DZ. The antimicrobial activity of the ligand and complexes against Escherichia coli, P. aeruginosa, Klebsiella pneumoniae, S. aureus, Bacillus subtilis, Candida albicans, Candida tropicalis, Saccharomyces, Aspergillus fumigatus, and Aspergillus terreus was assessed and compared using the agar-diffusion method. The toxicity of the synthesized complexes was also evaluated in vivo using mice of the NMRI strain.
Keywords: heterocyclic ligand, complex, stability constant, antimicrobial activity, DFT, acute and genotoxicity study
Procedia PDF Downloads 121
1067 A Parametric Study on Aerodynamic Performance of Tyre Using CFD
Authors: Sowntharya L.
Abstract:
Aerodynamics is the most important factor when it comes to resistive forces such as lift, drag, and side forces acting on a vehicle. In passenger vehicles, reducing drag will not only unlock higher achievable speeds but will also reduce the fuel consumption of the vehicle. The tyres contribute significantly to the overall aerodynamics of the vehicle; hence, understanding the air-flow behaviour around the tyre is vital to optimizing aerodynamic performance early in the design process. Nowadays, aerodynamic simulation employing Computational Fluid Dynamics (CFD) is gaining importance as it reduces the number of physical wind-tunnel experiments during the vehicle development process. This research develops a methodology to predict the aerodynamic drag of a standalone tyre using a numerical CFD solver and validates it using a wind-tunnel experiment. A parametric study was carried out on tyres with different tread patterns (slick, circumferential-groove, and patterned) under stationary and rotating boundary conditions. In order to represent wheel rotation in contact with the ground, the moving reference frame (MRF) approach was used in this study. Aerodynamic parameters such as drag and lift, as well as the air-flow behaviour around the tyre, were simulated and compared with experimental results.
Keywords: aerodynamics, CFD, drag, MRF, wind-tunnel
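Drag from such a simulation or wind-tunnel run is normally reported as a dimensionless drag coefficient. A minimal reduction helper, assuming frontal area as the reference area for the tyre (the abstract does not state which reference area was used):

```python
def drag_coefficient(f_drag, rho, v, area):
    """Standard definition Cd = 2F / (rho * v^2 * A).
    f_drag: drag force [N]; rho: air density [kg/m^3];
    v: freestream speed [m/s]; area: reference area [m^2]
    (frontal area is assumed here)."""
    return 2.0 * f_drag / (rho * v ** 2 * area)

# Illustrative numbers: 20 N of drag on a 0.1 m^2 tyre at 30 m/s
# in sea-level air (rho = 1.225 kg/m^3).
cd = drag_coefficient(20.0, 1.225, 30.0, 0.1)
```

The same formula converts the solver's integrated force and the wind-tunnel balance reading to a common basis for the comparison described above.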
Procedia PDF Downloads 194
1066 Heat Transfer Enhancement Using Aluminium Oxide Nanofluid: Effect of the Base Fluid
Authors: M. Amoura, M. Benmoussa, N. Zeraibi
Abstract:
Flow and heat transfer are important phenomena in engineering systems due to their wide applications in electronic cooling, heat exchangers, double-pane windows, etc. The enhancement of heat transfer in these systems is an essential topic from an energy-saving perspective. The low heat transfer performance of conventional fluids such as water, engine oil, and ethylene glycol hinders performance improvements and the consequent reduction in the size of such systems. The use of solid particles as an additive suspended in the base fluid is one technique for heat transfer enhancement. Therefore, the heat transfer enhancement in a horizontal circular tube maintained at a constant temperature under a laminar regime has been investigated numerically. A computational code applying the finite volume method was developed for the problem. The nanofluid was made by dispersing Al2O3 nanoparticles in pure water and in ethylene glycol. Results illustrate that the suspended nanoparticles increase the heat transfer with an increase in the nanoparticle volume fraction over the considered range of Reynolds numbers. On the other hand, the heat transfer is very sensitive to the base fluid.
Keywords: Al2O3 nanoparticles, circular tube, heat transfer enhancement, numerical simulation
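Numerical nanofluid studies like this one need an effective-property closure for the particle-laden fluid. The Maxwell model is a classical choice for effective thermal conductivity; it is shown here for illustration, since the abstract does not state which property model the finite-volume code uses.

```python
def maxwell_k_eff(k_f, k_p, phi):
    """Maxwell effective thermal conductivity of a dilute suspension.
    k_f: base-fluid conductivity [W/m K]; k_p: particle conductivity
    [W/m K]; phi: particle volume fraction (0..~0.05 for dilute).
    A common nanofluid closure, not necessarily the paper's."""
    num = k_p + 2.0 * k_f + 2.0 * phi * (k_p - k_f)
    den = k_p + 2.0 * k_f - phi * (k_p - k_f)
    return k_f * num / den

# Illustrative values: Al2O3 (~40 W/m K) in water (~0.613 W/m K)
# at a 4% volume fraction.
k_eff = maxwell_k_eff(0.613, 40.0, 0.04)
```

At phi = 0 the model recovers the base-fluid conductivity, and increasing the volume fraction raises k_eff, consistent with the enhancement trend reported in the abstract.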
Procedia PDF Downloads 322
1065 Bathymetric Change of Brahmaputra River and Its Influence on Flooding Scenario
Authors: Arup Kumar Sarma, Rohan Kar
Abstract:
The development of a physical model of a river like the Brahmaputra, which finds its origin in the Chema Yundung glacier of Tibet and flows through India and Bangladesh, is always expensive and very time-consuming. With the advancement of computational techniques, mathematical modeling has found wide application. MIKE 21C is one such commercial package, developed by the Danish Hydraulic Institute (DHI): a depth-averaged, two-dimensional curvilinear finite-difference model capable of modeling hydrodynamic and morphological processes, with some limitations. The main purposes of this study are to generate the bathymetry of the River Brahmaputra from "Sadia" upstream to "Dhubri" downstream, a stretch of approximately 695 km, for four different years (1957, 1971, 1977, and 1981) over the grid generated in MIKE 21C, and to carry out hydrodynamic simulations for these years to analyze the effect of bathymetry change on the surface water elevation. The study has established that bathymetric change can influence the flood level significantly in some river reaches, and therefore the modification or updating of the regular bathymetry is essential for reliable flood routing in alluvial rivers.
Keywords: bathymetry, Brahmaputra River, hydrodynamic model, surface water elevation
Procedia PDF Downloads 455
1064 How Did a Blind Child Begin Understanding Her “Blind Self”?: A Longitudinal Analysis of Conversation between Her and Adults
Authors: Masahiro Nochi
Abstract:
This study explores the process by which a Japanese child with congenital blindness deepened her understanding of the condition of being “unable to see” and developed the idea of a “blind self,” despite having no direct experience of vision. The rehabilitation activities of the child, video-recorded from 1 to 6 years of age, were analyzed qualitatively. The duration of the video was about 80 hours. The recordings were transcribed verbatim, and the episodes in which the child used words related to the act of “looking” were extracted. Detailed transcripts were constructed referencing the notations of conversation analysis. Characteristics of the interactions in those episodes were identified and compared longitudinally. Results showed that the child used the word “look” under certain interaction patterns, and that her bodily expressions and interaction with adults developed in conjunction with the development of her language use. Four stages were identified. At the age of 1, interactions involving “look” began to occur. The child said “Look” in the sequence: the child’s “Look,” an adult’s “I’m looking,” a certain performance by the child, and the adult’s words of praise. At the age of 3, the child began to behave in accordance with the spatial attributes of the act of “looking,” such as turning her face toward the adult’s voice before saying “Look.” She also began to use the expression “Keep looking,” which seemed to reflect her understanding of the temporality of the act of “looking.” At the age of 4, the use of “Look” or “Keep looking” became three times more frequent. She also started to refer to the act of looking in the future, as in “Come and look at my puppy someday.” At the age of 5, she moved her hands toward the adults when she was holding something she wanted to show them. She seemed to understand that people could see an object more clearly when it was in close proximity.
About that time, she began to say “I cannot see” to her mother, which suggested a heightened understanding of her own blindness. The findings indicate that as she grew up, the child came to utilize nonverbal behavior before and after the order “Look” to make the progress of the interaction with adults even more certain. As a result, actions that reflect the characteristics of a sighted person’s visual experience were incorporated into the interaction chain. The purpose of “Look,” with which she at first intended to attract the adult’s attention, changed into a request for a confirmation she was unable to make herself. Such a change in word use, as well as in interaction with sighted adults, is considered to reflect her heightened self-awareness as someone who could not do what sighted people do easily. A blind child can gradually deepen their understanding of their own blindness among the sighted people around them, and can develop a “blind self” by learning how to interact with others even without direct visual experience.
Keywords: blindness, child development, conversation analysis, self-concept
Procedia PDF Downloads 121
1063 Prediction of Trailing-Edge Noise under Adverse-Pressure Gradient Effect
Authors: Li Chen
Abstract:
For an aerofoil or hydrofoil in high-Reynolds-number flows, broadband noise is generated efficiently as turbulence convects over the trailing edge. This noise can be related to the surface pressure fluctuations, which can be predicted either by CFD or by empirical models. In reality, however, the aerofoil or hydrofoil often operates at an angle of attack. In this situation, the flow is subjected to an adverse pressure gradient (APG), and as a result, flow separation may occur. This study assesses trailing-edge noise models for such flows. In the present work, the trailing-edge noise from a 2D airfoil at a 6-degree angle of attack is investigated. Under this condition, the flow experiences a strong APG and separates. The flow over the airfoil, with a chord of 300 mm, equivalent to a Reynolds number of 4×10⁵, is simulated using RANS with the SST k-ω turbulence model. The predicted surface pressure fluctuations are compared with published experimental data and empirical models, and show good agreement with the experimental data. The effect of the APG on the trailing-edge noise is discussed, and the associated trailing-edge noise is calculated.
Keywords: aero-acoustics, adverse-pressure gradient, computational fluid dynamics, trailing-edge noise
Procedia PDF Downloads 337
1062 COVID-19 Case: A Definition of Infodemia through Online Italian Journalism
Authors: Concetta Papapicco
Abstract:
The spread of the new coronavirus (COVID-19), in addition to becoming a global phenomenon following the declaration of a pandemic state, has generated excessive access to information, sometimes not thoroughly screened, which makes it difficult to navigate the topic because of the difficulty of finding reliable sources. As a result, there is a high level of contagion, understood both as the spread of the virus and as the spread of information in a viral and harmful way, which prompted the World Health Organization (WHO) to coin the term ‘Infodemia’ to name the phenomenon of excessive information. With this neologism, the WHO wanted, in these days when fear of the coronavirus is raging, to point out what is perhaps the greatest danger to global society in the age of social media: the distortion of reality in the rumble of echoes and comments of the global community on real or often invented facts. The general purpose of this exploratory study is to investigate how the coronavirus situation is described in journalistic communication. Taking La Repubblica online as a reference news outlet, the research aims specifically to understand how journalistic communication describes the spread of the COVID-19 virus, the spread of contagion, and the restrictive social-distancing measures in the Italian context. The study starts from the hypothesis that if the circulation of information helps to create a social representation of the phenomenon, the excessive accessibility of information sources (Infodemia) can be modulated by ‘how’ the phenomenon is described by journalists. The methodology proposed in this exploratory study is therefore a quanti-qualitative (mixed) method. A content analysis with the SketchEngine software is carried out first. In support of the content analysis, a diatextual analysis was carried out.
Diatextual analysis is a qualitative method used to detect, in the analyzed texts (the online articles of La Repubblica on the topic of the coronavirus), Subjectivity, Argumentativity, and Mode. The research focuses mainly on the ‘Mode’, that is, ‘how’ the events related to the coronavirus are narrated in La Repubblica’s online articles about the COVID-19 phenomenon. The results show the presence of contrasting visions of the COVID-19 situation in Italy.
Keywords: coronavirus, Italian infodemia, La Repubblica online, mixed method
Procedia PDF Downloads 122
1061 A Computational Investigation of Potential Drugs for Cholesterol Regulation to Treat Alzheimer’s Disease
Authors: Marina Passero, Tianhua Zhai, Zuyi (Jacky) Huang
Abstract:
Alzheimer’s disease has become a major public health issue, as indicated by the increasing number of Americans living with it. After decades of extensive research, only seven drugs have been approved by the Food and Drug Administration (FDA) to treat Alzheimer’s disease. Five of these drugs were designed to treat the dementia symptoms, and only two (Aducanumab and Lecanemab) target the progression of the disease, especially the accumulation of amyloid-β plaques. However, controversial comments were raised over the accelerated approvals of both Aducanumab and Lecanemab, especially concerning the safety and side effects of these two drugs. There is still an urgent need for further drug discovery targeting the biological processes involved in the progression of Alzheimer’s disease. Excessive cholesterol has been found to accumulate in the brains of those with Alzheimer’s disease. Cholesterol can be synthesized in both the blood and the brain, but the majority of biosynthesis in the adult brain takes place in astrocytes; the cholesterol is then transported to the neurons via ApoE. The blood-brain barrier separates cholesterol metabolism in the brain from that in the rest of the body. Various proteins contribute to the metabolism of cholesterol in the brain, offering potential targets for Alzheimer’s treatment. In the astrocytes, SREBP cleavage-activating protein (SCAP) binds to Sterol Regulatory Element-binding Protein 2 (SREBP2) in order to transport the complex from the endoplasmic reticulum to the Golgi apparatus. Cholesterol is secreted out of the astrocytes by the ATP-Binding Cassette A1 (ABCA1) transporter. Lipoprotein receptors such as triggering receptor expressed on myeloid cells 2 (TREM2) internalize cholesterol into the microglia, while lipoprotein receptors such as low-density lipoprotein receptor-related protein 1 (LRP1) internalize cholesterol into the neurons.
Cytochrome P450 Family 46 Subfamily A Member 1 (CYP46A1) converts excess cholesterol to 24S-hydroxycholesterol (24S-OHC). Cholesterol has been shown to have a direct effect on the production of amyloid-beta and tau proteins. The addition of cholesterol to the brain promotes the activity of beta-site amyloid precursor protein cleaving enzyme 1 (BACE1), secretase, and amyloid precursor protein (APP), which all aid in amyloid-beta production. The reduction of cholesterol esters in the brain has been found to reduce phosphorylated tau levels in mice. In this work, a computational pipeline was developed to identify the protein targets involved in cholesterol regulation in the brain and then to identify chemical compounds as inhibitors of a selected protein target. Since extensive evidence shows a strong correlation between brain cholesterol regulation and Alzheimer’s disease, a detailed literature review of genes and pathways related to brain cholesterol synthesis and regulation was first conducted. An interaction network was then built for those genes so that the top gene targets could be identified. The involvement of these genes in Alzheimer’s disease progression was discussed, followed by an investigation of existing clinical trials for those targets. A ligand-protein docking program was then developed to screen 1.5 million chemical compounds against the selected protein target. A machine learning program was developed to evaluate and predict the binding interaction between chemical compounds and the protein target. The results from this work pave the way for further drug discovery to regulate brain cholesterol to combat Alzheimer’s disease.
Keywords: Alzheimer’s disease, drug discovery, ligand-protein docking, gene-network analysis, cholesterol regulation
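The network step described above (building an interaction network for the candidate genes and identifying top targets) can be sketched as a simple degree ranking. The gene list and edges below are illustrative placeholders, not the network from the study.

```python
# Hypothetical sketch: rank genes by how many interactions they have
# in a small interaction network. The edge list is a placeholder.
from collections import Counter

edges = [("SREBP2", "SCAP"), ("SREBP2", "ABCA1"), ("SREBP2", "CYP46A1"),
         ("ABCA1", "APOE"), ("APOE", "LRP1"), ("APOE", "TREM2")]

degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# Genes with the most interactions come first
top_targets = [gene for gene, _ in degree.most_common()]
```

In practice such rankings use richer centrality measures and curated databases (e.g., protein-protein interaction resources), but the filtering idea is the same.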
Procedia PDF Downloads 75
1060 Dynamic Analysis and Vibration Response of Thermoplastic Rolling Elements in a Rotor Bearing System
Authors: Nesrine Gaaliche
Abstract:
This study presents a finite element (FE) dynamic model for analyzing the vibration response of a rolling bearing system. The vibration responses of polypropylene bearings with and without defects are studied using FE analysis and compared to experimental data. The viscoelastic behavior of the thermoplastic is investigated to evaluate the influence of material flexibility and damping viscosity. The vibrations are detected using 3D dynamic analysis. According to the test data, peak vibrations are more noticeable for an inner-ring defect than for an outer-ring defect. The performance of thermoplastic bearings is compared to that of metal parts using vibration signals. Both the test and numerical results show that polypropylene bearings exhibit less vibration than their steel counterparts. Unlike bearings made from metal, polypropylene bearings absorb vibrations and tolerate shaft misalignment. Following validation against the overall vibration spectrum data, the Von Mises stresses inside the rings are assessed under high loads. According to the simulation findings, stress is highest directly under the balls. For the test cases, the computational findings correspond closely to the experimental results.
Keywords: viscoelastic, FE analysis, polypropylene, bearings
Procedia PDF Downloads 106
1059 A Study of the Trade-off Energy Consumption-Performance-Schedulability for DVFS Multicore Systems
Authors: Jalil Boudjadar
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) multicore platforms are promising execution platforms that enable high computational performance, lower energy consumption, and flexibility in scheduling system processes. However, the resulting interleaving and memory interference, together with per-core frequency tuning, make real-time guarantees hard to deliver. Moreover, energy consumption is a strong constraint for the deployment of such systems in energy-limited settings. Identifying the system configurations that achieve high performance and consume less energy while guaranteeing system schedulability is a complex task in the design of modern embedded systems. This work studies the trade-off between energy consumption, core utilization, and the memory bottleneck, and their impact on the schedulability of DVFS multicore time-critical systems with a hierarchy of shared memories. We build a model-based framework using the Parametrized Timed Automata of UPPAAL to analyze the mutual impact of performance, energy consumption, and schedulability of DVFS multicore systems, and demonstrate the trade-off on an actual case study.
Keywords: time-critical systems, multicore systems, schedulability analysis, energy consumption, performance analysis
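The energy-performance tension behind DVFS can be sketched with the classical dynamic-power relation E ≈ C·V²·f·t, where t = cycles/f, so lowering the frequency stretches execution time while the V² term cuts energy. This is only a textbook illustration, not the paper's UPPAAL model; the capacitance, workload, and operating points below are made-up assumptions.

```python
# Illustrative DVFS trade-off: dynamic energy E = C * V^2 * cycles
# (frequency cancels out of E but sets the execution time t).
# All values are assumed for illustration.
C = 1e-9          # effective switched capacitance (F), assumed
cycles = 2e9      # task workload in CPU cycles, assumed

# (frequency in Hz, minimum stable voltage in V) - hypothetical points
operating_points = [(0.8e9, 0.90), (1.2e9, 1.00), (2.0e9, 1.25)]

results = []
for f, v in operating_points:
    t = cycles / f                # execution time (s): shorter at high f
    energy = C * v**2 * cycles    # dynamic energy (J): larger at high V
    results.append((f, t, energy))
```

Running the task faster meets tighter deadlines but costs more energy, which is exactly the schedulability-versus-energy trade-off the paper analyzes.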
Procedia PDF Downloads 107
1058 Hydrogen Storage Optimisation: Development of Advanced Tools for Improved Permeability Modelling in Materials
Authors: Sirine Sayed, Mahrez Ait Mohammed, Mourad Nachtane, Abdelwahed Barkaoui, Khalid Bouziane, Mostapha Tarfaoui
Abstract:
This study addresses a critical challenge in the transition to a hydrogen-based economy by introducing and validating a one-dimensional (1D) tool for modelling hydrogen permeability through hybrid materials, with a focus on tank applications. The model integrates rigorous experimental validation, published data, and advanced computational modelling based on the PanDiffusion framework, which significantly enhances its validity and applicability. By elucidating the complex interactions between material properties, storage system configurations, and operational parameters, the tool demonstrates its capability to optimize design and operating parameters in real-world scenarios, as illustrated through a case study of hydrogen leakage. This comprehensive approach to assessing hydrogen permeability helps overcome key barriers in hydrogen infrastructure development, potentially accelerating the widespread adoption of hydrogen technology across industrial sectors and marking a step towards a more sustainable energy future.
Keywords: hydrogen storage, composite tank, permeability modelling, PanDiffusion, energy carrier, transportation technology
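As a hedged sketch of what a 1D permeability model computes (the actual PanDiffusion-based formulation is not reproduced here), the steady-state flux through a tank wall can be approximated by a Fick-type law. The permeability, pressures, and wall thickness below are placeholder values, not parameters from the study.

```python
# Minimal 1D steady-state permeation sketch: J = Phi * (dp) / L.
# Phi (permeability), pressures and thickness are assumed values.
def permeation_flux(phi, p_high, p_low, thickness):
    """Steady-state flux through a wall; units follow the inputs."""
    return phi * (p_high - p_low) / thickness

# e.g. a 70 MPa tank, ~0.1 MPa ambient side, 5 mm liner (all assumed)
flux = permeation_flux(phi=1e-14, p_high=70e6, p_low=0.1e6, thickness=5e-3)
```

A full model adds time dependence, multilayer walls, and temperature-dependent permeability, which is where a dedicated tool like the one described becomes necessary.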
Procedia PDF Downloads 18
1057 Resource Creation Using Natural Language Processing Techniques for Malay Translated Qur'an
Authors: Nor Diana Ahmad, Eric Atwell, Brandon Bennett
Abstract:
Text processing techniques for English have been developed over several decades, but for the Malay language such methods are still far behind. Moreover, only limited resources and tools for computational linguistic analysis are available for Malay. This research therefore applies natural language processing (NLP) to the Malay translation of the Qur’an. As a result, a new language resource for the Malay translated Qur’an was created. This resource will help other researchers build the processing tools needed for the Malay language. The research also develops a simple question-answering prototype to demonstrate the use of the Malay Qur’an resource for text processing. The prototype, developed in Python, pre-processes the Malay Qur’an and an input query using a stemming algorithm and then searches for occurrences of the query word stem. The results show improved matching likelihood between a user query and its answer. A POS-tagging algorithm has also been produced. The stemming and tagging algorithms can be used as tools for research on other Malay texts and can support applications such as information retrieval, question-answering systems, ontology-based search, and other text analysis tasks.
Keywords: language resource, Malay translated Qur'an, natural language processing (NLP), text processing
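The prototype's stem-and-match idea can be sketched as below. Real Malay stemming must handle a much richer affix system (prefixes, suffixes, and circumfixes with morphophonemic rules); the affix lists, length thresholds, and example words here are simplified illustrations, not the algorithm from the study.

```python
# Toy Malay-style stemmer: strip at most one common prefix and one
# common suffix, then count query-stem occurrences in a verse.
PREFIXES = ("meng", "mem", "men", "ber", "di", "ke")
SUFFIXES = ("kan", "an", "nya", "i")

def stem(word):
    w = word.lower()
    for p in PREFIXES:
        if w.startswith(p) and len(w) - len(p) >= 4:
            w = w[len(p):]
            break
    for s in SUFFIXES:
        if w.endswith(s) and len(w) - len(s) >= 4:
            w = w[:-len(s)]
            break
    return w

def match_score(query, verse):
    """Count verse tokens whose stem matches a query-word stem."""
    query_stems = {stem(t) for t in query.split()}
    return sum(1 for t in verse.split() if stem(t) in query_stems)
```

With this sketch, "memberikan" and "diberi" both reduce to the stem "beri", so a query containing one form can match a verse containing the other.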
Procedia PDF Downloads 318
1056 A Novel Way to Create Qudit Quantum Error Correction Codes
Authors: Arun Moorthy
Abstract:
Quantum computing promises algorithmic speedups for a number of tasks; however, as in classical computing, effective error-correcting codes are needed. Current quantum computers require costly equipment to control each particle, so having fewer particles to control is ideal. Although traditional quantum computers are built from qubits (2-level systems), qudits (systems with more than 2 levels) are appealing because they can provide an equivalent computational space using fewer particles, meaning fewer particles need to be controlled. Qudit quantum error-correction codes are currently available for systems of various levels; however, these codes sometimes have overly specific constraints. When building a qudit system, it is important for researchers to have access to many codes that satisfy their requirements. This project addresses two methods for increasing the number of quantum error-correcting codes available to researchers. The first method generates new codes for a given set of parameters. The second method generates new error-correction codes by using existing codes as a starting point to produce codes for another level (e.g., adapting a code between a 2-level and a 5-level system). The project therefore builds a website that researchers can use to generate new error-correction codes or codes based on existing ones.
Keywords: qudit, error correction, quantum, qubit
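Any "generate codes for a given set of parameters" method must at minimum respect the quantum Singleton bound k ≤ n − 2(d − 1), which holds for [[n, k, d]] codes on qudits of any level. A sketch of the parameter-filtering step (the actual code construction, which is the hard part the project addresses, is not shown):

```python
# Enumerate [[n, k, d]] parameter triples allowed by the quantum
# Singleton bound k <= n - 2*(d - 1). This only filters candidate
# parameters; it does not construct the codes themselves.
def singleton_ok(n, k, d):
    return 1 <= k <= n - 2 * (d - 1)

candidates = [(n, k, d)
              for n in range(3, 8)
              for k in range(1, n)
              for d in range(2, n + 1)
              if singleton_ok(n, k, d)]
```

For example, the well-known [[5, 1, 3]] code saturates the bound (1 = 5 − 2·2), while no [[4, 1, 3]] code can exist.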
Procedia PDF Downloads 161
1055 Comparison of Irradiance Decomposition and Energy Production Methods in a Solar Photovoltaic System
Authors: Tisciane Perpetuo e Oliveira, Dante Inga Narvaez, Marcelo Gradella Villalva
Abstract:
Installations of solar photovoltaic systems have increased considerably in the last decade. It has therefore become clear that monitoring meteorological data (solar irradiance, air temperature, wind velocity, etc.) is important for predicting the solar energy production potential of a given geographical area. In this context, the present work compares two computational tools capable of estimating the energy generation of a photovoltaic system through correlation analyses of solar radiation data: the PVsyst software and an algorithm based on the PVlib package implemented in MATLAB. To achieve this objective, it was necessary to obtain solar radiation data (measured and from a solarimetric database), analyze the decomposition of global solar irradiance into direct normal and horizontal diffuse components, and model the devices of a photovoltaic system (solar modules and inverters) for energy production calculations. Simulated results were compared with experimental data to evaluate the performance of the studied methods. Errors in the estimation of energy production were less than 30% for the MATLAB algorithm and less than 20% for the PVsyst software.
Keywords: energy production, meteorological data, irradiance decomposition, solar photovoltaic system
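One widely used way to decompose global horizontal irradiance (GHI) into diffuse (DHI) and direct normal (DNI) components is the Erbs correlation, which PVlib implements as `irradiance.erbs`. A standalone sketch is below; the coefficients are as commonly published for the Erbs model and should be checked against the PVlib source before reuse.

```python
# Erbs-type decomposition: diffuse fraction from the clearness index
# kt, then DNI from the closure GHI = DHI + DNI * cos(zenith).
import math

def erbs_decompose(ghi, kt, zenith_deg):
    if kt <= 0.22:
        diffuse_fraction = 1.0 - 0.09 * kt
    elif kt <= 0.80:
        diffuse_fraction = (0.9511 - 0.1604 * kt + 4.388 * kt**2
                            - 16.638 * kt**3 + 12.336 * kt**4)
    else:
        diffuse_fraction = 0.165
    dhi = diffuse_fraction * ghi
    dni = (ghi - dhi) / math.cos(math.radians(zenith_deg))
    return dhi, dni

# A clear-ish moment: kt = 0.7 gives a small diffuse fraction
dhi, dni = erbs_decompose(ghi=500.0, kt=0.7, zenith_deg=40.0)
```

Differences between decomposition models of this kind are one source of the energy-estimation errors the abstract reports.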
Procedia PDF Downloads 142
1054 Laboratory and Numerical Hydraulic Modelling of Annular Pipe Electrocoagulation Reactors
Authors: Alejandra Martin-Dominguez, Javier Canto-Rios, Velitchko Tzatchkov
Abstract:
Electrocoagulation is a water treatment technology that consists of generating coagulant species in situ by electrolytic oxidation of sacrificial anode materials triggered by electric current. It removes suspended solids, heavy metals, emulsified oils, bacteria, colloidal solids and particles, soluble inorganic pollutants and other contaminants from water, offering an alternative to the addition of metal salts, polymers and polyelectrolytes for breaking stable emulsions and suspensions. The method essentially consists of passing the water being treated through pairs of consumable conductive metal plates in parallel, which act as monopolar electrodes, commonly known as ‘sacrificial electrodes’. Physicochemical, electrochemical and hydraulic processes are involved in the efficiency of this type of treatment. While the physicochemical and electrochemical aspects of the technology have been extensively studied, little is known about the influence of the hydraulics. However, the hydraulic process is fundamental for the reactions that take place at the electrode boundary layers and for the coagulant mixing. Electrocoagulation reactors can be open (with a free water surface) or closed (pressurized). Independently of the type of reactor, hydraulic head loss is an important factor in its design. The present work focuses on the study of the total hydraulic head loss and the flow velocity and pressure distribution in electrocoagulation reactors with single or multiple concentric annular cross sections. An analysis of the head loss produced by hydraulic wall shear friction and by accessories (minor head losses) is presented and compared to the head loss measured on a semi-pilot scale laboratory model at different flow rates through the reactor. The tests included laminar, transitional and turbulent flow.
The observed head loss was also compared to the head loss predicted by several known theoretical and empirical equations specific to flow in concentric annular pipes. Four single concentric annular cross-section reactor configurations and one multiple concentric annular cross-section configuration were studied. The theoretical head loss was higher than that observed in the laboratory model in some of the tests and lower in others, depending also on the value assumed for the wall roughness. Most of the theoretical models assume that the fluid elements in all annular sections have the same velocity and that the flow is steady, uniform and one-dimensional, with the same pressure and velocity profiles in all reactor sections. To check the validity of these assumptions, a computational fluid dynamics (CFD) model of the concentric annular pipe reactor was implemented using the ANSYS Fluent software, demonstrating that the pressure and flow velocity distributions inside the reactor are in fact not uniform. Based on the analysis, the equations that best predict the head loss in single and multiple annular sections were obtained. Other factors that may impact the head loss, such as the generation of coagulants and gases during the electrochemical reaction, the accumulation of hydroxides inside the reactor, and the change of the electrode material with time, are also discussed. The results can be used as tools for the design and scale-up of electrocoagulation reactors, to be integrated into new or existing water treatment plants.
Keywords: electrocoagulation reactors, hydraulic head loss, concentric annular pipes, computational fluid dynamics model
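The classical baseline such comparisons start from is the Darcy-Weisbach equation applied with the annulus hydraulic diameter D_h = D_outer − D_inner. The sketch below uses the circular-pipe laminar friction factor 64/Re (concentric annuli actually deviate from it, one reason measured and predicted losses differ) and the explicit Swamee-Jain formula for turbulent flow; the dimensions and flow rate are illustrative only.

```python
# Darcy-Weisbach head loss for a concentric annulus using the
# hydraulic diameter D_h = D_out - D_in. Laminar: f = 64/Re
# (circular-pipe value); turbulent: Swamee-Jain explicit formula.
import math

def head_loss_annulus(q, d_out, d_in, length,
                      rough=1.5e-6, nu=1e-6, g=9.81):
    area = math.pi / 4 * (d_out**2 - d_in**2)   # annular flow area
    d_h = d_out - d_in                          # hydraulic diameter
    v = q / area                                # mean velocity
    re = v * d_h / nu                           # Reynolds number
    if re < 2300:
        f = 64.0 / re
    else:
        f = 0.25 / math.log10(rough / (3.7 * d_h)
                              + 5.74 / re**0.9) ** 2
    return f * (length / d_h) * v**2 / (2 * g)  # head loss in metres

# Illustrative case: 10 mL/s through a 50/30 mm annulus, 1 m long
h = head_loss_annulus(q=1e-5, d_out=0.05, d_in=0.03, length=1.0)
```

Minor (accessory) losses would be added as extra K·v²/(2g) terms, and the CFD model replaces the uniform-velocity assumption baked into this formula.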
Procedia PDF Downloads 218
1053 Acceleration of Lagrangian and Eulerian Flow Solvers via Graphics Processing Units
Authors: Pooya Niksiar, Ali Ashrafizadeh, Mehrzad Shams, Amir Hossein Madani
Abstract:
There are many computationally demanding applications in science and engineering that need efficient algorithms implemented on high-performance computers. Recently, Graphics Processing Units (GPUs) have drawn much attention compared to traditional CPU-based hardware and have opened up new avenues of improvement in scientific computing. One particular application area is Computational Fluid Dynamics (CFD), in which mature CPU-based codes need to be converted to GPU-based algorithms to take advantage of this new technology. In this paper, numerical solutions of two classes of discrete fluid flow models on both CPU and GPU are discussed and compared. The test problems are an Eulerian model of a two-dimensional incompressible laminar flow and a Lagrangian model of a two-phase flow field. The CUDA programming standard is used to employ an NVIDIA GPU with 480 cores, and a serial C++ code is run on a single core of an Intel quad-core CPU. Speed-ups of up to two orders of magnitude are observed on the GPU for a certain range of grid resolutions or particle numbers. As expected, the Lagrangian formulation is better suited to parallel computation on the GPU, although the Eulerian formulation also shows a significant speed-up.
Keywords: CFD, Eulerian formulation, graphics processing units, Lagrangian formulation
Procedia PDF Downloads 417
1052 Numerical Design and Characterization of MOVPE Grown Nitride Based Semiconductors
Authors: J. Skibinski, P. Caban, T. Wejrzanowski, K. J. Kurzydlowski
Abstract:
In the present study, numerical simulations of the epitaxial growth of gallium nitride in the AIX-200/4RF-S Metal Organic Vapor Phase Epitaxy (MOVPE) reactor are addressed. The aim of this study was to design optimal fluid flow and thermal conditions for obtaining the most homogeneous product. Since many factors influence reactions in the crystal growth area, such as temperature, pressure, gas flow, and reactor geometry, designing an optimal process is difficult. Variations of process pressure and hydrogen mass flow rates have been considered. Because the exact distribution of heat and mass transfer inside the reactor during crystal growth cannot be determined experimentally, detailed 3D modeling has been used to gain insight into the process conditions. Numerical simulations make it possible to understand the epitaxial process by calculating the heat and mass transfer distribution during the growth of gallium nitride. Including chemical reactions in the numerical model allows the growth rate on the substrate to be calculated. The present approach has been applied to enhance the performance of the AIX-200/4RF-S reactor.
Keywords: computational fluid dynamics, finite volume method, epitaxial growth, gallium nitride
Procedia PDF Downloads 454
1051 A Quick Prediction for Shear Behaviour of RC Membrane Elements by Fixed-Angle Softened Truss Model with Tension-Stiffening
Authors: X. Wang, J. S. Kuang
Abstract:
The Fixed-angle Softened Truss Model with Tension-stiffening (FASTMT) performs very well in predicting the shear behaviour of reinforced concrete (RC) membrane elements, especially their post-cracking behaviour. Nevertheless, massive computational work is required because of the multiple transcendental equations involved in the stress-strain relationship. In this paper, an iterative root-finding technique is introduced into FASTMT for quickly solving the transcendental equations of the tension-stiffening effect of RC membrane elements. This fast FASTMT, implemented in MATLAB, uses the bisection method to calculate the tensile stress of the membranes. With this simplification, the elapsed time of each loop is reduced significantly while the transcendental equations are still solved accurately. Owing to its high efficiency and good accuracy compared with the original FASTMT, the fast FASTMT can be applied to the quick prediction of the shear behaviour of complex large-scale RC structures.
Keywords: bisection method, FASTMT, iterative root-finding technique, reinforced concrete membrane
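The speed-up rests on solving each transcendental equation by simple bisection. A generic sketch follows (the actual FASTMT tension-stiffening equations are not reproduced; a textbook transcendental equation, x = cos x, stands in for them):

```python
# Generic bisection root-finder of the kind used to solve each
# transcendental equation inside a stress-strain iteration loop.
import math

def bisect(f, lo, hi, tol=1e-10, max_iter=200):
    """Root of f in [lo, hi], assuming f(lo) and f(hi) differ in sign."""
    flo = f(lo)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if fmid == 0.0 or hi - lo < tol:
            return mid
        if flo * fmid < 0:
            hi = mid            # root lies in the lower half
        else:
            lo, flo = mid, fmid  # root lies in the upper half
    return 0.5 * (lo + hi)

# Stand-in transcendental equation: x - cos(x) = 0
root = bisect(lambda x: x - math.cos(x), 0.0, 1.0)
```

Bisection halves the bracket each iteration, so its cost is fixed and predictable, which is why it keeps the per-loop time of the fast FASTMT low.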
Procedia PDF Downloads 273
1050 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows
Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono
Abstract:
A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of the turbulence modeling and the large number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a large domain is performed with a structured grid, the number of grid points can increase so much that the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The code is written in modern Fortran (the 2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.
Keywords: LES, multi-resolution, ENO, Fortran
Procedia PDF Downloads 366
1049 Whole Body Cooling Hypothermia Treatment Modelling Using a Finite Element Thermoregulation Model
Authors: Ana Beatriz C. G. Silva, Luiz Carlos Wrobel, Fernando Luiz B. Ribeiro
Abstract:
This paper presents a thermoregulation model using the finite element method to perform numerical analyses of brain cooling procedures, as a contribution to the investigation of therapeutic hypothermia after ischemia in adults. Computational methods can help clinicians observe body temperature under different cooling methods without invasive techniques, and can thus be a valuable tool for assisting clinical trials by simulating the different cooling options available for treatment. In this work, we developed an FEM package for the solution of the continuum Pennes bioheat equation. Blood temperature changes were considered using a blood-pool approach and a lumped analysis for the intravascular-catheter method of blood cooling. Analyses are performed on a three-dimensional mesh based on a complex geometry obtained from computed tomography medical images, considering a cooling blanket and an intravascular catheter. The results obtained are compared in terms of the effect of each approach on reducing brain temperature within a required time, maintaining body temperature at moderate hypothermia levels, and achieving gradual rewarming.
Keywords: brain cooling, finite element method, hypothermia treatment, thermoregulation
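For reference, the continuum Pennes bioheat equation solved by such an FEM package has the standard form below, where ρ and c are the tissue density and specific heat, k the thermal conductivity, ω_b the blood perfusion rate, ρ_b and c_b the blood density and specific heat, T_a the arterial blood temperature, and q_m the metabolic heat generation:

```latex
\rho c \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right)
  + \omega_b \rho_b c_b \left( T_a - T \right)
  + q_m
```

The perfusion term ω_b ρ_b c_b (T_a − T) is what couples tissue temperature to blood temperature, which is why the blood-pool and catheter-cooling models feed back into the FEM solution.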
Procedia PDF Downloads 311
1048 Vaccine Development for Newcastle Disease Virus in Poultry
Authors: Muhammad Asif Rasheed
Abstract:
Newcastle disease virus (NDV), an avian orthoavulavirus, is the causative agent of Newcastle disease (ND) and can cause epidemics when the disease is not controlled. Several vaccines based on attenuated and inactivated viruses have previously been reported, but these are rendered ineffective over time by changes in the viral genome. Therefore, we aimed to develop an effective multi-epitope vaccine against the haemagglutinin-neuraminidase (HN) protein of 26 NDV strains from Pakistan through modern immunoinformatic approaches. A vaccine chimaera was constructed by combining T-cell and B-cell epitopes with appropriate linkers and an adjuvant. The designed vaccine was highly immunogenic, non-allergenic, and antigenic; therefore, the potential 3D structure of the multi-epitope vaccine was constructed, refined, and validated. A molecular docking study of the multi-epitope vaccine candidate with the chicken Toll-like receptor 4 indicated successful binding. An in silico immune simulation was used to evaluate the candidate vaccine's ability to elicit an effective immune response. According to the computational studies, the proposed multi-epitope vaccine is physically stable and may induce immune responses, suggesting it as a strong candidate against the 26 Newcastle disease virus strains from Pakistan. A wet-lab study is under way to confirm the results.
Keywords: epitopes, Newcastle disease virus, paramyxovirus, vaccine
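Multi-epitope chimaeras of this kind are typically assembled by concatenating an adjuvant and the selected epitopes with short peptide linkers (EAAAK and GPGPG are common choices in the immunoinformatics literature). A toy illustration follows; the sequences, linker choices, and ordering are placeholders, not the construct from this study.

```python
# Assemble a hypothetical multi-epitope construct:
# adjuvant + rigid EAAAK linker + epitopes joined by GPGPG linkers.
# All amino-acid sequences below are made-up placeholders.
ADJUVANT = "MAKLSTDEL"                               # placeholder
EPITOPES = ["FLGKIWPSH", "YQRLESTQV", "KTFSVLACY"]   # hypothetical

construct = ADJUVANT + "EAAAK" + "GPGPG".join(EPITOPES)
```

The assembled string is then the input to the downstream steps the abstract describes: structure prediction and refinement, docking against the receptor, and immune simulation.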
Procedia PDF Downloads 120
1047 Turbulent Forced Convection of Cu-Water Nanofluid: CFD Models Comparison
Authors: I. Behroyan, P. Ganesan, S. He, S. Sivasankaran
Abstract:
This study compares the predictions of five types of Computational Fluid Dynamics (CFD) models, including two single-phase models (Newtonian and non-Newtonian) and three two-phase models (Eulerian-Eulerian, mixture, and Eulerian-Lagrangian), for turbulent forced convection of a Cu-water nanofluid in a tube with a constant heat flux on the tube wall. The Reynolds (Re) number of the flow is between 10,000 and 25,000, and the volume fraction of Cu particles is in the range of 0 to 2%. The commercial CFD package ANSYS Fluent is used. The results from the CFD models are compared with experimental results from the literature. According to this study, the non-Newtonian single-phase model does not, in general, agree well with the Xuan and Li correlation in the prediction of the Nu number. The Eulerian-Eulerian model gives inaccurate results except for φ = 0.5%. The mixture model gives a maximum error of 15%. Overall, the Newtonian single-phase model and the Eulerian-Lagrangian model are the recommended models. This work can be used as a reference for selecting an appropriate model in future investigations. The study also gives insight into important factors, such as Brownian motion, fluid behavior parameters, and effective nanoparticle conductivity, which should be considered or varied by each model.
Keywords: heat transfer, nanofluid, single-phase models, two-phase models
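Single-phase nanofluid models need effective properties; a classical choice for the effective conductivity mentioned above is the Maxwell model, sketched here with commonly quoted property values for copper and water (illustrative, not the values used in the paper):

```python
# Maxwell model for the effective thermal conductivity of a dilute
# suspension of spherical particles: k_f is the base-fluid
# conductivity, k_p the particle conductivity, phi the volume fraction.
def k_maxwell(k_f, k_p, phi):
    num = k_p + 2 * k_f + 2 * phi * (k_p - k_f)
    den = k_p + 2 * k_f - phi * (k_p - k_f)
    return k_f * num / den

# Roughly k_f ~ 0.6 W/(m K) for water, k_p ~ 400 W/(m K) for Cu,
# at the study's maximum loading of 2% (values are illustrative)
k_eff = k_maxwell(k_f=0.6, k_p=400.0, phi=0.02)
```

Note that the Maxwell model ignores Brownian motion, one of the factors the study flags as important, which is part of why the various CFD models disagree.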
Procedia PDF Downloads 484