Search results for: artificial intelligence and genetic algorithms
3077 Using Serious Games to Integrate the Potential of Mass Customization into the Fuzzy Front-End of New Product Development
Authors: Michael N. O'Sullivan, Con Sheahan
Abstract:
Mass customization is the idea of offering custom products or services to satisfy the needs of each individual customer while maintaining the efficiency of mass production. Technologies like 3D printing and artificial intelligence have many start-ups hoping to capitalize on this dream of creating personalized products at an affordable price, and well-established companies scrambling to innovate and maintain their market share. However, the majority of them are failing as they struggle to understand one key question: where does customization make sense? Customization and personalization only make sense where the value of the perceived benefit outweighs the cost to implement it. In other words, will people pay for it? Looking at the Kano Model makes it clear that it depends on the product. In products where customization is an inherent need, like prosthetics, mass customization technologies can be highly beneficial. However, for products that already sell as a standard, like headphones, offering customization is likely only an added bonus, and so the product development team must figure out whether the customers’ perception of the added value of this feature will outweigh its premium price tag. This can be done through the use of a ‘serious game,’ whereby potential customers are given a limited budget to collaboratively buy and bid on potential features of the product before it is developed. If the group chooses to buy customization over other features, then the product development team should implement it into their design. If not, the team should prioritize the features on which the customers have spent their budget. The level of customization purchased can also be translated to an appropriate production method; for example, the most expensive type of customization would likely be free-form design and could be achieved through digital fabrication, while a lower level could be achieved through short-batch production. Twenty-five teams of final-year students from design, engineering, construction and technology tested this methodology when bringing a product from concept through to production specification, and found that it allowed them to confidently decide what level of customization, if any, would be worth offering for their product, and what would be the best method of producing it. They also found that the discussion and negotiations between players during the game led to invaluable insights, and the teams often decided to play a second game in which they offered customers the option to buy the various customization ideas that had been discussed during the first game.
Keywords: Kano model, mass customization, new product development, serious game
Procedia PDF Downloads 134
3076 White Clover Trifolium repens L. Genetic Diversity and Salt Tolerance in Urban Area of Riga
Authors: Dace Grauda, Gunta Cekstere, Inta Belogrudova, Andis Karlsons, Isaak Rashal
Abstract:
Trifolium repens L. (white or Dutch clover) is a perennial herb of the legume family (Leguminosae Juss.) that spreads extensively by stolons and seeds. The species is cultivated worldwide and has been naturalized in many countries in meadows, yards, gardens, along roads and streets, etc., especially in temperate regions. It is also widespread in grasslands throughout Riga, the capital of Latvia. The goal of this study was to investigate the genetic structure of the white clover population in Riga and to evaluate the influence of different salt concentrations on plants. For this purpose the universal retrotransposon-based IRAP (Inter-Retrotransposon Amplified Polymorphism) method was used. The plant material was collected in different regions of Riga and in several urban areas of Latvia. Plant DNA was isolated from silica gel-dried leaves using a 1% CTAB (cetyltrimethylammonium bromide) buffer DNA extraction procedure. The genetic structure of the city population and of wild populations was compared. Soil salinization is an important issue associated with low water resources and highly urbanized areas in arid and semi-arid climate conditions, as well as with de-icing salt application to prevent ice formation on roads in winter. The T. repens variety ‘Daile’ (form giganteum), one of the often-used components of urban greeneries, was studied in this investigation. Plants were grown from seeds and cultivated under light conditions (18-25 °C, 16 h/8 h day/night, light intensity 3000 lx) in plastic pots (200 ml) filled with commercial neutralized (pH 5.9 ± 0.3) peat substrate with mineral nutrients. To analyse the impact of increased soil salinity, treatments with gradually rising NaCl levels (0, 20, 40, 60, 80, 100 mM) were arranged. Plants were watered when necessary with deionised water to maintain an optimum substrate moisture of 60-70%. The experiment was terminated six weeks after establishment. For the analysis of mineral nutrients, dry plant material (above-ground parts and roots) was used. A decrease of Na content can be significant under elevated salinity up to 20 mM NaCl. High NaCl concentrations in the substrate increase Na, Cl, Cu, Fe, and Mn accumulation, but reduce S, Mg, and K content in the plant above-ground parts. Abiotic stresses generally change the levels of DNA methylation. Several candidate genes for salt tolerance will be analysed for DNA methylation levels using the PyroMark Q24 Advanced.
Keywords: DNA methylation, IRAP, soil salinization, white clover
Procedia PDF Downloads 364
3075 Describing the Fine Electronic Structure and Predicting Properties of Materials with ATOMIC MATTERS Computation System
Authors: Rafal Michalski, Jakub Zygadlo
Abstract:
We present the concept, scientific methods, and algorithms of our computation system called ATOMIC MATTERS. This is the first presentation of the new computer package, which allows its user to describe the physical properties of atomic localized electron systems subject to electromagnetic interactions. Our solution applies to situations where an unclosed electron 2p/3p/3d/4d/5d/4f/5f subshell interacts with an electrostatic potential of definable symmetry and an external magnetic field. Our methods are based on the Crystal Electric Field (CEF) approach, which takes into consideration the electrostatic ligand field as well as the magnetic Zeeman effect. The application allowed us to predict macroscopic properties of materials, such as magnetic, spectral, and calorimetric properties, as a result of the physical properties of their fine electronic structure. We emphasize the importance of the symmetry of the charge surroundings of the atom/ion, spin-orbit interactions (spin-orbit coupling), and the use of complex-number matrices in the definition of the Hamiltonian. The calculation methods, algorithms, and convention recalculation tools collected in ATOMIC MATTERS were chosen to permit the prediction of magnetic and spectral properties of materials in isostructural series.
Keywords: atomic matters, crystal electric field (CEF), spin-orbit coupling, localized states, electron subshell, fine electronic structure
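To illustrate the kind of computation the abstract describes (a complex-Hermitian Hamiltonian combining a crystal-field term with a Zeeman term), here is a minimal NumPy sketch, not part of ATOMIC MATTERS: it builds angular momentum operators for an assumed J = 4 manifold, adds an illustrative axial Stevens term B20·O20 and a Zeeman term for a field along y, and diagonalizes the result. The values of J, B20, gJ and the field are arbitrary assumptions.
```python
import numpy as np

def angular_momentum_ops(J):
    """Return Jz, J+, J- matrices in the |J, m> basis (m = J, J-1, ..., -J)."""
    m = np.arange(J, -J - 1, -1)
    dim = len(m)
    Jz = np.diag(m).astype(complex)
    Jp = np.zeros((dim, dim), dtype=complex)
    for i in range(1, dim):
        # <J, m+1| J+ |J, m> = sqrt(J(J+1) - m(m+1))
        Jp[i - 1, i] = np.sqrt(J * (J + 1) - m[i] * (m[i] + 1))
    return Jz, Jp, Jp.conj().T

# Illustrative parameters (not taken from the paper): J = 4 manifold,
# axial CEF coefficient B20 in meV, magnetic field along the y axis in tesla.
J, B20, gJ, B_field = 4.0, 0.05, 1.25, 2.0
muB = 5.7883818060e-2  # Bohr magneton in meV/T

Jz, Jp, Jm = angular_momentum_ops(J)
Jy = (Jp - Jm) / (2j)                      # complex matrix -> complex Hamiltonian
O20 = 3 * Jz @ Jz - J * (J + 1) * np.eye(int(2 * J + 1))  # Stevens operator O_2^0

H = B20 * O20 + gJ * muB * B_field * Jy    # CEF term + Zeeman term
energies, states = np.linalg.eigh(H)       # Hermitian diagonalization
print("Energy levels (meV):", np.round(energies - energies.min(), 4))
```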
Procedia PDF Downloads 320
3074 The Effect of Artificial Intelligence on Petroleum Industry and Production
Authors: Mina Shokry Hanna Saleh Tadros
Abstract:
The centrality of the petroleum industry in world energy is undoubted. The world economy almost runs on, and depends on, petroleum. The petroleum industry is a multi-trillion-dollar industry; it turns otherwise poor and underdeveloped countries into wealthy nations and thrusts them to the center of international diplomacy. Although these developing nations lack the necessary technology to explore and exploit petroleum resources, they are not without help, as developed nations, represented by their multinational corporations, are ready and willing to provide both the technical and managerial expertise necessary for the development of this natural resource. However, the exploration of these petroleum resources comes with, sometimes, grave concomitant consequences. These consequences are especially pronounced with respect to the environment. From the British Petroleum oil rig explosion and the resultant oil spillage and pollution in the Gulf of Mexico, United States, to the Mobil oil spillage along the Egyptian coast, the story and consequences are virtually the same. Egypt’s Delta Region produces Nigeria’s petroleum, which accounts for more than ninety-five percent of Nigeria’s foreign exchange earnings. Between 1999 and 2007, Egypt earned more than $400 billion from petroleum exports. Nevertheless, petroleum exploration and exploitation have devastated the Delta environment: from oil spillage, which pollutes the rivers, farms and wetlands, to gas flaring by the multinational corporations, the consequence is similar, a region that has been devastated by petroleum exploitation. This paper thus seeks to examine the consequences and impact of petroleum pollution in the Egypt Delta, with particular reference to the right of the people of the Niger Delta to a healthy environment. The paper further seeks to examine the relevant international and regional instruments and Nigeria’s municipal laws that are meant to protect the rights of the people of the Egypt Delta, and their enforcement by the Nigerian state. It is quite worrisome that the Egypt Delta Region and its people have suffered and are still suffering grave violations of their right to a healthy environment as a result of petroleum exploitation in their region. The Egyptian effort at best is half-hearted in its protection of the people’s rights.
Keywords: crude oil, fire, floating roof tank, lightning protection system, environment, exploration, petroleum, pollution, Duvernay petroleum system, oil generation, oil-source correlation, Re-Os
Procedia PDF Downloads 80
3073 Proposed Framework based on Classification of Vertical Handover Decision Strategies in Heterogeneous Wireless Networks
Authors: Shidrokh Goudarzi, Wan Haslina Hassan
Abstract:
Heterogeneous wireless networks are converging towards an all-IP network as part of the so-called next-generation network. In this paradigm, different access technologies need to be interconnected; thus, vertical handovers, or vertical handoffs, are necessary for seamless mobility. In this paper, we conduct a review of existing vertical handover decision-making mechanisms that aim to provide ubiquitous connectivity to mobile users. To offer a systematic comparison, we categorize these vertical handover measurement and decision structures based on their respective methodology and parameters. Subsequently, we analyze several vertical handover approaches in the literature and compare them according to their advantages and weaknesses. The paper compares the algorithms based on their network selection methods, the complexity of the technologies used, and their efficiency, in order to introduce our vertical handover decision framework. We find that vertical handovers on heterogeneous wireless networks suffer from the lack of a standard and efficient method to satisfy both user and network quality-of-service requirements at different levels, including the architectural, decision-making and protocol levels. Also, the consolidation of the network terminal, cross-layer information, multi-packet casting and an intelligent network selection algorithm appears to be an optimum solution for achieving seamless service continuity and connectivity.
Keywords: heterogeneous wireless networks, vertical handovers, vertical handover metric, decision-making algorithms
Procedia PDF Downloads 393
3072 Optimal Hybrid Linear and Nonlinear Control for a Quadcopter Drone
Authors: Xinhuang Wu, Yousef Sardahi
Abstract:
A hybrid and optimal multi-loop control structure combining linear and nonlinear control algorithms is introduced in this paper to regulate the position of a quadcopter unmanned aerial vehicle (UAV) driven by four brushless DC motors. To this end, a nonlinear mathematical model of the UAV is derived and then linearized around one of its operating points. Using the nonlinear version of the model, a sliding mode controller is used to derive the control laws of the motor thrust forces required to drive the UAV to a certain position. The linear model is used to design two controllers, the XG-controller and the YG-controller, responsible for calculating the roll and pitch required to maneuver the vehicle to the desired X and Y position. Three attitude controllers are designed to calculate the desired angular rates of the rotors, assuming that the Euler angles are minimal. After that, a many-objective optimization problem involving 20 design parameters and ten objective functions is formulated and solved by HypE (hypervolume estimation algorithm), one of the widely used many-objective optimization approaches. Both stability and performance constraints are imposed on the optimization problem. The optimization results, in terms of Pareto sets and fronts, show that some of the design objectives are competing; that is, when one objective goes down, the other goes up. Also, numerical simulations conducted on the nonlinear UAV model show that the proposed optimization method is quite effective.
Keywords: optimal control, many-objective optimization, sliding mode control, linear control, cascade controllers, UAV, drones
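The sliding mode idea mentioned above can be sketched on a single simplified channel. The snippet below (an illustrative sketch, not the controller designed in the paper) applies a classical sliding mode law with a smoothed switching term to an altitude channel modeled as a double integrator; the mass, gains and target altitude are assumed values.
```python
import numpy as np

# Simplified altitude channel of a quadcopter modeled as a double integrator:
# z_ddot = u/m - g. A sliding surface s = e_dot + lam*e drives the error to zero.
m, g = 1.2, 9.81          # assumed mass (kg) and gravity (m/s^2)
lam, k = 2.0, 8.0         # assumed sliding-surface slope and switching gain
dt, z_ref = 0.002, 5.0    # time step (s) and target altitude (m)

z, z_dot = 0.0, 0.0
for step in range(int(10.0 / dt)):        # simulate 10 seconds
    e, e_dot = z - z_ref, z_dot           # tracking error and its rate
    s = e_dot + lam * e                   # sliding surface
    u = m * (g - lam * e_dot - k * np.tanh(s / 0.1))  # tanh smooths the sign() term
    z_dot += (u / m - g) * dt             # integrate the dynamics (Euler)
    z += z_dot * dt

print(f"altitude after 10 s: {z:.3f} m (target {z_ref} m)")
```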
Procedia PDF Downloads 73
3071 Application of ANN for Estimation of Power Demand of Villages in Sulaymaniyah Governorate
Abstract:
Before designing an electrical system, the estimation of load is necessary for unit sizing and demand-generation balancing. The system could be a stand-alone system for a village, a grid-connected system, or renewable energy integrated into a grid connection, which is especially relevant as there are non-electrified villages in developing countries. In the classical model, the energy demand is found by multiplying the household appliances by their ratings and the duration of their operation, but in this paper, information available for electrified villages is used to predict the demand, as villages tend to have a similar lifestyle. This paper describes a method used to predict the average energy consumed every two months by every consumer living in a village using an Artificial Neural Network (ANN). The input data were collected through a regional survey of samples of consumers representing typical types of living, household appliances and energy consumption, and the output data were collected from the administration office of Piramagrun for each corresponding consumer. The results of this study show that the average demand for different consumers from four villages in different months throughout the year is approximately 12 kWh/day. The model estimates the average demand per day for every consumer with a mean absolute percent error of 11.8%. The MathWorks software package MATLAB version 7.6.0, which contains the Neural Network Toolbox, was used.
Keywords: artificial neural network, load estimation, regional survey, rural electrification
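A minimal sketch of the same idea using scikit-learn rather than the MATLAB Neural Network Toolbox used in the paper: a small multilayer perceptron is trained on hypothetical survey features (household size, appliance count, income class) to predict bi-monthly consumption, and the mean absolute percent error is reported. All data and feature names here are synthetic assumptions.
```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300  # hypothetical number of surveyed consumers

# Synthetic survey features: household size, number of appliances, income class.
X = np.column_stack([
    rng.integers(2, 10, n),        # household size
    rng.integers(3, 20, n),        # appliance count
    rng.integers(1, 4, n),         # income class
])
# Synthetic bi-monthly energy consumption in kWh (illustrative relationship + noise).
y = 40 * X[:, 0] + 15 * X[:, 1] + 60 * X[:, 2] + rng.normal(0, 40, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)

mape = np.mean(np.abs((model.predict(X_te) - y_te) / y_te)) * 100
print(f"mean absolute percent error: {mape:.1f}%")
```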
Procedia PDF Downloads 123
3070 The Impact of Artificial Intelligence on Legislations and Laws
Authors: Keroles Akram Saed Ghatas
Abstract:
The near future will bring significant changes in modern organizations and management due to the growing role of intangible assets and knowledge workers. The area of copyright, intellectual property, digital (intangible) assets and media redistribution appears to be one of the greatest challenges facing business and society in general, and management sciences and organizations in particular. The proposed article examines the views and perceptions of fairness in digital media sharing among Harvard Law School's LL.M. students, based on 50 qualitative interviews and 100 surveys. The researcher took an ethnographic approach to the research and joined the Harvard LL.M. 2016 Facebook group, which allows people to connect naturally and attend in-person and private events more easily. After listening to numerous students, the researcher conducted a quantitative survey among 100 respondents to assess their perceptions of fairness in digital file sharing in various contexts (based on the media's price, its availability, regional licenses, copyright holder status, etc.). Based on the survey results, the researcher conducted long-term, open-ended and loosely structured ethnographic interviews (50 interviews) to further deepen the understanding of the results. The most important finding of the study is that Harvard lawyers generally support digital piracy in certain contexts, despite having the best possible legal and professional knowledge. Interestingly, they are also more accepting of working for the government than for the private sector. The results of this study provide a better understanding of how “fairness” is perceived by the younger generation of lawyers and pave the way for a more rational application of licensing laws.
Keywords: cognitive impairments, communication disorders, death penalty, executive function, cognitive disorders, capital murder, Egyptian law, absence, justice, political cases, piracy, digital sharing, perception of fairness, legal profession
Procedia PDF Downloads 64
3069 Linguistic Cyberbullying, a Legislative Approach
Authors: Simona Maria Ignat
Abstract:
Bullying online has been an increasingly studied topic during the last years. Different approaches, psychological, linguistic, or computational, have been applied. To our best knowledge, a definition and a set of characteristics of the phenomenon agreed internationally as a common framework are still lacking. Thus, the objectives of this paper are the identification of bullying utterances on Twitter and of their algorithms. This research paper is focused on the identification of words or groups of words, categorized as “utterances”, with bullying effect, from the Twitter platform, extracted on the basis of a set of legislative criteria. This set is the result of analysis followed by synthesis of law documents on (online) bullying from the United States of America, the European Union, and Ireland. The outcome is a linguistic corpus with approximately 10,000 entries. The methods applied to the first objective have been the following. Discourse analysis has been applied to the identification of keywords with bullying effect in texts from the Google search engine, Images link. Transcription and anonymization have been applied to texts grouped in CL1 (Corpus Linguistics 1). The keyword search method and the legislative criteria have been used for identifying bullying utterances on Twitter. The texts with at least 30 representations on Twitter have been grouped. They form the second linguistic corpus, Bullying Utterances from Twitter (CL2). The entries have been identified by using the legislative criteria on the BoW (bag-of-words) method principle. The BoW is a method of extracting words or groups of words with the same meaning in any context. The method applied for reaching the second objective is the conversion of parts of speech to alphabetical and numerical symbols and the writing of the bullying utterances as algorithms. The converted form of parts of speech has been chosen on the criterion of relevance within the bullying message. The inductive reasoning approach has been applied in sampling and identifying the algorithms. The results are groups with interchangeable elements. The outcomes convey two aspects of bullying: the form and the content or meaning. The form conveys the intentional intimidation against somebody, expressed at the level of texts by grammatical and lexical marks. This outcome has applicability in forensic linguistics for establishing the intentionality of an action. Another outcome of form is a complex of graphemic variations essential in detecting harmful texts online. This research enriches the lexicon already known on the topic. The second aspect, the content, revealed topics like threat, harassment, assault, or suicide. They are subcategories of a broader harmful content which is a constant concern for task forces and legislators at national and international levels. These topic outcomes of the dataset are a valuable source of detection. The analysis of content revealed algorithms and lexicons which could be applied to other harmful content. A third outcome of the content analysis concerns stylistics, which is a rich source for discourse analysis of social media platforms. In conclusion, this linguistic corpus is structured on legislative criteria and could be used in various fields.
Keywords: corpus linguistics, cyberbullying, legislation, natural language processing, twitter
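A minimal sketch of the keyword-search step on the BoW principle described above, assuming a toy keyword list distilled from legislative criteria and a few made-up example texts; it flags utterances by bag-of-words matching and counts recurrences (the paper keeps utterances with at least 30 representations).
```python
from collections import Counter
import re

# Hypothetical keyword list distilled from legislative criteria (illustrative only).
BULLYING_KEYWORDS = {"threat", "harass", "hurt", "kill", "loser", "worthless"}

def bag_of_words(text):
    """Lowercase, strip punctuation and return the set of tokens (BoW principle)."""
    return set(re.findall(r"[a-z']+", text.lower()))

def flag_bullying(tweets):
    """Return a counter of flagged utterances keyed by their matched keywords."""
    flagged = Counter()
    for tweet in tweets:
        hits = bag_of_words(tweet) & BULLYING_KEYWORDS
        if hits:
            flagged[frozenset(hits)] += 1
    return flagged

sample = [
    "you are such a loser, nobody likes you",
    "great game last night!",
    "stop or I will hurt you, that's a threat",
]
for keywords, count in flag_bullying(sample).items():
    print(sorted(keywords), "->", count, "occurrence(s)")
```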
Procedia PDF Downloads 86
3068 Talking Back to Hollywood: Museum Representation in Popular Culture as a Gateway to Understanding Public Perception
Authors: Jessica BrodeFrank, Beka Bryer, Lacey Wilson, Sierra Van Ryck deGroot
Abstract:
Museums are enjoying quite the moment in pop culture. From discussions of labor in Bob’s Burgers to the introduction of cultural repatriation in Black Panther, discussions of various museum issues are making their way into popular media. “Talking Back to Hollywood” analyzes the impact museums have on movies and television. The paper will highlight a series of cultural cameos and discuss what each reveals about critical themes in museums: repatriation, labor, obfuscated histories, institutional legacies, artificial intelligence, and holograms. Using a mixed-methods approach that includes surveys, descriptive research, thematic analysis, and context analysis, the authors of this paper will explore how we, as museum staff, might begin to cite museums and movies together as texts. Drawing from their experience working in museums and public history, this contingent of mid-career professionals will highlight the impact museums have had on movies and television and the didactic lessons these portrayals can provide back to cultural heritage professionals. From tackling critical themes in museums such as repatriation, labor conditions and inequities, obfuscated histories, curatorial choice and control, institutional legacies, and more, this paper is grounded in the cultural zeitgeist of the 2000s and the message these media portrayals send to the public and the cultural heritage sector. In particular, the paper will examine how portrayals of AI, holograms, and other technology can be used as entry points for necessary discussions with the public on mistrust, misinformation, and emerging technologies. This paper will not only expose the legacy and cultural understanding of the museum field within popular culture but will also discuss actionable ways that public historians can use these portrayals as an entry point for discussions with the public, citing literature reviews and quantitative and qualitative analysis of survey results. As Hollywood is talking about museums, museums can use that to better connect to the audiences who feel comfortable at the cinema but are excluded from the museum.
Keywords: museums, public memory, representation, popular culture
Procedia PDF Downloads 83
3067 Morphological and Molecular Abnormalities of the Skeletal Muscle Tissue from Pediatric Patient Affected by a Rare Genetic Chaperonopathy Associated with Motor Neuropathy
Authors: Leila Noori, Rosario Barone, Francesca Rappa, Antonella Marino Gammazza, Alessandra Maria Vitale, Giuseppe Donato Mangano, Giusy Sentiero, Filippo Macaluso, Kathryn H. Myburgh, Francesco Cappello, Federica Scalia
Abstract:
The neuromuscular system controls, directs, and allows movement of the body through the action of neural circuits, which include motor neurons, sensory neurons, and skeletal muscle fibers. Protein homeostasis of the cytotypes involved appears crucial to maintaining the correct and prolonged function of the neuromuscular system, and both neuronal cells and skeletal muscle fibers express significant quantities of protein chaperones, the molecular machinery responsible for maintaining protein turnover. Genetic mutations or defective post-translational modifications of molecular chaperones (i.e., genetic or acquired chaperonopathies) may lead to neuromuscular disorders known as neurochaperonopathies. The limited knowledge of the effects of defective chaperones on skeletal muscle fibers and neurons impedes the progression of therapeutic approaches. A distinct genetic variation of the CCT5 gene, encoding subunit 5 of the chaperonin CCT (Chaperonin Containing TCP1; also known as TRiC, TCP1 Ring Complex), was recently described by our team in association with severe distal motor neuropathy. In this study, we investigated the histopathological abnormalities of the skeletal muscle biopsy of the pediatric patient affected by the mutation Leu224Val in the CCT5 subunit. We provide molecular and structural features of the diseased skeletal muscle tissue that we believe may be useful to identify undiagnosed cases of this rare genetic disorder. We investigated the histological abnormalities of the affected tissue via hematoxylin and eosin staining. We then used immunofluorescence and qPCR techniques to explore the expression and distribution of CCT5 in diseased and healthy skeletal muscle tissue. Immunofluorescence and immunohistochemistry assays were performed to study the sarcomeric and structural proteins of skeletal muscle, including actin, myosin, tubulin, troponin-T, telethonin, and titin. We performed Western blotting to examine the protein expression of CCT5 and of several heat shock proteins, Hsp90, Hsp60, Hsp27, and α-B crystallin, along with the main client proteins of CCT5, actin and tubulin. Our findings revealed muscular atrophy, abnormal morphology, and different sizes of muscle fibers in the affected tissue. Swollen nuclei and wide interfiber spaces were seen. Expression of CCT5 was decreased and showed a different distribution pattern in the affected tissue. Altered expression, distribution, and banding patterns were detected by confocal microscopy for the muscular proteins of interest in tissue from the patient compared to the healthy control. Protein levels of the studied Hsps normally located at the Z-disk were reduced. Western blot results showed increased levels of the actin and tubulin proteins in the diseased skeletal muscle biopsy compared to healthy tissue. Chaperones must be expressed at high levels in skeletal muscle to counteract various stressors such as mechanical, oxidative, and thermal crises; therefore, it seems relevant that defects of molecular chaperones may result in damaged skeletal muscle fibers. So far, several chaperones or cochaperones involved in neuromuscular disorders have been defined. Our study shows that alteration of the CCT5 subunit is associated with damaged structure of skeletal muscle fibers and alterations of chaperone system components, and it paves the way to exploring possible alternative substrates of chaperonin CCT. However, further studies are underway to investigate the CCT mechanisms of action in order to design applicable therapeutic strategies.
Keywords: molecular chaperones, neurochaperonopathy, neuromuscular system, protein homeostasis
Procedia PDF Downloads 71
3066 Improvements in Double Q-Learning for Anomalous Radiation Source Searching
Authors: Bo-Bin Xiao, Chia-Yi Liu
Abstract:
In the task of searching for anomalous radiation sources, personnel holding radiation detectors may be exposed to unnecessary radiation risk, so automated search using machines becomes a necessary project. This research uses several sophisticated deep reinforcement learning algorithms, namely double Q-learning, the dueling network, and NoisyNet, to search for radiation sources. The simulation environment, a 10*10 grid with one shielding wall set in it, supports the development of the AI model by training 1 million episodes. In each training episode, the radiation source position, the radiation source intensity, the agent position, the shielding wall position, and the shielding wall length are all set randomly. The three algorithms are applied to train the AI model in four environments in which the training shielding wall is a full-shielding wall, a lead wall, a concrete wall, or a lead wall or concrete wall appearing randomly. The 12 best-performing AI models are selected by observing the reward value during the training period and are evaluated by comparing them with a gradient search algorithm. The results show that the performance of the AI models, no matter which algorithm is used, is far better than that of the gradient search algorithm. In addition, as the simulation environment becomes more complex, the AI model that combines double DQN with the dueling and NoisyNet algorithms performs better.
Keywords: double Q learning, dueling network, NoisyNet, source searching
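A minimal tabular sketch of the double Q-learning update at the core of the approach (the paper uses deep networks with dueling and NoisyNet layers, which are not reproduced here); the grid size, rewards interface and hyperparameters are illustrative assumptions, and env_step is a hypothetical environment function.
```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 100, 4          # 10*10 grid flattened, 4 moves
alpha, gamma, eps = 0.1, 0.95, 0.1    # illustrative hyperparameters

Q1 = np.zeros((n_states, n_actions))
Q2 = np.zeros((n_states, n_actions))

def double_q_update(s, a, r, s_next, done):
    """One double Q-learning step: one table selects the action, the other evaluates it."""
    if rng.random() < 0.5:
        a_star = int(np.argmax(Q1[s_next]))
        target = r + (0 if done else gamma * Q2[s_next, a_star])
        Q1[s, a] += alpha * (target - Q1[s, a])
    else:
        a_star = int(np.argmax(Q2[s_next]))
        target = r + (0 if done else gamma * Q1[s_next, a_star])
        Q2[s, a] += alpha * (target - Q2[s, a])

def act(s):
    """Epsilon-greedy action selection on the averaged tables."""
    if rng.random() < eps:
        return int(rng.integers(n_actions))
    return int(np.argmax(Q1[s] + Q2[s]))

# Usage with a hypothetical environment step function env_step(s, a) -> (s_next, r, done):
# a = act(s); s_next, r, done = env_step(s, a); double_q_update(s, a, r, s_next, done)
```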
Procedia PDF Downloads 113
3065 Creating Energy Sustainability in an Enterprise
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
As we enter the new era of artificial intelligence (AI) and cloud computing, we rely on the machine learning and natural language processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these industry sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and on slowing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is more focus on and growing demand for renewable energy. With this growing demand, there is also a growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In our paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models and emerging social media and AI platforms, that influence the three main pillars of sustainability (3 P's). Through this paper, we would like to bring an overall perspective on enterprise strategies, with a primary focus on bringing about the cultural shifts needed to adopt energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with a vision for a sustainable environment.
Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure
Procedia PDF Downloads 111
3064 Pre-Service Mathematics Teachers’ Mental Construction in Solving Equations and Inequalities Using ACE Teaching Cycle
Authors: Abera Kotu, Girma Tesema, Mitiku Tadesse
Abstract:
This study investigated ACE-supported instruction and pre-service mathematics teachers’ mental construction in solving equations and inequalities. A mixed approach with a concurrent parallel design was employed. The study was conducted on two intact groups of regular first-year pre-service mathematics teachers at Fiche College of Teachers’ Education, in which one group was assigned as an intervention group and the other as a comparison group using the lottery method. There were 33 participants in the intervention group and 32 participants in the comparison group. Six pre-service mathematics teachers were selected for interview using purposive sampling based on pre-test results. Instruction supported with the ACE cycle was given to the intervention group for two weeks. Written tasks, interviews, and observations were used to collect data. Data collected from written tasks were analyzed quantitatively using an independent samples t-test and effect size. Data collected from interviews and observations were analyzed narratively. The findings of the study uncovered that ACE-supported instruction has a moderate effect on pre-service mathematics teachers’ levels of conceptualization of action, process, object, and schema. Moreover, the ACE-supported group outscored and performed better than the group supported by the usual traditional method across the levels of conceptualization. The majority of pre-service mathematics teachers’ levels of conceptualization were at the action and process levels, and their levels of conceptualization were linked with genetic decomposition more at the action and object levels than at the object and schema levels. The use of ACE-supported instruction is recommended to improve pre-service mathematics teachers’ mental construction.
Keywords: ACE teaching cycle, APOS theory, mental construction, genetic decomposition
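A minimal sketch of the quantitative comparison described above (an independent samples t-test plus Cohen's d as an effect size), using SciPy on hypothetical post-test scores; the numbers are placeholders, not the study's data.
```python
import numpy as np
from scipy import stats

# Hypothetical post-test scores (the real data come from the two intact groups).
intervention = np.array([68, 72, 75, 80, 64, 77, 81, 70, 74, 79, 66, 73])
comparison   = np.array([61, 65, 58, 70, 62, 67, 60, 66, 59, 64, 63, 68])

t_stat, p_value = stats.ttest_ind(intervention, comparison, equal_var=False)

# Cohen's d as a simple effect-size measure (pooled standard deviation).
pooled_sd = np.sqrt((intervention.var(ddof=1) + comparison.var(ddof=1)) / 2)
cohens_d = (intervention.mean() - comparison.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")
```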
Procedia PDF Downloads 17
3063 Climate Changes Impact on Artificial Wetlands
Authors: Carla Idely Palencia-Aguilar
Abstract:
Artificial wetlands play an important role in Guasca Municipality in Colombia, not only because they are used for agroindustry, but also because more than 45 species were found there, some of which are endemic or migratory birds. Remote sensing was used to determine the changes in the area occupied by water in the artificial wetlands by means of ASTER and MODIS images for different time periods. Evapotranspiration was also determined by three methods: the Surface Energy Balance System (SEBS, Su) algorithm, the Surface Energy Balance Algorithm for Land (SEBAL, Bastiaanssen), and the FAO potential evapotranspiration method. Empirical equations were also developed to determine the relationship between the Normalized Difference Vegetation Index (NDVI) and net radiation, ambient temperature and rain, with an obtained R2 of 0.83. Groundwater level fluctuations on a daily basis were studied as well. Data from a piezometer placed next to the wetland were fitted to rain changes (from two weather stations located in the proximity of the wetlands) by means of multiple regression and time series analysis; the R2 between calculated and measured values was higher than 0.98. Information from nearby weather stations was used for ordinary kriging, together with the Digital Elevation Model (DEM) developed using PCI software. Standard models (exponential, spherical, circular, Gaussian, linear) to describe spatial variation were tested. Ordinary cokriging between the height and rain variables was also tested to determine whether the accuracy of the interpolation would increase. The results showed no significant differences, given that the mean result of the spherical function for the rain samples after ordinary kriging was 58.06 with a standard deviation of 18.06, while the cokriging using a spherical function for the rain variable, a power function for the height variable and a spherical function for the cross variable (rain and height) had a mean of 57.58 and a standard deviation of 18.36. Threats of eutrophication were also studied, given the lack of awareness of neighbours and government deficiencies. Water quality was determined over the years; different parameters were studied to determine the chemical characteristics of the water. In addition, 600 pesticides were screened by gas and liquid chromatography. The results showed that coliforms, nitrogen, phosphorus and prochloraz were the most significant contaminants.
Keywords: DEM, evapotranspiration, geostatistics, NDVI
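A minimal sketch of the empirical-equation step described above: a multiple linear regression of NDVI on net radiation, ambient temperature and rain, fitted with scikit-learn on synthetic data (the paper reports R2 = 0.83 on measured data; everything below is an illustrative assumption).
```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 120  # hypothetical number of image dates / station records

# Hypothetical predictors: net radiation (W/m2), ambient temperature (°C), rain (mm).
net_radiation = rng.uniform(80, 220, n)
temperature = rng.uniform(6, 20, n)
rain = rng.uniform(0, 60, n)
X = np.column_stack([net_radiation, temperature, rain])

# Synthetic NDVI generated from an assumed linear relationship plus noise,
# only to demonstrate the fitting step; the paper fits measured NDVI instead.
ndvi = (0.15 + 0.0012 * net_radiation + 0.008 * temperature
        + 0.002 * rain + rng.normal(0, 0.03, n))

model = LinearRegression().fit(X, ndvi)
print("coefficients:", np.round(model.coef_, 4))
print("R^2:", round(model.score(X, ndvi), 2))
```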
Procedia PDF Downloads 120
3062 Lockit: A Logic Locking Automation Software
Authors: Nemanja Kajtez, Yue Zhan, Basel Halak
Abstract:
The significant rise in the cost of manufacturing nanoscale integrated circuits (ICs) has led the majority of IC design companies to outsource the fabrication of their products to other companies, often located in different countries. This multinational nature of the hardware supply chain has led to a host of security threats, including IP piracy, IC overproduction, and Trojan insertion. To combat these, researchers have proposed logic locking techniques to protect the intellectual property of the design and increase the difficulty of malicious modification of its functionality. However, the adoption of logic locking approaches has been rather slow due to the lack of integration with the IC production process and the limited efficacy of existing algorithms. This work automates the logic locking process by developing software in Python that performs the locking on a gate-level netlist and can be integrated with existing digital synthesis tools. Analysis of the latest logic locking algorithms demonstrated that the SFLL-HD algorithm is one of the most secure and versatile in trading off levels of protection against different types of attacks and was thus selected for implementation. The presented tool can also be expanded to incorporate the latest locking mechanisms to keep up with the fast-paced development in this field. The paper also presents a case study to demonstrate the functionality of the tool and how it could be used to explore the design space and compare different locking solutions. The source code of this tool is available freely from (https://www.researchgate.net/publication/353195333_Source_Code_for_The_Lockit_Tool).
Keywords: design automation, hardware security, IP piracy, logic locking
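Lockit implements SFLL-HD on gate-level netlists; as a much simpler illustration of the general key-gate idea (a sketch under stated assumptions, not the SFLL-HD algorithm or the tool's actual API), the snippet below inserts XOR/XNOR key gates on chosen nets of a toy netlist so that the circuit behaves correctly only for the right key.
```python
import random

# Toy gate-level netlist: gate name -> (gate type, list of input nets, output net).
netlist = {
    "g1": ("AND",  ["a", "b"], "n1"),
    "g2": ("OR",   ["n1", "c"], "n2"),
    "g3": ("NAND", ["n2", "d"], "out"),
}

def lock_netlist(netlist, n_keys, seed=0):
    """Insert XOR/XNOR key gates on randomly selected internal nets (basic logic locking)."""
    random.seed(seed)
    locked = dict(netlist)
    internal_nets = ["n1", "n2"]                 # nets eligible for key-gate insertion
    key = []
    for i, net in enumerate(random.sample(internal_nets, n_keys)):
        key_bit = random.randint(0, 1)
        gate_type = "XOR" if key_bit == 0 else "XNOR"   # correct key bit restores the signal
        locked_net = f"{net}_locked"
        # Re-route every consumer of `net` to the key gate's output.
        for name, (gtype, ins, out) in locked.items():
            locked[name] = (gtype, [locked_net if x == net else x for x in ins], out)
        locked[f"keygate_{i}"] = (gate_type, [net, f"key_{i}"], locked_net)
        key.append(key_bit)
    return locked, key

locked_netlist, correct_key = lock_netlist(netlist, n_keys=2)
print("correct key:", correct_key)
for name, gate in locked_netlist.items():
    print(name, gate)
```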
Procedia PDF Downloads 183
3061 Effect of Phaseolus vulgaris Inoculation on P. vulgaris and Zea mays Growth and Yield Cultivated in Intercropping
Authors: Nour Elhouda Abed, Bedj Mimi, Wahid Slimani, Mourad Atif, Abdelhakim Ouzzane, Hocine Irekti, Abdelkader Bekki
Abstract:
The most frequent system of cereal production in Algeria is fallow-wheat. This is an extensive system that meets only half of the country's cereal and fodder demand. Resorption of fallow has become a strategic necessity to ensure food security in response to the instability of supply and the persistence of high food prices on the world market. Despite several attempts to replace fallow with crops, the choice of the best crop remains open. Today, the agronomic and economic interest of legumes is well demonstrated. However, their cultivation remains marginalized because of the weakness and instability of their performance. In the context of improving legume and cereal crops as well as fallow resorption, we undertook to test, in the field, the effect of rhizobial inoculation of Phaseolus vulgaris in association with Zea mays. We first studied the genetic diversity of rhizobial strains nodulating P. vulgaris isolated from fifteen (15) different regions. ARDRA revealed 18 different genetic profiles. Symbiotic characterization highlighted a strain that highly significantly improved the fresh and dry weight of the host plant in comparison to the negative control (un-inoculated) and the positive control (inoculated with the reference strain CIAT 899). In the field, the selected strain significantly increased the growth and yield of P. vulgaris and Zea mays compared to the non-inoculated control. However, the mixed inoculation (selected strain + CIAT 899) did not give the best parameters, thus showing no synergy between the strains. These results support replacing fallow with a legume crop intercropped with cereals.
Keywords: fallow, intercropping, inoculation, legumes-cereals
Procedia PDF Downloads 367
3060 Enabling Cloud Adoption Based Secured Mobile Banking through Backend as a Service
Authors: P. S. Jagadeesh Kumar, S. Meenakshi Sundaram
Abstract:
With the increase in non-traditional competition, mobile banking faces an ever-changing commercial backdrop. Customer demands have become more intricate as customers request more convenience and control over their banking services. To advance and modernize mobile banking applications, it is increasingly necessary to move beyond this struggle through business model transformation. The dramatic changes taking place in mobile banking call for advanced approaches to security. By reforming and transforming older back-office systems into integrated mobile banking applications, banks can create a flexible and agile banking environment that can rapidly respond to new business requirements over cloud computing. Cloud computing is transforming ecosystems in numerous industries, and mobile banking is no exception, providing service innovation, greater flexibility to respond to improved security, and enhanced business intelligence at lower cost. Cloud technology offers secure deployment options that can help banks develop new customer experiences, empower operational relationships, and increase the speed of efficient banking transactions. Cloud adoption is escalating quickly since commercial mobile banking transactions can be secured through backend as a service by scrutinizing the security strategies of the cloud service provider along with the history of transaction details and their security-related practices.
Keywords: cloud adoption, backend as a service, business intelligence, secured mobile banking
Procedia PDF Downloads 254
3059 In Silico Analysis of Deleterious nsSNPs (Missense) of Dihydrolipoamide Branched-Chain Transacylase E2 Gene Associated with Maple Syrup Urine Disease Type II
Authors: Zainab S. Ahmed, Mohammed S. Ali, Nadia A. Elshiekh, Sami Adam Ibrahim, Ghada M. El-Tayeb, Ahmed H. Elsadig, Rihab A. Omer, Sofia B. Mohamed
Abstract:
Maple syrup urine disease (MSUD) is an autosomal recessive disease that causes a deficiency in the enzyme branched-chain alpha-keto acid (BCKA) dehydrogenase. The development of the disease has been associated with SNPs in the DBT gene. Despite that, the computational analysis of SNPs in coding and noncoding regions and their functional impact at the protein level remains limited. Hence, in this study, we carried out a comprehensive in silico analysis of missense SNPs predicted to have a harmful influence on DBT structure and function. In this study, eight different in silico prediction algorithms, SIFT, PROVEAN, MutPred, SNP&GO, PhD-SNP, PANTHER, I-Mutant 2.0 and MUpro, were used for screening nsSNPs in DBT. Additionally, to understand the effect of mutations on the strength of the interactions that hold the protein together, the ELASPIC server was used. Finally, the 3D structure of DBT was modeled using the Mutation3D and Chimera servers, respectively. Our results showed that a total of 15 nsSNPs confirmed by four tools (R301C, R376H, W84R, S268F, W84C, F276C, H452R, R178H, I355T, V191G, M444T, T174A, I200T, R113H, and R178C) were found damaging and can lead to a shift in DBT gene structure. Moreover, we found 7 nsSNPs located in the 2-oxoacid_dh catalytic domain, 5 nsSNPs in the E_3 binding domain and 3 nsSNPs in the biotin domain, so these nsSNPs may alter the putative structure of DBT’s domains. Furthermore, we found that all these nsSNPs are located on core residues of the protein and have the ability to change the stability of the protein. Additionally, we found that W84R, S268F, and M444T have high significance; they affect leucine, isoleucine, and valine, which reduces or disrupts the function of the E2 subunit of the BCKD complex, which the DBT gene encodes. In conclusion, based on our extensive in silico analysis, we report 15 nsSNPs that have a possible association with protein destabilization and disease-causing abilities. These candidate SNPs can aid future studies on Maple Syrup Urine Disease type II at the genetic level.
Keywords: DBT gene, ELASPIC, in silico analysis, UCSF Chimera
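A minimal sketch of the consensus filtering implied above: given hypothetical per-tool calls for a few DBT variants, it keeps those called deleterious by at least four tools, mirroring the paper's criterion. The predictions shown are placeholders, not the study's results.
```python
# Hypothetical per-variant predictions ("D" = deleterious, "T" = tolerated) from the
# eight tools named in the abstract; values below are placeholders for illustration.
TOOLS = ["SIFT", "PROVEAN", "MutPred", "SNP&GO", "PhD-SNP", "PANTHER", "I-Mutant 2.0", "MUpro"]

predictions = {
    "R301C": ["D", "D", "D", "D", "T", "D", "D", "T"],
    "W84R":  ["D", "D", "D", "D", "D", "D", "D", "D"],
    "T174A": ["D", "T", "T", "D", "T", "T", "D", "T"],
}

def consensus(predictions, min_tools=4):
    """Return variants called deleterious by at least `min_tools` predictors."""
    result = {}
    for variant, calls in predictions.items():
        n_deleterious = calls.count("D")
        if n_deleterious >= min_tools:
            result[variant] = n_deleterious
    return result

for variant, votes in consensus(predictions).items():
    print(f"{variant}: deleterious in {votes}/{len(TOOLS)} tools")
```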
Procedia PDF Downloads 201
3058 Algorithm for Quantification of Pulmonary Fibrosis in Chest X-Ray Exams
Authors: Marcela de Oliveira, Guilherme Giacomini, Allan Felipe Fattori Alves, Ana Luiza Menegatti Pavan, Maria Eugenia Dela Rosa, Fernando Antonio Bacchim Neto, Diana Rodrigues de Pina
Abstract:
It is estimated that, worldwide, one death every 10 seconds (about 2 million deaths each year) is attributed to tuberculosis (TB). Even after effective treatment, TB leaves sequelae such as, for example, pulmonary fibrosis, compromising the quality of life of patients. Evaluations of the aforementioned sequelae are usually performed subjectively by radiology specialists. Subjective evaluation may introduce inter- and intra-observer variation. The x-ray examination is the diagnostic imaging method most often performed in the monitoring of patients diagnosed with TB and the least costly for the institution. The application of computational algorithms is of utmost importance for a more objective quantification of pulmonary impairment in individuals with tuberculosis. The purpose of this research is the use of computational algorithms to quantify pulmonary impairment pre- and post-treatment in patients with pulmonary TB. The x-ray images of 10 patients with a TB diagnosis confirmed by examination of sputum smears were studied. Initially, the segmentation of the total lung area was performed (posteroanterior and lateral views), and then the region compromised by the pulmonary sequelae was targeted. Through morphological operators and the application of a signal-to-noise tool, it was possible to determine the compromised lung volume. The largest difference found pre- and post-treatment was 85.85% and the smallest was 54.08%.
Keywords: algorithm, radiology, tuberculosis, x-rays exam
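A schematic sketch of the quantification step (not the authors' exact pipeline): threshold the radiograph inside a previously segmented lung mask, clean the result with morphological operators, and report the fraction of the lung flagged as compromised. The threshold choice, structuring element sizes and synthetic image are assumptions.
```python
import numpy as np
from skimage import filters, morphology

def compromised_fraction(image, lung_mask):
    """Estimate the fraction of the segmented lung occupied by dense (fibrotic) tissue.

    `image` is a grayscale chest radiograph and `lung_mask` a boolean mask of the
    total lung area (assumed to come from a prior segmentation step).
    """
    # Otsu threshold inside the lung: denser (brighter) regions are flagged.
    threshold = filters.threshold_otsu(image[lung_mask])
    dense = (image > threshold) & lung_mask

    # Morphological clean-up: remove small noise specks and close small gaps.
    dense = morphology.remove_small_objects(dense, min_size=64)
    dense = morphology.binary_closing(dense, morphology.disk(3))

    return dense.sum() / lung_mask.sum()

# Usage with synthetic data (stand-in for a real posteroanterior radiograph):
rng = np.random.default_rng(0)
img = rng.normal(0.4, 0.05, (256, 256))
img[100:160, 60:120] += 0.3                      # simulated dense sequela region
mask = np.zeros((256, 256), dtype=bool)
mask[40:220, 40:220] = True                      # simulated lung field
print(f"compromised fraction: {compromised_fraction(img, mask):.1%}")
```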
Procedia PDF Downloads 419
3057 Corpus-Based Neural Machine Translation: Empirical Study Multilingual Corpus for Machine Translation of Opaque Idioms - Cloud AutoML Platform
Authors: Khadija Refouh
Abstract:
Culture-bound expressions have been a bottleneck for natural language processing (NLP) and comprehension, especially in the case of machine translation (MT). In the last decade, the field of machine translation has greatly advanced. Neural machine translation (NMT) has recently achieved considerable improvement in the quality of translation and has outperformed previous traditional translation systems in many language pairs. Neural machine translation is an artificial intelligence (AI) approach that applies deep neural networks to language processing. Despite this development, there remain some serious challenges that face neural machine translation when translating culture-bound expressions, especially for low-resource language pairs such as Arabic-English and Arabic-French, which is not the case with well-established language pairs such as English-French. Machine translation of opaque idioms from English into French is likely to be more accurate than translating them from English into Arabic. For example, the Google Translate application translated the sentence “What a bad weather! It rains cats and dogs.” as “يا له من طقس سيء! تمطر القطط والكلاب” in the target language Arabic, which is an inaccurate literal translation. The translation of the same sentence into the target language French was “Quel mauvais temps! Il pleut des cordes.”, where the Google Translate application used the accurate corresponding French idiom. This paper aims to perform NMT experiments towards better translation of opaque idioms using a high-quality, clean multilingual corpus. This corpus will be collected analytically from human-generated idiom translations. AutoML Translation, a Google neural machine translation platform, is used as a custom translation model to improve the translation of opaque idioms. The automatic evaluation of the custom model will be compared to Google NMT using the Bilingual Evaluation Understudy score (BLEU). BLEU is an algorithm for evaluating the quality of text which has been machine-translated from one natural language to another. Human evaluation is integrated to test the reliability of the BLEU score. The researcher will examine syntactical, lexical, and semantic features using Halliday's functional theory.
Keywords: multilingual corpora, natural language processing (NLP), neural machine translation (NMT), opaque idioms
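A minimal sketch of the BLEU evaluation step mentioned above, using NLTK's sentence-level BLEU with smoothing on a hypothetical idiom translation pair; the reference and candidate tokenizations are illustrative, not drawn from the study's corpus.
```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Hypothetical French reference and two candidate machine translations of
# "What a bad weather! It rains cats and dogs."
reference = ["quel", "mauvais", "temps", "!", "il", "pleut", "des", "cordes", "."]
literal_mt = ["quel", "mauvais", "temps", "!", "il", "pleut", "des", "chats", "et", "des", "chiens", "."]
idiomatic_mt = ["quel", "mauvais", "temps", "!", "il", "pleut", "des", "cordes", "."]

smooth = SmoothingFunction().method1  # avoids zero scores for short sentences
for name, candidate in [("literal", literal_mt), ("idiomatic", idiomatic_mt)]:
    score = sentence_bleu([reference], candidate, smoothing_function=smooth)
    print(f"{name:10s} BLEU = {score:.3f}")
```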
Procedia PDF Downloads 149
3056 Critical Design Futures: A Foresight 3.0 Approach to Business Transformation and Innovation
Authors: Nadya Patel, Jawn Lim
Abstract:
Foresight 3.0 is a synergistic methodology that encompasses systems analysis, future studies, capacity building, and forward planning. These components are interconnected, fostering a collective anticipatory intelligence that promotes societal resilience (Ravetz, 2020). However, traditional applications of these strands can often fall short, leading to missed opportunities and narrow perspectives. Therefore, Foresight 3.0 champions a holistic approach to tackling complex issues, focusing on systemic transformations and power dynamics. Businesses are pivotal in preparing the workforce for an increasingly uncertain and complex world. This necessitates the adoption of innovative tools and methodologies, such as Foresight 3.0, that can better equip young employees to anticipate and navigate future challenges. Firstly, the incorporation of its methodology into workplace training can foster a holistic perspective among employees. This approach encourages employees to think beyond the present and consider wider social, economic, and environmental contexts, thereby enhancing their problem-solving skills and resilience. This paper discusses our research on integrating Foresight 3.0's transformative principles with a newly developed Critical Design Futures (CDF) framework to equip organisations with the ability to innovate for the world's most complex social problems. This approach is grounded in 'collective forward intelligence,' enabling mutual learning, co-innovation, and co-production among a diverse stakeholder community, where business transformation and innovation are achieved.
Keywords: business transformation, innovation, foresight, critical design
Procedia PDF Downloads 81
3055 Finite-Sum Optimization: Adaptivity to Smoothness and Loopless Variance Reduction
Authors: Bastien Batardière, Joon Kwon
Abstract:
For finite-sum optimization, variance-reduced (VR) gradient methods compute at each iteration the gradient of a single function (or of a mini-batch), and yet achieve faster convergence than SGD thanks to a carefully crafted lower-variance stochastic gradient estimator that reuses past gradients. Another important line of research of the past decade in continuous optimization is adaptive algorithms such as AdaGrad, which dynamically adjust the (possibly coordinate-wise) learning rate to past gradients and thereby adapt to the geometry of the objective function. Variants such as RMSprop and Adam demonstrate outstanding practical performance that has contributed to the success of deep learning. In this work, we present AdaLVR, which combines the AdaGrad algorithm with loopless variance-reduced gradient estimators such as SAGA or L-SVRG and benefits from a straightforward construction and a streamlined analysis. We show that AdaLVR inherits both the good convergence properties of VR methods and the adaptive nature of AdaGrad: in the case of L-smooth convex functions, we establish a gradient complexity of O(n + (L + √(nL))/ε) without prior knowledge of L. Numerical experiments demonstrate the superiority of AdaLVR over state-of-the-art methods. Moreover, we empirically show that the RMSprop and Adam algorithms combined with variance-reduced gradient estimators achieve even faster convergence.
Keywords: convex optimization, variance reduction, adaptive algorithms, loopless
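A compact sketch of the idea behind AdaLVR under stated assumptions (it is not the authors' implementation): an L-SVRG-style loopless variance-reduced gradient estimator is plugged into a coordinate-wise AdaGrad step on a toy least-squares finite sum; the step size, refresh probability and problem are illustrative.
```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.01 * rng.normal(size=n)

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2."""
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    return A.T @ (A @ x - b) / n

def adalvr_sketch(steps=5000, eta=1.0, p=1.0 / 200, eps=1e-8):
    x = np.zeros(d)
    w = x.copy()                   # reference point of the loopless (L-SVRG) estimator
    g_ref = full_grad(w)
    G = np.zeros(d)                # AdaGrad accumulator of squared gradient coordinates
    for _ in range(steps):
        i = rng.integers(n)
        g = grad_i(x, i) - grad_i(w, i) + g_ref   # variance-reduced estimator
        G += g * g
        x -= eta * g / (np.sqrt(G) + eps)         # coordinate-wise AdaGrad step
        if rng.random() < p:                      # loopless: refresh reference with prob. p
            w = x.copy()
            g_ref = full_grad(w)
    return x

x_hat = adalvr_sketch()
print("final full-gradient norm:", np.linalg.norm(full_grad(x_hat)))
```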
Procedia PDF Downloads 71
3054 Road Systems as Environmental Barriers: An Overview of Roadways in Their Function as Fences for Wildlife Movement
Authors: Rachael Bentley, Callahan Gergen, Brodie Thiede
Abstract:
Roadways have a significant impact on the environment insofar as they function as barriers to wildlife movement, both through road mortality and through the resultant road avoidance. Roads have an immense presence worldwide, and it is predicted to increase substantially in the next thirty years. As roadways become even more common, it is important to consider their environmental impact and to mitigate the negative effects which they have on wildlife and wildlife mobility. In a thorough analysis of several related studies, a common conclusion was that roads cause habitat fragmentation, which can lead split populations to evolve differently, for better or for worse. Though some populations adapted positively to roadways, becoming more resistant to road mortality and more tolerant of noise and chemical contamination, many others experienced maladaptation, either due to chemical contamination in and around their environment, or because of genetic mutations from inbreeding when their population was fragmented too substantially to support a large enough group for healthy genetic exchange. Large mammals were especially susceptible to maladaptation from inbreeding, as they require larger areas to roam and therefore require even more space to sustain a healthy population. Regardless of whether a species evolved positively or negatively as a result of its proximity to a road, animals tended to avoid roads, making the loss of genetic diversity from habitat fragmentation an exceedingly prevalent issue in the larger discussion of road ecology. Additionally, the consideration of solutions, such as overpasses and underpasses, is crucial to ensuring the long-term survival of many wildlife populations. In studies addressing the effectiveness of overpasses and underpasses, animals seemed to adjust well to these sorts of solutions, but strategic placement, as well as proper sizing, proper height, shelter from road noise, and other considerations were important in construction. When an underpass or overpass was well-built and well-shielded from human activity, animals' usage of the structure increased significantly throughout its first five years, thus reconnecting previously divided populations. Still, these structures are costly, and they are often unable to fully address certain issues such as light, noise, and contaminants from vehicles. Therefore, the need for further discussion of new, creative solutions remains paramount. Roads are one of the most consistent and prominent features of today's landscape, but their environmental impacts are largely overlooked. While roads are useful for connecting people, they divide landscapes and animal habitats. Therefore, further research and investment in possible solutions is necessary to mitigate the negative effects which roads have on wildlife mobility and to prevent issues from the resultant habitat fragmentation.
Keywords: fences, habitat fragmentation, roadways, wildlife mobility
Procedia PDF Downloads 179
3053 Joubert Syndrome in Children as Multicentric Screening in Ten Different Places in World
Authors: Bajraktarevic Adnan, Djukic Branka, Sporisevic Lutvo, Krdzalic Zecevic Belma, Uzicanin Sajra, Hadzimuratovic Admir, Hadzimuratovic Hadzipasic Emina, Abduzaimovic Alisa, Kustric Amer, Suljevic Ismet, Serafi Ismail, Tahmiscija Indira, Khatib Hakam, Semic Jusufagic Aida, Haas Helmut, Vladicic Aleksandra, Aplenc Richard, Kadic Deovic Aida
Abstract:
Introduction: Joubert syndrome has an autosomal recessive pattern of inheritance. It refers to brain malfunction caused by underdevelopment of the cerebellar vermis. Associated conditions involving the eye (ocular disease) and the kidney are well described. Aims: Research helps us better understand this disease, Joubert syndrome, and can lead to advances in diagnosis and treatment. Methods: Several different conditions presenting the molar tooth sign and the characteristics of Joubert syndrome have been described in ten different places in the world. Carrier testing and diagnosis are available if one of these gene mutations has been identified in an affected family member. Results: The authors describe eleven cases of Joubert syndrome over twenty years. It is a clinically and genetically heterogeneous group of disorders characterized by hypoplasia of the cerebellar vermis with the characteristic neuroradiologic molar tooth sign, and accompanying neurologic symptoms, including dysregulation of the breathing pattern and developmental delay. We confirmed the diagnosis in twin sisters with Joubert syndrome with renal anomalies. Ocular symptoms were present in seven of the eleven cases (63.64%). The eleven cases were of different sexes, five boys (45.45%) and six girls (54.55%). Conclusions: Joubert syndrome is inherited as an autosomal recessive genetic disorder with several characteristic features.
Keywords: Joubert syndrome, cerebellooculorenal syndrome, autosomal recessive genetic disorder (ARGD), children
Procedia PDF Downloads 2783052 Automatic Identification and Classification of Contaminated Biodegradable Plastics using Machine Learning Algorithms and Hyperspectral Imaging Technology
Authors: Nutcha Taneepanichskul, Helen C. Hailes, Mark Miodownik
Abstract:
Plastic waste has emerged as a critical global environmental challenge, primarily driven by the prevalent use of conventional plastics derived from petrochemical refining and manufacturing processes in modern packaging. While these plastics serve vital functions, their persistence in the environment post-disposal poses significant threats to ecosystems. Addressing this issue necessitates several approaches, one of which involves the development of biodegradable plastics designed to degrade under controlled conditions, such as industrial composting facilities. It is imperative to note that compostable plastics are engineered for degradation within specific environments and are not suited for uncontrolled settings, including natural landscapes and aquatic ecosystems. The full benefits of compostable packaging are realized when subjected to industrial composting, preventing environmental contamination and waste stream pollution. Therefore, effective sorting technologies are essential to enhance composting rates for these materials and diminish the risk of contaminating recycling streams. In this study, we leverage hyperspectral imaging technology (HSI) coupled with advanced machine learning algorithms to accurately identify various types of plastics, encompassing conventional variants like Polyethylene terephthalate (PET), Polypropylene (PP), Low density polyethylene (LDPE), High density polyethylene (HDPE) and biodegradable alternatives such as Polybutylene adipate terephthalate (PBAT), Polylactic acid (PLA), and Polyhydroxyalkanoates (PHA). The dataset is partitioned into three subsets: a training dataset comprising uncontaminated conventional and biodegradable plastics, a validation dataset encompassing contaminated plastics of both types, and a testing dataset featuring real-world packaging items in both pristine and contaminated states. Five distinct machine learning algorithms, namely Partial Least Squares Discriminant Analysis (PLS-DA), Support Vector Machine (SVM), Convolutional Neural Network (CNN), Logistic Regression, and Decision Tree Algorithm, were developed and evaluated for their classification performance. Remarkably, the Logistic Regression and CNN models exhibited the most promising outcomes, achieving a perfect accuracy rate of 100% on the training and validation datasets. Notably, the testing dataset yielded an accuracy exceeding 80%. The successful implementation of this sorting technology within recycling and composting facilities holds the potential to significantly elevate recycling and composting rates. As a result, the envisioned circular economy for plastics can be established, thereby offering a viable solution to mitigate plastic pollution.Keywords: biodegradable plastics, sorting technology, hyperspectral imaging technology, machine learning algorithms
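To make the pixel-level classification step concrete, the following minimal Python sketch trains a logistic regression classifier on synthetic per-pixel spectra. The seven plastic classes are taken from the abstract, but the number of bands, the reflectance curves, and the noise model are invented for illustration and do not reflect the authors' dataset or preprocessing.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_bands = 224  # hypothetical number of spectral bands per pixel
classes = ["PET", "PP", "LDPE", "HDPE", "PBAT", "PLA", "PHA"]

# Synthetic spectra: each plastic type gets its own smooth reflectance curve plus noise.
X_parts, y_parts = [], []
for label, name in enumerate(classes):
    curve = np.sin(np.linspace(0, 3 + 0.5 * label, n_bands)) + 0.1 * label
    X_parts.append(curve + 0.05 * rng.standard_normal((200, n_bands)))
    y_parts.append(np.full(200, label))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

In a real sorting line the same fit/predict pattern would be applied per pixel of the hyperspectral cube, with the predicted class map then aggregated per object.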
Procedia PDF Downloads 803051 Power Iteration Clustering Based on Deflation Technique on Large Scale Graphs
Authors: Taysir Soliman
Abstract:
One of the currently popular clustering techniques is Spectral Clustering (SC) because of its advantages over conventional approaches such as hierarchical clustering, k-means, and other techniques. However, one of the disadvantages of SC is that it is time consuming, because it requires computing the eigenvectors. A number of attempts have been proposed to overcome this disadvantage, such as the Power Iteration Clustering (PIC) technique, a variant of SC. Some of PIC's advantages are: 1) its scalability and efficiency, 2) finding a single pseudo-eigenvector instead of computing the eigenvectors, and 3) obtaining a linear combination of the eigenvectors in linear time. However, its main disadvantage is an inter-class collision problem, because a single pseudo-eigenvector is not enough. Previous researchers developed Deflation-based Power Iteration Clustering (DPIC) to overcome the inter-class collision problem of PIC while retaining its efficiency. In this paper, we developed Parallel DPIC (PDPIC), which runs on the Apache Spark framework using sparse matrices, to improve time and memory complexity. To test the performance of PDPIC, we compared it to the SC, ESCG, and ESCALG algorithms on four small and nine large graph benchmark datasets, where PDPIC achieved higher accuracy and lower running time than the other compared algorithms.Keywords: spectral clustering, power iteration clustering, deflation-based power iteration clustering, Apache Spark, large graph
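The sketch below illustrates the core PIC idea with a simple deflation step on a toy graph. It is a single-machine NumPy illustration under assumed parameters, not the paper's sparse-matrix Spark implementation (PDPIC); the affinity construction, iteration counts, and data are placeholders.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel

def pic_vector(A, deflate=(), n_iter=200, tol=1e-6):
    """One pseudo-eigenvector of the row-normalized affinity matrix, by power iteration."""
    W = A / A.sum(axis=1, keepdims=True)       # row-normalized affinity matrix
    v = A.sum(axis=1)
    v = v / v.sum()                            # degree-based starting vector, as in PIC
    for _ in range(n_iter):
        v_new = W @ v
        for u in deflate:                      # deflation: remove components along
            v_new = v_new - (v_new @ u) * u    # previously found (unit-norm) vectors
        v_new = v_new / np.abs(v_new).sum()    # L1 normalization
        if np.abs(v_new - v).max() < tol:
            break
        v = v_new
    return v

rng = np.random.default_rng(1)
points = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ([0, 0], [3, 3], [0, 4])])
A = rbf_kernel(points, gamma=2.0)              # dense affinity matrix (toy-sized graph)

v1 = pic_vector(A)
v2 = pic_vector(A, deflate=[v1 / np.linalg.norm(v1)])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(np.column_stack([v1, v2]))
print("cluster sizes:", np.bincount(labels))

Each call returns one pseudo-eigenvector; the deflation step removes the directions already found so that the second vector carries complementary cluster information, which is what mitigates the inter-class collision problem.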
Procedia PDF Downloads 1893050 An Observer-Based Direct Adaptive Fuzzy Sliding Control with Adjustable Membership Functions
Authors: Alireza Gholami, Amir H. D. Markazi
Abstract:
In this paper, an observer-based direct adaptive fuzzy sliding mode (OAFSM) algorithm is proposed. In the proposed algorithm, the zero-input dynamics of the plant can be unknown. The input connection matrix is used to combine the sliding surfaces of individual subsystems, and an adaptive fuzzy algorithm is used to estimate an equivalent sliding mode control input directly. The fuzzy membership functions, which were determined by time-consuming trial-and-error processes in previous works, are adjusted by adaptive algorithms. The other advantage of the proposed controller is that the input gain matrix is not limited to being diagonal, i.e., the plant can be over/under-actuated provided that controllability and observability are preserved. An observer is constructed to directly estimate the state tracking error, and the nonlinear part of the observer is constructed by an adaptive fuzzy algorithm. The main advantage of the proposed observer is that the measured output is not limited to the first entry of a canonical-form state vector. The closed-loop stability of the proposed method is proved using a Lyapunov-based approach. The proposed method is applied numerically to a multi-link robot manipulator, which verifies the performance of the closed-loop control. Moreover, the performance of the proposed algorithm is compared with some conventional control algorithms.Keywords: adaptive algorithm, fuzzy systems, membership functions, observer
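As a very rough illustration of the sliding-mode component only, the sketch below simulates a scalar adaptive switching gain on a pendulum. It omits the fuzzy approximator, the observer, and the MIMO input gain matrix that are central to the proposed OAFSM scheme, and every numerical value is an assumption chosen for the example.

import numpy as np

# Simplified scalar adaptive sliding-mode law (not the OAFSM controller of the paper).
dt, T = 1e-3, 5.0
lam, gamma, phi = 5.0, 20.0, 0.05   # surface slope, adaptation rate, boundary layer
x, x_dot, k_hat = 0.5, 0.0, 0.0     # pendulum angle, angular rate, adaptive gain

for step in range(int(T / dt)):
    t = step * dt
    r, r_dot, r_ddot = np.sin(t), np.cos(t), -np.sin(t)      # reference trajectory
    e, e_dot = x - r, x_dot - r_dot
    s = e_dot + lam * e                                       # sliding surface
    u = r_ddot - lam * e_dot - k_hat * np.tanh(s / phi)       # smoothed switching control
    k_hat += gamma * abs(s) * dt                              # adaptive gain update law
    x_ddot = -9.81 * np.sin(x) - 0.5 * x_dot + u              # "unknown" plant dynamics
    x, x_dot = x + x_dot * dt, x_dot + x_ddot * dt            # Euler integration

print(f"final tracking error: {abs(x - np.sin(T)):.4f}, adapted gain: {k_hat:.2f}")

In the paper's formulation the switching gain and equivalent control are instead produced by adaptively tuned fuzzy systems driven by an observer-estimated tracking error.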
Procedia PDF Downloads 2063049 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification
Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos
Abstract:
Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancer in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist’s visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, and so histopathological examination of biopsy samples is currently considered to be the gold standard for obtaining definite diagnoses. Machine learning is defined as the study of computational algorithms that can use mathematical relationships and patterns, complex or not, from empirical and scientific data to make reliable decisions. Given the above, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are, or could potentially be, useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. Moreover, a future purpose is to present an alternative way of quick and accurate diagnosis in order to save time and resources in the daily medical workflow. Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas and 20 healthy children. The MR sequences used for every single patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence, only a principal slice was used, manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed following a number of steps, including noise reduction, bias-field correction, thresholding, co-registration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of features for investigation were chosen, including age, tumor shape characteristics, image intensity characteristics, and texture features. After selecting the features that achieve the highest accuracy using the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree, and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA and MATLAB platforms, respectively. Results-Conclusions: The results and the accuracy of image classification for each type of glioma by the four different algorithms are still in progress.Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology
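A minimal sketch of the feature-based classification stage is given below. It runs two of the four classifiers mentioned above (k-NN and SVM) on synthetic feature vectors; no real MRI-derived features are used, and the authors' WEKA/MATLAB pipeline is not reproduced, so the class structure and feature count are assumptions for illustration only.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
classes = ["ependymoma", "astrocytoma", "medulloblastoma", "healthy"]
n_per_class, n_features = 20, 12   # 20 patients per class; 12 hypothetical selected features

# Synthetic feature vectors standing in for age, shape, intensity and texture features.
X = np.vstack([rng.normal(loc=i, scale=1.5, size=(n_per_class, n_features))
               for i in range(len(classes))])
y = np.repeat(np.arange(len(classes)), n_per_class)

for name, model in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                    ("SVM", SVC(kernel="rbf", C=1.0))]:
    scores = cross_val_score(make_pipeline(StandardScaler(), model), X, y, cv=5)
    print(f"{name}: mean cross-validated accuracy = {scores.mean():.2f}")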
Procedia PDF Downloads 1493048 Prediction of Road Accidents in Qatar by 2022
Authors: M. Abou-Amouna, A. Radwan, L. Al-kuwari, A. Hammuda, K. Al-Khalifa
Abstract:
There is growing concern over the increasing incidence of road accidents and the consequent loss of human life in Qatar. In light of the future planned event in Qatar, the World Cup 2022, Qatar should take into consideration the future deaths caused by road accidents, and past trends should be considered to give a reasonable picture of what may happen in the future. Qatar's roads should be arranged and paved in a way that accommodates the high population expected at that time, since there will be a huge number of visitors from around the world. Qatar should also consider the road accident risks arising in that period and plan to maintain high-level safety strategies. Given the increase in the number of road accidents in Qatar from 1995 until 2012, the elements affecting and causing road accidents are analyzed in detail. This paper aims to identify and critique the factors that have a strong effect on causing road accidents in the State of Qatar, and to predict the total number of road accidents in Qatar in 2022. Alternative methods are discussed, and the most applicable ones according to previous research are selected for further study. The methods that fit the existing case in Qatar were the multiple linear regression model (MLR) and the artificial neural network (ANN). These methods are analyzed and their findings compared. We conclude that by using MLR the number of accidents in 2022 will reach 355,226, and by using ANN 216,264. We conclude that MLR gave better results than ANN because the artificial neural network does not fit data with a large range of variation well.Keywords: road safety, prediction, accident, model, Qatar
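The comparison between the two modelling approaches can be sketched as follows. The yearly accident totals used here are made up for illustration (the real 1995-2012 figures are not reproduced), so the printed predictions will not match the 355,226 and 216,264 values reported above.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
years = np.arange(1995, 2013).reshape(-1, 1)
# Hypothetical yearly accident counts with a roughly linear upward trend plus noise.
accidents = 40_000 + 9_000 * (years.ravel() - 1995) + rng.normal(0, 5_000, len(years))

# Multiple linear regression fitted on the raw year.
mlr = LinearRegression().fit(years, accidents)

# Small feed-forward network fitted on scaled inputs/outputs to keep training stable.
x_scaled, y_scaled = (years - 1995) / 30.0, accidents / 1e5
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=10_000, random_state=0)
ann.fit(x_scaled, y_scaled)

target = np.array([[2022]])
print("MLR prediction for 2022:", int(mlr.predict(target)[0]))
print("ANN prediction for 2022:", int(ann.predict((target - 1995) / 30.0)[0] * 1e5))

Extrapolating a small network beyond its training range tends to be unreliable, which is consistent with the abstract's observation that the ANN handled the wide-ranging data less well than MLR.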
Procedia PDF Downloads 258