Search results for: decision based artificial neural network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32965

28675 Algorithms for Run-Time Task Mapping in NoC-Based Heterogeneous MPSoCs

Authors: M. K. Benhaoua, A. K. Singh, A. E. Benyamina, P. Boulet

Abstract:

Mapping parallelized application tasks onto MPSoCs can be done either at design time (static) or at run-time (dynamic). Static mapping strategies find the best placement of tasks at design time and are therefore unsuitable for dynamic workloads and incapable of run-time resource management. The number of tasks or applications executing on an MPSoC platform can exceed the available resources, requiring efficient run-time mapping strategies to meet these constraints. This paper describes a new Spiral Dynamic Task Mapping heuristic for mapping applications onto NoC-based heterogeneous MPSoCs. The heuristic is based on a packing strategy and a routing algorithm, both also proposed in this paper. It tries to map the tasks of an application into a clustered region to reduce the communication overhead between communicating tasks: the tasks most related to each other are mapped in a spiral manner, and the best possible path load that minimizes the communication overhead is sought. We have built a simulation environment for experimental evaluation, mapping applications with varying numbers of tasks onto an 8x8 NoC-based heterogeneous MPSoC platform. We demonstrate that the new mapping heuristic, together with the proposed modified Dijkstra routing algorithm, reduces the total execution time and energy consumption of applications when compared to state-of-the-art run-time mapping heuristics reported in the literature.
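The modified Dijkstra routing algorithm itself is not detailed in this abstract, but the underlying idea of selecting the least-loaded path between two NoC tiles can be sketched as below. The graph representation, link-load values, and function name are illustrative assumptions, not the paper's implementation.

```python
import heapq

def dijkstra_min_load(graph, src, dst):
    """Find a path from src to dst minimizing accumulated link load.

    graph: {node: [(neighbor, link_load), ...]} where link_load is the
    current traffic on that NoC link (hypothetical units). Assumes dst
    is reachable from src.
    """
    dist = {src: 0}
    prev = {}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, load in graph[u]:
            nd = d + load
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # reconstruct the path from dst back to src
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]
```

In a run-time mapper, the load values would be updated as tasks are placed, so later routing decisions steer traffic away from congested links.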

Keywords: multiprocessor system on chip, MPSoC, network on chip, NoC, heterogeneous architectures, run-time mapping heuristics, routing algorithm

Procedia PDF Downloads 475
28674 Understanding Tacit Knowledge and DIKW

Authors: Bahadir Aydin

Abstract:

Today it is difficult to reach accurate knowledge because of the sheer mass of data, which makes the environment more and more chaotic. Data is a main pillar of intelligence, and there is a close tie between knowledge and intelligence. Information gathered from different sources can be modified, interpreted, and classified through a knowledge development process, which is applied in order to attain intelligence. Within this process, the effect of knowledge is crucial. Knowledge is classified as explicit and tacit knowledge. Tacit knowledge can be seen as only "the tip of the iceberg": it accounts for much more than we guess throughout the intelligence cycle. If the concept of intelligence is scrutinized, it can be seen to contain risks and threats as well as success. The main purpose of every organization is to be successful by eliminating risks and threats. Therefore, there is a need to connect or fuse existing information with the processes that can be used to develop it. With the help of this process, the decision-maker can be presented with a clear, holistic understanding as early as possible in the decision-making process. Planning, execution, and assessment are the key functions that connect information to knowledge. Shifting from the current traditional reactive approach to a proactive knowledge development approach would reduce extensive duplication of work in the organization. Through this new approach, knowledge can be used more effectively.

Keywords: knowledge, intelligence cycle, tacit knowledge, DIKW

Procedia PDF Downloads 503
28673 Surveying the Effect of Cybernetics on Knowledge Management from Users' Viewpoint Who Are Members of Electronic Discussion Groups (ALA, ALIA)

Authors: Mitra Ghiasi, Roghayeh Ghorbani Bousari

Abstract:

Nowadays, the aim of organizations is to gain sustainable competitive advantage; knowledge management enables this by developing their intellectual capital, encouraging innovation, and improving performance. Knowledge turns into science when it is used to improve decision making, decision quality, and the making of effective decisions. The current research investigates the relationship between cybernetics and knowledge management from the perspective of users who are members of the electronic discussion groups (ALA, ALIA). The research methodology is a survey of the correlational type. Cybernetics and knowledge management questionnaires were used for collecting data. The questionnaire, designed in electronic format, was distributed to the two electronic discussion groups over 30 days and completed by 100 members of each group. The findings of this research showed that although cybernetics has an impact on knowledge management, there is no significant difference between the ALA and ALIA users' views regarding the effect of cybernetics on knowledge management. The results also indicated that the conceptual model is consistent with the data collected from the sample.

Keywords: ALA discussion group, ALIA discussion group, cybernetics, knowledge management

Procedia PDF Downloads 225
28672 Stress-Strain Relation for Human Trabecular Bone Based on Nanoindentation Measurements

Authors: Marek Pawlikowski, Krzysztof Jankowski, Konstanty Skalski, Anna Makuch

Abstract:

Nanoindentation, or the depth-sensing indentation (DSI) technique, has proven very useful for measuring the mechanical properties of various tissues at the micro-scale. Bone tissue, both trabecular and cortical, is one of the tissues most commonly tested by means of DSI. Most often, such tests on bone samples are carried out to compare the mechanical properties of lamellar and interlamellar bone, osteonal bone, and compact versus cancellous bone. In this paper, a stress-strain relation for human trabecular bone is presented, based on the results of nanoindentation tests, and a constitutive model for human trabecular bone is formulated from them. The approach proposed by Oliver and Pharr is adapted. The tests were carried out on samples of trabecular tissue extracted from human femoral heads harvested during artificial hip joint implantation surgeries. Before sample preparation, the heads were kept in 95% alcohol at 4 °C; the cubic samples cut out of the heads were stored in the same conditions. The dimensions of the specimens were 25 mm x 25 mm x 20 mm. A total of 20 samples were tested; the age range of the donors was 56 to 83 years. The tests were conducted with a spherical indenter tip of 0.200 mm diameter, a maximum load P = 500 mN, and a loading rate of 500 mN/min. The data obtained from DSI tests only describe bone behaviour in terms of nanoindentation force versus nanoindentation depth. However, it is more interesting and useful to know the characteristics of trabecular bone in the stress-strain domain, which allows one to simulate trabecular bone behaviour in a more realistic way. The stress-strain curves obtained in the study show a relation between age and the mechanical behaviour of trabecular bone. It was also observed that the bone matrix of trabecular tissue exhibits an ability to absorb energy.
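As a rough illustration of the Oliver-Pharr analysis adapted in the study, the reduced modulus can be estimated from the unloading stiffness and the projected contact area of a spherical tip. The formula and the epsilon = 0.75 geometry factor are textbook Oliver-Pharr relations; the numeric inputs below are placeholder assumptions, not the authors' data.

```python
import math

def oliver_pharr_modulus(S, h_max, P_max, R, epsilon=0.75):
    """Estimate the reduced modulus E_r via the Oliver-Pharr method.

    S:     unloading stiffness dP/dh at peak load (mN/mm)
    h_max: maximum indentation depth (mm)
    P_max: peak load (mN)
    R:     spherical tip radius (mm)
    """
    h_c = h_max - epsilon * P_max / S            # contact depth
    A_c = math.pi * (2 * R * h_c - h_c ** 2)     # spherical contact area
    E_r = (math.sqrt(math.pi) / 2) * S / math.sqrt(A_c)
    return E_r
```

With consistent mN/mm units the result comes out in mN/mm² (kPa); converting to the tissue modulus would additionally require the indenter's elastic constants.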

Keywords: constitutive model, mechanical behaviour, nanoindentation, trabecular bone

Procedia PDF Downloads 201
28671 Nudging the Criminal Justice System into Listening to Crime Victims in Plea Agreements

Authors: Dana Pugach, Michal Tamir

Abstract:

Most criminal cases end with a plea agreement, an issue whose many aspects have been discussed extensively in the legal literature. One important feature, however, has gained little notice: crime victims’ place in plea agreements following the federal Crime Victims’ Rights Act of 2004. This law has granted victims some meaningful and potentially revolutionary rights, including the right to be heard in the proceeding and a right to appeal against a decision made while ignoring the victim’s rights. While victims’ rights literature has always emphasized the importance of such rights, references to this provision in the general literature about plea agreements are sparse, if they exist at all, and only a few cases mention this right. This article purports to bridge these two bodies of legal thinking – the vast literature concerning plea agreements and victims’ rights research – by using behavioral economics. The article will, firstly, trace the possible structural reasons for the failure of this right to materialize. The relevant incentives of all actors involved will be identified, as well as the inherent consequential processes that lead to the malfunction of victims’ rights. Secondly, the article will use nudge theory in order to suggest solutions that enhance incentives for the repeat players in the system (prosecution, judges, defense attorneys) and strengthen the interests of the weaker group – the crime victims. Behavioral psychology literature recognizes that the framework in which an individual confronts a decision can significantly influence that decision. Richard Thaler and Cass Sunstein developed the idea of ‘choice architecture’ – ‘the context in which people make decisions’ – which can be manipulated to make particular decisions more likely. Choice architectures can be changed by adjusting ‘nudges,’ influential factors that help shape human behavior without negating free choice.
Nudges require decision makers to make choices instead of accepting a familiar default option. In accordance with this theory, we suggest a rule whereby a judge should inquire into the victim’s view prior to accepting the plea. This suggestion leaves the judge’s discretion intact, while at the same time nudging her not to go directly to the default decision, i.e. automatically accepting the plea. Creating nudges that force actors to make choices is particularly significant when an actor intends to deviate from routine behaviors but experiences significant time constraints, as in the case of judges and plea bargains. The article finally recognizes some far-reaching possible results of the suggestion. These include meaningful changes to the earlier stages of the criminal process, even before reaching court, in line with the current criticism of the plea agreement machinery.

Keywords: plea agreements, victims' rights, nudge theory, criminal justice

Procedia PDF Downloads 309
28670 Oxidovanadium(IV) and Dioxidovanadium(V) Complexes: Efficient Catalyst for Peroxidase Mimetic Activity and Oxidation

Authors: Mannar R. Maurya, Bithika Sarkar, Fernando Avecilla

Abstract:

Peroxidase activity is successfully used in different industrial processes in medicine, the chemical industry, food processing, and agriculture. However, natural peroxidases bear some intrinsic drawbacks: denaturation by proteases, special storage requirements, and cost. Artificial enzyme mimics are now becoming a research focus because of their significant advantages over conventional enzymes – ease of preparation, low price, and good stability in activity – which overcome the drawbacks of natural enzymes, e.g. serine proteases. At present, a large number of artificial enzymes have been synthesized by assimilating a catalytic center into a variety of Schiff base complexes, ligand anchors, supramolecular complexes, hematin, porphyrins, and nanoparticles to mimic natural enzymes. Although a considerable number of vanadium complexes have been reported in recent years, reflecting a continuing increase in interest in bioinorganic chemistry, to the best of our knowledge the artificial enzyme mimics among vanadium complexes remain little explored. Recently, our group reported synthetic vanadium Schiff base complexes capable of mimicking peroxidases. Herein, we have synthesized oxidovanadium(IV) and dioxidovanadium(V) complexes of pyrazolone derivatives (extensively studied on account of their broad range of pharmacological applications). All these complexes were characterized by various spectroscopic techniques: FT-IR, UV-Visible, NMR (1H, 13C and 51V), elemental analysis, thermal studies, and single-crystal analysis. The peroxidase mimic activity was studied via the oxidation of pyrogallol to purpurogallin with hydrogen peroxide at pH 7, followed by measurement of the kinetic parameters. The Michaelis-Menten behavior shows excellent catalytic activity relative to the natural counterparts, e.g. V-HPO and HRP.
The obtained kinetic parameters (Vmax, Kcat) were also compared with those of peroxidase and haloperoxidase enzymes, making these complexes promising peroxidase mimics. The catalytic activity was also studied for the oxidation of 1-phenylethanol in the presence of H2O2 as oxidant. Various parameters, such as the amounts of catalyst and oxidant, reaction time, reaction temperature, and solvent, were considered in order to maximize the oxidative products of 1-phenylethanol.
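The Michaelis-Menten parameters mentioned above (Vmax and the derived kcat) are typically extracted from substrate-rate data. A minimal sketch is below; the Lineweaver-Burk linearization used here is one common textbook approach, an assumption on our part since the abstract does not specify the fitting method, and the sample data are synthetic.

```python
import numpy as np

def michaelis_menten_fit(S, v):
    """Estimate Vmax and Km from substrate concentrations S and initial
    rates v via the Lineweaver-Burk linearization:
        1/v = (Km/Vmax) * (1/S) + 1/Vmax
    """
    slope, intercept = np.polyfit(1.0 / np.asarray(S), 1.0 / np.asarray(v), 1)
    Vmax = 1.0 / intercept
    Km = slope * Vmax
    return Vmax, Km
```

With Vmax in hand, kcat follows as Vmax divided by the catalyst concentration. For noisy data a direct nonlinear fit of v = Vmax*S/(Km+S) is usually preferred over the double-reciprocal plot.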

Keywords: oxidovanadium(IV)/dioxidovanadium(V) complexes, NMR spectroscopy, crystal structure, peroxidase mimic activity, oxidation of pyrogallol, oxidation of 1-phenylethanol

Procedia PDF Downloads 324
28669 A Bayesian Classification System for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation and classification of institutional risk profiles supporting endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to identify the most important risk factors. Risk profiles then employ a risk-factor classifier and associated configurations to support digital preservation experts with a semi-automatic estimation of the endangerment group for file format risk profiles. Our goal is to make use of an expert knowledge base, acquired through a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation of risk factors along any required analysis dimension. Using the naive Bayes method, the decision support system recommends to an expert the matching risk profile group for the previously selected institutional risk profile. The proposed methods improve the visibility of risk factor values and the quality of the digital preservation process. The presented approach is designed to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and the values of file format risk profiles. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector; the goal is to visualise particular dimensions of this vector for analysis by an expert and to define its profile group. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
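The naive Bayes recommendation step can be illustrated with a tiny hand-rolled categorical classifier. The risk factors and group labels below are invented placeholders; the paper's real features come from its preservation survey.

```python
from collections import Counter, defaultdict

def train_naive_bayes(samples):
    """samples: list of (feature_tuple, label). Returns a classifier
    using Laplace-smoothed categorical likelihoods."""
    labels = Counter(lbl for _, lbl in samples)
    counts = defaultdict(Counter)   # (label, position) -> value counts
    values = defaultdict(set)       # position -> observed values
    for feats, lbl in samples:
        for i, v in enumerate(feats):
            counts[(lbl, i)][v] += 1
            values[i].add(v)

    def classify(feats):
        def score(lbl):
            p = labels[lbl] / sum(labels.values())      # prior P(label)
            for i, v in enumerate(feats):
                c = counts[(lbl, i)]
                # Laplace-smoothed likelihood P(value | label)
                p *= (c[v] + 1) / (sum(c.values()) + len(values[i]))
            return p
        return max(labels, key=score)

    return classify
```

A usage sketch: train on profiles described by hypothetical (format openness, vendor support) factors, then ask the classifier for the endangerment group of a new institutional profile.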

Keywords: linked open data, information integration, digital libraries, data mining

Procedia PDF Downloads 406
28668 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) provides the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, which help us find the rainfall amount that would cause flooding. An alternative approach, mostly tested on Italian Alpine catchments, is based on determining threshold discharges from past events and on finding whether or not an oncoming flood exceeds critical discharge thresholds determined beforehand. Both approaches suffer from large uncertainties in forecasting flash floods: due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach to flash flood forecasting. A prior probability of flooding is derived from historical data. Additional information, such as the antecedent moisture condition (AMC) and rainfall amounts over given thresholds, is used in computing the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed from the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach to flash flood forecasting provides more realistic forecasts than the FFG.
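The prior-likelihood-posterior update described above is just Bayes' rule, which can be written in a few lines. The numeric values in the usage example are illustrative, not the Posina basin results.

```python
def posterior_flood_probability(prior, likelihood_flood, likelihood_no_flood):
    """Bayes' rule: P(flood | evidence).

    prior:              P(flood), e.g. derived from historical records
    likelihood_flood:   P(evidence | flood), evidence being e.g. the
                        observed rainfall amount and AMC class
    likelihood_no_flood: P(evidence | no flood)
    """
    num = likelihood_flood * prior
    den = num + likelihood_no_flood * (1 - prior)
    return num / den
```

For example, a 10% prior combined with evidence four times likelier under flooding than under no flooding raises the posterior to roughly 31%, which is the kind of graded output a deterministic threshold cannot provide.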

Keywords: flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina

Procedia PDF Downloads 119
28667 DEM-Based Surface Deformation in the Jhelum Valley: Insights from River Profile Analysis

Authors: Syed Amer Mahmood, Rao Mansor Ali Khan

Abstract:

This study deals with the remote sensing analysis of tectonic deformation and its implications for understanding regional uplift conditions in the lower Jhelum and eastern Potwar. Identifying and mapping active structures is important in order to assess seismic hazards and to understand the Quaternary deformation of the region. Digital elevation models (DEMs) provide an opportunity to quantify land surface geometry in terms of elevation and its derivatives. Tectonic movement along faults is often reflected by characteristic geomorphological features such as elevation, stream offsets, slope breaks, and the contributing drainage area. River profile analysis in this region using the SRTM digital elevation model gives information about the tectonic influence on the local drainage network. The steepness and concavity indices have been calculated from the power-law scaling relation between channel slope and drainage area under steady-state conditions. An uplift rate map was prepared after carefully analysing the local drainage network, showing uplift rates in mm/year. The active faults in the region control local drainages, and the deflection of stream channels is further evidence of recent fault activity. The results show variable relative uplift conditions along the MBT and Riasi faults and provide a striking example of the recency of uplift, as well as of the influence of active tectonics on the evolution of young orogens.
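The steepness and concavity indices come from the stream-power scaling relation S = ks * A^(-theta), fitted in log-log space. A minimal sketch under that standard assumption is below; the synthetic area/slope values in the usage example stand in for DEM-extracted channel data.

```python
import numpy as np

def steepness_concavity(area, slope):
    """Fit the slope-area relation S = ks * A**(-theta) by linear
    regression in log-log space; returns (ks, theta).

    area:  contributing drainage areas along the channel (m^2)
    slope: local channel gradients at the same points (dimensionless)
    """
    coeffs = np.polyfit(np.log(area), np.log(slope), 1)
    theta = -coeffs[0]          # concavity index
    ks = np.exp(coeffs[1])      # steepness index
    return ks, theta
```

Spatial variations in ks between channel reaches, once normalized by a reference concavity, are what an uplift-rate map of this kind is typically built from.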

Keywords: quaternary deformation, SRTM DEM, geomorphometric indices, active tectonics, MBT

Procedia PDF Downloads 335
28666 The Interplay between Autophagy and Macrophages' Polarization in Wound Healing: A Genetic Regulatory Network Analysis

Authors: Mayada Mazher, Ahmed Moustafa, Ahmed Abdellatif

Abstract:

Background: Autophagy is a eukaryotic, highly conserved catabolic process implicated in many pathophysiologies, such as wound healing. Autophagy-associated genes serve as a scaffolding platform for signal transduction of macrophage polarization during the inflammatory phase of wound healing and the tissue repair process. In the current study, we report a model for the interplay between autophagy-associated genes and macrophage polarization-associated genes. Methods: In silico analysis was performed on 249 autophagy-related genes retrieved from the public autophagy database and on gene expression data retrieved from the Gene Expression Omnibus (GEO; GSE81922 and GSE69607 microarray data), yielding 199 macrophage polarization DEGs. An integrated protein-protein interaction network was constructed for the autophagy and macrophage gene sets. The gene sets were then used for GO term and pathway enrichment analysis. Common transcription factors for autophagy and macrophage polarization were identified. Finally, microRNAs enriched in both autophagy and macrophages were predicted. Results: In silico prediction of common transcription factors in the DEG macrophage and autophagy gene sets revealed a new role for the transcription factors HOMEZ, GABPA, ELK1 and REL, which commonly regulate the macrophage-associated genes IL6, IL1M, IL1B, NOS1 and SOC3 and the autophagy-related genes Atg12, Rictor, Rb1cc1, Gaparab1 and Atg16l1. Conclusions: Autophagy and macrophage polarization are interdependent cellular processes, and both autophagy-related proteins and macrophage polarization-related proteins coordinate tissue remodelling via a transcription factor and microRNA regulatory network. The current work highlights a potential new role for the transcription factors HOMEZ, GABPA, ELK1 and REL in wound healing.
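The "common transcription factor" step amounts to intersecting each TF's target set with both gene lists. The sketch below shows the idea; the target sets are invented placeholders, whereas real analyses would pull them from TF-binding databases.

```python
# Hypothetical TF -> target-gene sets; real data would come from
# TF-binding / regulatory databases, not from this listing.
tf_targets = {
    "HOMEZ": {"IL6", "Atg12", "Rictor"},
    "GABPA": {"IL1B", "NOS1", "Atg16l1"},
    "SP1":   {"IL6"},                      # targets only one gene set
}
macrophage_genes = {"IL6", "IL1B", "NOS1", "SOC3"}
autophagy_genes = {"Atg12", "Rictor", "Rb1cc1", "Atg16l1"}

# A TF is "common" if it targets genes in BOTH sets.
common_tfs = sorted(
    tf for tf, targets in tf_targets.items()
    if targets & macrophage_genes and targets & autophagy_genes
)
print(common_tfs)
```

With these toy sets, SP1 is excluded because it touches only the macrophage list, mirroring how TFs regulating just one process drop out of the intersection.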

Keywords: autophagy related proteins, integrated network analysis, macrophages polarization M1 and M2, tissue remodelling

Procedia PDF Downloads 131
28665 Variance-Aware Routing and Authentication Scheme for Harvesting Data in Cloud-Centric Wireless Sensor Networks

Authors: Olakanmi Oladayo Olufemi, Bamifewe Olusegun James, Badmus Yaya Opeyemi, Adegoke Kayode

Abstract:

The wireless sensor network (WSN) has made a significant contribution to the emergence of various intelligent services and cloud-based applications. Most of the time, the harvested data are stored on a cloud platform for efficient management and sharing among different services or users. However, the sensitivity of the data makes it prone to various confidentiality and performance-related attacks during and after harvesting. Various security schemes have been developed to ensure the integrity and confidentiality of WSN data. However, their specificity towards particular attacks, together with the resource constraints and heterogeneity of WSNs, makes most of these schemes imperfect. In this paper, we propose a secure variance-aware routing and authentication scheme with two-tier verification to collect, share, and manage WSN data. The scheme is capable of classifying a WSN into different subnets, detecting any attempt at a wormhole or black hole attack during harvesting, and enforcing access control on the harvested data stored in the cloud. The results of the analysis showed that the proposed scheme has more security functionality than other related schemes, solves most WSN and cloud security issues, prevents wormhole and black hole attacks, identifies attackers during data harvesting, and enforces access control on the harvested data stored in the cloud at low computational, storage, and communication overheads.

Keywords: data block, heterogeneous IoT network, data harvesting, wormhole attack, black hole attack, access control

Procedia PDF Downloads 57
28664 Identification of Vulnerable Zone Due to Cyclone-Induced Storm Surge in the Exposed Coast of Bangladesh

Authors: Mohiuddin Sakib, Fatin Nihal, Rabeya Akter, Anisul Haque, Munsur Rahman, Wasif-E-Elahi

Abstract:

Surge-generating cyclones are among the deadliest natural disasters threatening coastal environments and communities worldwide. Due to its geographic location, low-lying alluvial plain, geomorphologic characteristics, and 710-kilometer exposed coastline, Bangladesh is considered one of the countries most vulnerable to storm surge flooding. The Bay of Bengal has the highest potential for creating storm surge inundation in coastal areas. Bangladesh is the country most exposed to tropical cyclones, with an average of four cyclones striking every year. Frequent cyclone landfall has made the country one of the worst sufferers worldwide from cyclone-induced storm surge flooding and casualties. Between 1797 and 2009, Bangladesh was hit by 63 severe cyclones of different magnitudes. Though detailed studies have been done focusing on specific cyclones such as Sidr or Aila, no study has identified the vulnerable areas of the exposed coast based on the strength of cyclones. This study classifies the vulnerable areas of the exposed coast based on storm surge inundation depth and area for cyclones of varying strengths. Classifying the exposed coast based on hazard-induced cyclonic vulnerability will help decision makers adopt appropriate policies for reducing damage and loss.
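Classifying coastal zones by inundation depth reduces to binning modelled depths against class boundaries. The thresholds and class names below are illustrative assumptions; the study's actual boundaries would come from its surge modelling.

```python
import numpy as np

# Hypothetical class boundaries (metres of inundation depth).
bounds = [0.5, 1.5, 3.0]
labels = ["low", "moderate", "high", "extreme"]

def classify_vulnerability(depths):
    """Map storm-surge inundation depths (m) to vulnerability classes
    by binning against the boundaries above."""
    idx = np.digitize(depths, bounds)
    return [labels[i] for i in idx]

print(classify_vulnerability([0.2, 1.0, 2.5, 4.0]))
```

Applied cell-by-cell over a modelled inundation raster, the same binning yields a vulnerability map per cyclone strength.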

Keywords: cyclone, landfall, storm surge, exposed coastline, vulnerability

Procedia PDF Downloads 376
28663 Towards a Balancing Medical Database by Using the Least Mean Square Algorithm

Authors: Kamel Belammi, Houria Fatrim

Abstract:

An imbalanced data set, a problem often found in real-world applications, can seriously degrade the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face imbalanced numbers of samples between classes, with too few samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, together with some rules of thumb for determining those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN), and multilayer neural networks (MNN)) to the balanced data set. We have also compared the results obtained before and after the balancing method.
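A minimal sketch of a cost-sensitive LMS (Widrow-Hoff) update is given below: each sample's error is scaled by a per-sample cost weight, so rare-class samples can be penalized more heavily. The weighting rule (e.g. inverse class frequency) and hyperparameters are our assumptions, not the paper's exact rules of thumb.

```python
import numpy as np

def cost_sensitive_lms(X, y, weights, lr=0.01, epochs=200):
    """LMS with per-sample cost weights.

    X:       (n_samples, n_features) design matrix
    y:       (n_samples,) targets
    weights: (n_samples,) cost weights, e.g. n_majority/n_minority
             for rare-class samples and 1 otherwise
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi, ci in zip(X, y, weights):
            err = yi - xi @ w
            w += lr * ci * err * xi   # cost-weighted gradient step
    return w
```

With all weights equal to 1 this reduces to plain LMS; raising the weights of minority-class samples makes the fitted model trade majority-class error for minority-class accuracy.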

Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes

Procedia PDF Downloads 517
28662 Towards Creative Movie Title Generation Using Deep Neural Models

Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie

Abstract:

Deep machine learning techniques, including deep neural networks (DNNs), have been used to model language and dialogue for conversational agents performing tasks such as technical support, as well as general chit-chat. They have been shown to be capable of generating long, diverse and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate the training data and will only generate the concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used in a task that would normally require human creativity, whereby a human would read the movie description and/or watch the movie and come up with a compelling, interesting movie title. This task differs from simple summarization in that the movie title may not necessarily be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some additional information, e.g. the genre of the movie, and learns to output a movie title. The idea is that the DNN will learn certain techniques and approaches that a human movie titler may deploy and that may not be immediately obvious to the human eye. To give an example of a generated movie title, for the movie synopsis ‘A hitman concludes his legacy with one more job, only to discover he may be the one getting hit.’, the original, true title is ‘The Driver’ and the one generated by the model is ‘The Masquerade’. A human evaluation was conducted in which the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: ‘creativity’, ‘naturalness’ and ‘suitability’. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to the scores of the human-generated movie title, with means m=3.11 and m=3.12, respectively.
There is room for improvement in these models, as they were rated significantly less ‘natural’ and ‘suitable’ than the human title. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results are, however, encouraging given the comparison with a highly considered, well-crafted human-generated movie title. Movie titles go through a rigorous process of assessment by experts and focus groups who have watched the movie; this process is in place due to the large amount of money at stake and the importance of creating an effective title that captures the audience’s attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
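The seq2seq pipeline described above can be sketched, in heavily simplified form, as an encoder that folds the description and genre tokens into a context vector plus a decoder that greedily emits title tokens. Everything here is an illustrative assumption: the toy vocabulary, plain RNN cells, and random untrained weights stand in for the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["<s>", "</s>", "the", "masquerade", "driver",
         "hitman", "job", "<genre:thriller>"]
V, H = len(vocab), 16
idx = {w: i for i, w in enumerate(vocab)}

# Random (UNTRAINED) parameters -- a real system learns these.
E = rng.normal(0, 0.1, (V, H))        # token embeddings
W_enc = rng.normal(0, 0.1, (H, H))    # encoder recurrence
W_dec = rng.normal(0, 0.1, (H, H))    # decoder recurrence
W_out = rng.normal(0, 0.1, (H, V))    # hidden -> vocabulary logits

def encode(tokens):
    """Fold description + genre tokens into one context vector."""
    h = np.zeros(H)
    for t in tokens:
        h = np.tanh(E[idx[t]] + W_enc @ h)   # simple RNN step
    return h

def greedy_decode(h, max_len=5):
    """Emit title tokens greedily until </s> or max_len."""
    out, tok = [], "<s>"
    for _ in range(max_len):
        h = np.tanh(E[idx[tok]] + W_dec @ h)
        tok = vocab[int(np.argmax(h @ W_out))]
        if tok == "</s>":
            break
        out.append(tok)
    return out

title = greedy_decode(encode(["<genre:thriller>", "hitman", "job"]))
print(title)
```

With trained weights, the decoder's argmax step is what produces titles like 'The Masquerade'; production systems typically replace greedy decoding with beam search.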

Keywords: creativity, deep machine learning, natural language generation, movies

Procedia PDF Downloads 311
28661 Multi-Objective Optimization in Carbon Abatement Technology Cycles (CAT) and Related Areas: Survey, Developments and Prospects

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emanuele

Abstract:

An infinitesimal increase in performance can yield an immense reduction in the operating and capital expenses of a power generation system. Therefore, studies are constantly being carried out to improve both conventional and novel power cycles. Globally, power producers are constantly researching ways to minimize emissions and to reduce the total cost rate of power plants. A substantial spurt of low-carbon cycle technologies has been suggested and studied; however, they all have their limitations and financial implications. In the area of carbon abatement in power plants, three major objectives conflict: the cost rate of the plant, the power output, and the environmental impact, since an improvement in one of these parameters directly affects the others. This poses a multi-objective problem, and it is paramount to be able to discern the points where improving one objective degrades another; hence the need for a Pareto-based optimization algorithm. Such an algorithm finds the set of solutions that cannot be improved in one objective without worsening another, and thereby helps the user, operator, or designer make an informed decision. This paper sheds light on the areas in which multi-objective optimization has been applied to carbon abatement technologies in the last five years, on developments, and on prospects.
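The Pareto-optimal set mentioned above is the subset of candidate designs not dominated by any other candidate. A minimal nondominated filter is sketched below; the two-objective sample points are illustrative, and in the plant-design setting the tuples would hold cost rate, emissions, and negated power output so that all objectives are minimized.

```python
def pareto_front(points):
    """Return the nondominated subset of `points`, assuming every
    objective is to be minimized."""
    def dominates(q, p):
        # q dominates p: no worse in all objectives, better in at least one
        return (all(qi <= pi for qi, pi in zip(q, p))
                and any(qi < pi for qi, pi in zip(q, p)))
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

This brute-force check is O(n^2); evolutionary algorithms such as NSGA-II apply the same dominance test inside a population loop to approximate the front for expensive plant models.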

Keywords: gas turbine, low carbon technology, Pareto optimal, multi-objective optimization

Procedia PDF Downloads 777
28660 Lessons from Implementation of a Network-Wide Safety Huddle in Behavioral Health

Authors: Deborah Weidner, Melissa Morgera

Abstract:

The model of care delivery in the Behavioral Health Network (BHN) is integrated across all five regions of Hartford Healthcare and thus spans the entirety of the state of Connecticut, with care provided in seven inpatient settings and over 30 ambulatory outpatient locations. While safety has been a core priority of the BHN in alignment with High Reliability practices, safety initiatives have historically been facilitated locally in each region or within each entity, with interventions implemented locally as opposed to throughout the network. To address this, the BHN introduced a network wide Safety Huddle during 2022. Launched in January, the BHN Safety Huddle brought together internal stakeholders, including medical and administrative leaders, along with executive institute leadership, quality, and risk management. By bringing leaders together and introducing a network-wide safety huddle into the way we work, the benefit has been an increase in awareness of safety events occurring in behavioral health areas as well as increased systemization of countermeasures to prevent future events. One significant discussion topic presented in huddles has pertained to environmental design and patient access to potentially dangerous items, addressing some of the most relevant factors resulting in harm to patients in inpatient and emergency settings for behavioral health patients. The safety huddle has improved visibility of potential environmental safety risks through the generation of over 15 safety alerts cascaded throughout the BHN and also spurred a rapid improvement project focused on standardization of patient belonging searches to reduce patient access to potentially dangerous items on inpatient units. Safety events pertaining to potentially dangerous items decreased by 31% as a result of standardized interventions implemented across the network and as a result of increased awareness. 
A second positive outcome originating from the BHN Safety Huddle was the implementation of a recommendation to increase the emergency Narcan® (naloxone) supply on hand in ambulatory settings of the BHN after incidents involving accidental overdose resulted in higher doses of naloxone administration. By increasing the emergency supply of naloxone on hand in all ambulatory and residential settings, colleagues are better prepared to respond in an emergency should a patient experience an overdose while on site. Lastly, discussions in the safety huddle spurred a new initiative within the BHN to improve responsiveness to assaultive incidents through a consultation service. This consult service, aligned with one of the network's improvement priorities to reduce harm events related to assaultive incidents, was born out of a huddle discussion in which it was identified that additional interventions may be needed in the clinical care of patients experiencing multiple and/or frequent safety events.

Keywords: quality, safety, behavioral health, risk management

Procedia PDF Downloads 72
28659 Passive Retrofitting Strategies for Windows in Hot and Humid Climate Vijayawada

Authors: Monica Anumula

Abstract:

Nowadays, human beings attain the comfort zone artificially, heating, cooling, and lighting the spaces they live in; primary importance is given to the aesthetics of the building, and buildings are not designed to protect their occupants from the climate. They depend on artificial sources of energy, resulting in energy wastage. In order to reduce the amount of energy spent in the construction industry and to meet the Energy Package goals by 2020, new ways of constructing houses are required. The larger part of a building's energy consumption is directly related to architectural aspects; hence, nature has to be integrated into the building design to attain the comfort zone and reduce dependency on artificial sources of energy. This research develops bioclimatic design strategies and techniques for the walls and roofs of Vijayawada houses. Design strategies and techniques from cases with a similar climate, such as Kerala and Mangalore, are studied and analyzed in this paper. Understanding the vernacular architecture and modern techniques of these cases and implementing them in the housing of Vijayawada not only decreases energy consumption but also enhances the socio-cultural values of Vijayawada. This study focuses on the comparison of vernacular techniques and modern bioclimatic building strategies to attain thermal comfort and energy reduction in a hot and humid climate. This research provides further thinking on new strategies that combine vernacular and modern bioclimatic techniques.

Keywords: bioclimatic design, energy consumption, hot and humid climates, thermal comfort

Procedia PDF Downloads 165
28658 In-Flight Radiometric Performances Analysis of an Airborne Optical Payload

Authors: Caixia Gao, Chuanrong Li, Lingli Tang, Lingling Ma, Yaokai Liu, Xinhong Wang, Yongsheng Zhou

Abstract:

Performance analysis of a remote sensing sensor is required to pursue a range of scientific research and application objectives. Laboratory analysis of any remote sensing instrument is essential, but not sufficient to establish valid in-flight performance. In this study, with the aid of in situ measurements and the corresponding image of a three-gray-scale permanent artificial target, the in-flight radiometric performance analyses (in-flight radiometric calibration, dynamic range and response linearity, signal-to-noise ratio (SNR), radiometric resolution) of a self-developed short-wave infrared (SWIR) camera are performed. To acquire the in-flight calibration coefficients of the SWIR camera, the at-sensor radiances (Li) for the artificial targets are first simulated with in situ measurements (atmosphere parameters and spectral reflectance of the target) and viewing geometries using the MODTRAN model. With these radiances and the corresponding digital numbers (DN) in the image, a straight line with the formulation L = G × DN + B is fitted by a minimization regression method, and the fitted coefficients, G and B, are the in-flight calibration coefficients. The high point (LH) and the low point (LL) of the dynamic range can then be described as LH = G × DNH + B and LL = B, respectively, where DNH is equal to 2^n − 1 (n is the quantization bit number of the payload). Meanwhile, the sensor's response linearity (δ) is described by the correlation coefficient of the regressed line. The results show that the calibration coefficients G and B are 0.0083 W·sr−1m−2µm−1 and −3.5 W·sr−1m−2µm−1, respectively; the low point of the dynamic range is −3.5 W·sr−1m−2µm−1 and the high point is 30.5 W·sr−1m−2µm−1; the response linearity is approximately 99%. Furthermore, an SNR normalization method is used to assess the sensor's SNR, and the normalized SNR is about 59.6 when the mean radiance is 11.0 W·sr−1m−2µm−1; subsequently, the radiometric resolution is calculated to be about 0.1845 W·sr−1m−2µm−1.
Moreover, in order to validate the results, a comparison of the measured radiance with radiative-transfer-code predictions over four portable artificial targets with reflectances of 20%, 30%, 40%, and 50%, respectively, is performed. It is noted that the relative error of the calibration is within 6.6%.
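The calibration fit and dynamic-range computation described above can be sketched in a few lines of Python. This is an illustrative least-squares implementation, not the authors' code; the synthetic DN/radiance pairs and the 12-bit quantization are assumptions for the example.

```python
# Illustrative least-squares fit of L = G*DN + B and the derived
# dynamic-range end points. DN/radiance pairs and the 12-bit
# quantization below are assumptions, not the paper's measurements.

def fit_calibration(dns, radiances):
    """Ordinary least-squares fit of L = G*DN + B; returns (G, B)."""
    n = len(dns)
    mean_dn = sum(dns) / n
    mean_l = sum(radiances) / n
    cov = sum((d - mean_dn) * (l - mean_l) for d, l in zip(dns, radiances))
    var = sum((d - mean_dn) ** 2 for d in dns)
    g = cov / var
    return g, mean_l - g * mean_dn

def dynamic_range(g, b, quant_bits):
    """Low point LL = B (at DN = 0); high point LH = G*(2**n - 1) + B."""
    return b, g * (2 ** quant_bits - 1) + b

if __name__ == "__main__":
    dns = [500, 1500, 2500, 3500]
    radiances = [0.0083 * dn - 3.5 for dn in dns]   # synthetic, noise-free
    g, b = fit_calibration(dns, radiances)          # recovers G = 0.0083, B = -3.5
    low, high = dynamic_range(g, b, quant_bits=12)
    print(g, b, low, high)
```

With noise-free synthetic data the fit recovers the coefficients exactly; with real image DNs the residuals of this regression are what the response-linearity figure summarizes.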

Keywords: calibration and validation site, SWIR camera, in-flight radiometric calibration, dynamic range, response linearity

Procedia PDF Downloads 259
28657 Informing Lighting Designs Through a Comprehensive Review of Light Pollution Impacts

Authors: Stephen M. Simmons, Stuart W. Baur, William L. Gillis

Abstract:

In recent years, increasing concern has been shown towards the issue of light pollution, especially with the spread of brighter, more blue-rich LED bulbs. Much research has been conducted in order to study the effects of artificial light at night, and many adverse impacts have been discovered, such as circadian disruption, degradation of the night sky, and interference with the processes and behaviors of plants and animals. Despite a plethora of information in the literature regarding the numerous ill effects of this type of pollution, there does not appear to be a complete summary of these impacts, including their magnitudes, which would facilitate the balancing of risks and benefits in the design of an exterior lighting system. This paper provides a comprehensive review of the known impacts of light pollution, divided into four categories - human health, night sky, plants, and animals; additionally, it includes a synopsis of what likely remains unknown at this point in time. This review will attempt to showcase the relative significance of different impacts within each category, as well as their sensitivity to changes in lighting specifications (brightness, color temperature, shielding, and mounting height). Methods to be employed in this research include an extensive literature review and the gathering of expert knowledge and opinions. The findings of this review will be used to inform the creation of an optimized lighting design for the Missouri University of Science and Technology campus. It is hoped that future research will explore the known impacts of light pollution further, as well as search for what still remains to be found regarding the consequences of artificial light at night.

Keywords: comprehensive review, impacts, light pollution, lighting design, literature review

Procedia PDF Downloads 119
28656 Artificial Intelligence for Generative Modelling

Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta

Abstract:

As technology advances towards high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the usage of ‘Generative Design using Artificial Intelligence’ to build better models that adapt operations like selection, mutation, and crossover to generate results. The human mind thinks of the simplest approach while designing an object, but the intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and comes up with multiple solutions, iterating towards a sturdy design with the most optimal parameters, saving huge amounts of time and resources. The new production techniques at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, drawing on biomimicry of processes that nature has evolved over millions of years. The computer uses parametric models to generate newer models in an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares generative design with the topology optimization technology previously used to generate CAD models. Finally, this paper shows the performance of the algorithms and how they help in designing resource-efficient models.
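The selection, mutation, and crossover operations mentioned above can be illustrated with a minimal genetic algorithm on bit strings. This is only a sketch: the one-max fitness stands in for a real evaluation of a parametric CAD model against boundary conditions, and all rates and population sizes are arbitrary choices.

```python
import random

def evolve(fitness, bits=16, pop_size=30, generations=60, p_mut=0.1, seed=1):
    """Minimal GA: tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]

    def pick():
        a, b = rng.sample(pop, 2)                 # selection: tournament of two
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < p_mut:              # bit-flip mutation
                i = rng.randrange(bits)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

if __name__ == "__main__":
    # One-max fitness (count of ones) stands in for scoring a CAD model.
    best = evolve(fitness=sum)
    print(sum(best))
```

In a generative-design setting, each genome would encode model parameters and the fitness function would evaluate the resulting geometry against the stated boundary conditions.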

Keywords: genetic algorithm, biomimicry, generative modeling, non-domination techniques

Procedia PDF Downloads 133
28655 Voice Liveness Detection Using Kolmogorov Arnold Networks

Authors: Arth J. Shah, Madhu R. Kamble

Abstract:

Voice biometric liveness detection is designed to certify that the voice data presented during authentication is genuine and not a recording or a synthetic voice. With the rise of deepfakes and other comparably sophisticated spoofing generation techniques, it is becoming challenging to ensure that the person on the other end is a live speaker. A Voice Liveness Detection (VLD) system is a group of security measures that detect and prevent voice spoofing attacks. Motivated by the recent development of the Kolmogorov-Arnold Network (KAN), based on the Kolmogorov-Arnold representation theorem, we propose KAN for the VLD task. To date, multilayer perceptron (MLP) based classifiers have been used for such classification tasks. We aim to capture not only the compositional structure of the model but also to optimize the values of the univariate functions. This study presents a mathematical as well as experimental analysis of KAN for VLD tasks, thereby opening a new perspective for scientists working on speech and signal processing tasks. The study combines traditional signal processing with new deep learning models, which proves to be a better combination for VLD tasks. The experiments are performed on the POCO and ASVspoof 2017 V2 databases. We used constant-Q transform (CQT), Mel, and short-time Fourier transform (STFT) based front-end features, with CNN, BiLSTM, and KAN back-end classifiers. The best accuracy is 91.26% on the POCO database, using STFT features with the KAN classifier. On the ASVspoof 2017 V2 database, the lowest EER obtained is 26.42%, using CQT features with KAN as the classifier.
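The Kolmogorov-Arnold representation underlying KANs writes a multivariate function as sums of univariate functions, f(x_1, ..., x_n) = sum_q Phi_q( sum_p phi_qp(x_p) ). The sketch below only evaluates a forward pass of that structure with fixed toy univariate functions; in an actual KAN the phi_qp and Phi_q are learnable (e.g., splines), and nothing here reproduces the paper's classifier.

```python
import math

# Forward pass in the Kolmogorov-Arnold form
#   f(x_1, ..., x_n) = sum_q Phi_q( sum_p phi_qp(x_p) )
# with fixed toy univariate functions; a trained KAN would learn them.

def kan_forward(x, inner, outer):
    """inner[q][p] and outer[q] are univariate callables."""
    return sum(
        phi_out(sum(inner[q][p](xp) for p, xp in enumerate(x)))
        for q, phi_out in enumerate(outer)
    )

if __name__ == "__main__":
    inner = [[math.sin, math.cos], [abs, math.tanh]]   # toy phi_qp
    outer = [lambda s: s * s, math.exp]                # toy Phi_q
    print(kan_forward([0.5, 1.0], inner, outer))
```

The contrast with an MLP is visible in the structure: an MLP applies fixed nonlinearities to learned linear combinations, whereas the KAN form places the (learnable) nonlinearities on the edges themselves.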

Keywords: Kolmogorov Arnold networks, multilayer perceptron, pop noise, voice liveness detection

Procedia PDF Downloads 18
28654 Diagnostic Performance of Mean Platelet Volume in the Diagnosis of Acute Myocardial Infarction: A Meta-Analysis

Authors: Kathrina Aseanne Acapulco-Gomez, Shayne Julieane Morales, Tzar Francis Verame

Abstract:

Mean platelet volume (MPV) is the most accurate measure of platelet size and is routinely measured by most automated hematology analyzers. Several studies have shown associations between MPV and cardiovascular risks and outcomes. Although its measurement may provide useful data, MPV remains a diagnostic tool yet to be included in routine clinical decision-making. The aim of this systematic review and meta-analysis is to determine summary estimates of the diagnostic accuracy of mean platelet volume for the diagnosis of myocardial infarction among adult patients with angina and/or its equivalents, in terms of sensitivity, specificity, diagnostic odds ratio, and likelihood ratios, and to determine the difference in mean MPV values between those with MI and non-MI controls. The primary search was done through the electronic databases PubMed, Cochrane Review CENTRAL, HERDIN (Health Research and Development Information Network), Google Scholar, the Philippine Journal of Pathology, and the Philippine College of Physicians Philippine Journal of Internal Medicine. The reference lists of original reports were also searched. Cross-sectional, cohort, and case-control articles studying the diagnostic performance of mean platelet volume in the diagnosis of acute myocardial infarction in adult patients were included. Studies were included if: (1) a CBC was taken upon presentation to the ER or upon admission (within 24 hours of symptom onset); (2) myocardial infarction was diagnosed with serum markers, ECG, or according to guidelines accepted by the cardiology societies (American Heart Association (AHA), American College of Cardiology (ACC), European Society of Cardiology (ESC)); and (3) outcomes were measured as a significant difference and/or sensitivity and specificity. The authors independently screened all identified potential studies for inclusion.
Eligible studies were appraised using well-defined criteria. Any disagreement between the reviewers was resolved through discussion and consensus. The overall mean MPV value of those with MI (9.702 fl; 95% CI 9.07 – 10.33) was higher than in those of the non-MI control group (8.85 fl; 95% CI 8.23 – 9.46). Interpretation of the calculated t-value of 2.0827 showed that there was a significant difference in the mean MPV values of those with MI and those of the non-MI controls. The summary sensitivity (Se) and specificity (Sp) for MPV were 0.66 (95% CI; 0.59 - 0.73) and 0.60 (95% CI; 0.43 – 0.75), respectively. The pooled diagnostic odds ratio (DOR) was 2.92 (95% CI; 1.90 – 4.50). The positive likelihood ratio of MPV in the diagnosis of myocardial infarction was 1.65 (95% CI; 1.20 – 22.27), and the negative likelihood ratio was 0.56 (95% CI; 0.50 – 0.64). The intended role for MPV in the diagnostic pathway of myocardial infarction would perhaps be best as a triage tool. With a DOR of 2.92, MPV values can discriminate between those who have MI and those without. For a patient with angina presenting with elevated MPV values, it is 1.65 times more likely that he has MI. Thus, it is implied that the decision to treat a patient with angina or its equivalents as a case of MI could be supported by an elevated MPV value.
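The relationships among the reported summary statistics can be checked directly: LR+ = Se/(1 − Sp), LR− = (1 − Se)/Sp, and DOR = LR+/LR−. The sketch below applies these formulas to the summary sensitivity and specificity from the abstract; the pooled meta-analytic estimates (DOR 2.92, LR− 0.56) differ slightly from these point computations because they are pooled across studies rather than derived from the summary Se/Sp.

```python
def diagnostic_ratios(sensitivity, specificity):
    """Likelihood ratios and diagnostic odds ratio from Se and Sp."""
    lr_pos = sensitivity / (1.0 - specificity)   # LR+ = Se / (1 - Sp)
    lr_neg = (1.0 - sensitivity) / specificity   # LR- = (1 - Se) / Sp
    return lr_pos, lr_neg, lr_pos / lr_neg       # DOR = LR+ / LR-

if __name__ == "__main__":
    lr_pos, lr_neg, dor = diagnostic_ratios(0.66, 0.60)   # summary Se, Sp
    print(round(lr_pos, 2), round(lr_neg, 2), round(dor, 2))  # 1.65 0.57 2.91
```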

Keywords: mean platelet volume, MPV, myocardial infarction, angina, chest pain

Procedia PDF Downloads 68
28653 Machine Learning Analysis of Eating Disorders Risk, Physical Activity and Psychological Factors in Adolescents: A Community Sample Study

Authors: Marc Toutain, Pascale Leconte, Antoine Gauthier

Abstract:

Introduction: Eating Disorders (ED), such as anorexia, bulimia, and binge eating, are psychiatric illnesses that mostly affect young people. The main symptoms concern eating (restriction, excessive food intake) and weight-control behaviors (laxatives, vomiting). Psychological comorbidities (depression, executive function disorders, etc.) and problematic behaviors toward physical activity (PA) are commonly associated with ED. Knowledge of ED risk factors is still lacking, and more community sample studies are needed to improve prevention and early detection. To our knowledge, studies specifically investigating the link between ED risk level, PA, and psychological risk factors in a community sample of adolescents are needed. The aim of this study is to assess the relation between ED risk level, exercise (type, frequency, and motivations for engaging in exercise), and psychological factors based on the Jacobi risk factor model. We hypothesize that a high risk of ED will be associated with the practice of high-caloric-cost PA, motivations oriented toward weight and shape control, and psychological disturbances. Method: An online survey intended for students was sent to several middle schools and colleges in northwest France. This survey combined several questionnaires: the Eating Attitude Test-26, assessing ED risk; the Exercise Motivation Inventory–2, assessing motivations toward PA; the Hospital Anxiety and Depression Scale, assessing anxiety and depression; the Contour Drawing Rating Scale and the Body Esteem Scale, assessing body dissatisfaction; the Rosenberg Self-Esteem Scale, assessing self-esteem; the Exercise Dependence Scale-Revised, assessing PA dependence; the Multidimensional Assessment of Interoceptive Awareness, assessing interoceptive awareness; and the Frost Multidimensional Perfectionism Scale, assessing perfectionism.
Machine learning analysis will be performed in order to constitute groups with a tree-based clustering method, extract risk profile(s) with a bootstrap comparison method, and predict ED risk with a decision-tree-based prediction method. Expected results: 1044 complete records have already been collected, and the survey will be closed at the end of May 2022. Records will be analyzed with a clustering method and a bootstrap method in order to reveal risk profile(s). Furthermore, a decision-tree prediction method will be used to extract an accurate predictive model of ED risk. This analysis will confirm the typical main risk factors and will give more data on presumed strong risk factors such as exercise motivations and interoceptive deficit. Furthermore, it will highlight particular risk profiles with a strong level of proof and greatly contribute to improving the early detection of ED and to a better understanding of ED risk factors.
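Of the analysis steps listed above, the bootstrap comparison is the simplest to illustrate. Below is a minimal percentile-bootstrap confidence interval in plain Python; the hypothetical EAT-26 scores, the resample count, and the seed are all invented for the example and do not come from the study's data.

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=3):
    """Percentile bootstrap confidence interval for statistic `stat`."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    return reps[int(n_boot * alpha / 2)], reps[int(n_boot * (1 - alpha / 2)) - 1]

def mean(xs):
    return sum(xs) / len(xs)

if __name__ == "__main__":
    eat26_scores = [12, 15, 9, 20, 14, 11, 18, 13, 16, 10]  # hypothetical
    low, high = bootstrap_ci(eat26_scores, mean)
    print(round(low, 1), round(high, 1))
```

The same resampling idea extends to comparing statistics between clustered groups, which is how a bootstrap comparison can support risk-profile extraction.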

Keywords: eating disorders, risk factors, physical activity, machine learning

Procedia PDF Downloads 70
28652 Modelling of Relocation and Battery Autonomy Problem on Electric Cars Sharing Dynamic by Using Discrete Event Simulation and Petri Net

Authors: Taha Benarbia, Kay W. Axhausen, Anugrah Ilahi

Abstract:

Electric car sharing systems, as an ecological mode of transportation, are increasing around the world. The complexity of managing electric car sharing systems, especially one-way trips and battery autonomy, has a direct influence on the supply and demand of the system. One must be able to precisely model the demand and supply of these systems to better operate electric car sharing and estimate its effect on mobility management and the accessibility that it provides in urban areas. In this context, our work focuses on developing a performance optimization model of the system based on discrete event simulation and stochastic Petri nets. The objective is to search for optimal decisions and management parameters of the system in order to fulfil demand as well as possible while minimizing undesirable situations. In this paper, we present a new model of electric car sharing with relocation based on a monitoring system. The proposed approach also helps to specify the influence of the battery charging level on the behaviour of the system as an important decision parameter of this complex and dynamic system.
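A discrete event simulation of one-way trips with a battery-autonomy rule can be sketched with nothing more than an event queue. The toy model below (stations, request/return events, a 20% charge threshold) uses invented arrival rates, drain rates, and thresholds, and is far simpler than the stochastic Petri net model the paper develops; it only shows how such a simulation exposes unmet demand.

```python
import heapq
import random

def simulate(n_stations=3, cars_per_station=2, n_requests=60, seed=7):
    """Toy DES of one-way e-car sharing; returns (served, lost) requests."""
    rng = random.Random(seed)
    cars = {s: [100.0] * cars_per_station for s in range(n_stations)}  # charge %
    events = [(rng.uniform(0, 100), "request", rng.randrange(n_stations), 0.0)
              for _ in range(n_requests)]
    heapq.heapify(events)
    served = lost = 0
    while events:
        t, kind, station, charge = heapq.heappop(events)
        if kind == "return":
            cars[station].append(charge)          # car becomes available again
            continue
        usable = [c for c in cars[station] if c >= 20.0]  # battery-autonomy rule
        if not usable:
            lost += 1                             # unmet demand
            continue
        c = max(usable)
        cars[station].remove(c)
        served += 1
        dest = rng.randrange(n_stations)          # one-way trip may unbalance fleet
        heapq.heappush(events, (t + rng.uniform(0.2, 1.0), "return",
                                dest, c - rng.uniform(5.0, 15.0)))
    return served, lost

if __name__ == "__main__":
    served, lost = simulate()
    print(served, lost)
```

Relocation and charging policies would be added as further event types; the lost-request count is exactly the kind of "undesirable situation" the optimization seeks to minimize.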

Keywords: electric car-sharing systems, smart mobility, Petri nets modelling, discrete event simulation

Procedia PDF Downloads 164
28651 Awareness and Utilization of Social Network Tools among Agricultural Science Students in Colleges of Education in Ogun State, Nigeria

Authors: Adebowale Olukayode Efunnowo

Abstract:

This study was carried out to assess the awareness and utilization of Social Network Tools (SNTs) among agricultural science students in Colleges of Education in Ogun State, Nigeria. Simple random sampling techniques were used to select 280 respondents from the study area. Descriptive statistics were used to address the objectives, while Pearson Product Moment Correlation (PPMC) was used to test the hypothesis. The results showed that the majority (71.8%) of the respondents were single, with a mean age of 20 years. Almost all (95.7%) of the respondents were aware of Facebook and 2go as SNTs, while 85.0% were not aware of Blackplanet, LinkedIn, MyHeritage, and Bebo. Many (41.1%) of the respondents held the view that using SNTs can enhance extensive literature surveys, increase internet browsing potential, promote teaching proficiency, and provide updates on research outcomes. However, 51.4% of the respondents perceived SNT usage as meant for lecturers/adults only, while 16.1% considered it mainly used by internet fraudsters. Findings revealed that about 50.0% of the respondents browsed Facebook and 2go daily, while more than 80% used Blackplanet, MyHeritage, Skyrock, Bebo, LinkedIn, and My YearBook as the need arose. Major constraints to the awareness and utilization of SNTs were the high cost and poor quality of ICT facilities (77.1%), epileptic power supply (75.0%), inadequate telecommunication infrastructure (71.1%), low technical know-how (62.9%), and inadequate computer knowledge (61.1%). The result of the PPMC analysis showed an inverse relationship between constraints and utilization of SNTs at p < 0.05. It can be concluded that constraints affect the efficient and effective utilization of SNTs in the study area.
It is hereby recommended that the management of colleges of education and agricultural institutes provide good internet connectivity, computer facilities, and an alternative power supply in order to increase the awareness and utilization of SNTs among students.
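The PPMC test used in this study is the ordinary Pearson correlation. A minimal computation with made-up constraint and utilization scores (not the study's data) illustrates the inverse relationship it detects:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    # Made-up scores: higher perceived constraints, lower utilization.
    constraint_scores = [3, 5, 6, 8, 9]
    utilization_scores = [9, 7, 6, 4, 2]
    print(pearson_r(constraint_scores, utilization_scores))   # strongly negative
```

A coefficient near −1, as here, is what "inverse relationship" means; the study's significance claim additionally requires a test of r against its sampling distribution.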

Keywords: awareness, utilization, social network tools, constraints, students

Procedia PDF Downloads 337
28650 K-12 Students’ Digital Life: Activities and Attitudes

Authors: Meital Amzalag, Sharon Hardof-Jaffe

Abstract:

In the last few decades, children and youth have been immersed in digital technologies, and recent studies have explored the implications of technology use in their leisure and learning activities. Educators face an essential need to utilize technologies and implement them in the curriculum. To do that, educators need to understand how young people use digital technology. This study aims to explore K-12 students' digital lives from their point of view: to reveal their digital activities, age and gender differences with respect to digital activities, and the students' attitudes towards technologies in learning. The study approach is quantitative and includes 354 students aged 6-16 from three schools in Israel. The online questionnaire was based on self-reports and consists of four parts: digital activities, comprising leisure-time activities (such as social networks and gaming types), search activities (information types and platforms), and digital application use (e.g., calendar, notes); digital skills (requisite digital platform skills such as evaluation and creativity); social and emotional aspects of digital use (conducting digital activities alone and with friends, and feelings and emotions during digital use, such as happiness or bullying); and attitudes towards digital integration in learning. An academic ethics board approved the study. The main findings reveal the most popular K-12 digital activities: navigating social network sites, watching TV, playing mobile games, seeking information on the internet, and playing computer games. In addition, the findings reveal age differences in digital activities, such as significant differences in the use of social network sites. Moreover, the findings raise gender differences: girls use social network sites more, while boys use digital games, which are characterized by high complexity and challenges, more. Additionally, we found positive attitudes towards technology integration in school.
Students perceive technology as enhancing creativity, promoting active learning, encouraging self-learning, and helping students with learning difficulties. The presentation will provide an up-to-date, accurate picture of the use of various digital technologies by K-12 students. In addition, it will discuss the learning potential of such use and how to implement digital technologies in the curriculum. Acknowledgments: This study is part of a broader study about K-12 digital life in Israel and is supported by Mofet - the Israel Institute for Teachers' Development.

Keywords: technology and learning, K-12, digital life, gender differences

Procedia PDF Downloads 116
28649 Locating Potential Site for Biomass Power Plant Development in Central Luzon Philippines Using GIS-Based Suitability Analysis

Authors: Bryan M. Baltazar, Marjorie V. Remolador, Klathea H. Sevilla, Imee Saladaga, Loureal Camille Inocencio, Ma. Rosario Concepcion O. Ang

Abstract:

Biomass energy is a traditional source of sustainable energy which has been widely used in developing countries. The Philippines, specifically Central Luzon, has an abundant source of biomass and could hence supply abundant agricultural residues (rice husks) as feedstock for a biomass power plant. However, locating a potential site for biomass development is a complex process involving different factors, such as physical, environmental, socio-economic, and risk factors, which are usually diverse and conflicting. Moreover, biomass distribution is highly dispersed geographically. Thus, this study develops an integrated method combining Geographical Information Systems (GIS) with methods for energy planning, Multi-Criteria Decision Analysis (MCDA) and the Analytical Hierarchy Process (AHP), for locating a suitable site for biomass power plant development in Central Luzon, Philippines, by considering different constraints and factors. Using MCDA, a three-level hierarchy of factors and constraints was produced, with the corresponding weights determined by experts using AHP. Applying the results, a suitability map for biomass power plant development in Central Luzon was generated. It showed that the central part of the region has the highest potential for biomass power plant development because of the characteristics of the area: an abundance of rice fields, generally flat land surfaces, accessible roads and grid networks, and low risk of flooding and landslide. This study recommends the use of higher-accuracy resource maps and further analysis in selecting the optimum site for biomass power plant development that would account for the cost and transportation of biomass residues.
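The expert weighting step can be illustrated with the row geometric-mean approximation to AHP priority weights. The 3x3 pairwise comparison matrix below (biomass supply vs. road/grid accessibility vs. hazard risk) is hypothetical, standing in for the expert judgments the study elicited; a full AHP would also compute a consistency ratio before accepting the weights.

```python
from math import prod

def ahp_weights(matrix):
    """Row geometric-mean approximation of AHP priority weights."""
    n = len(matrix)
    gms = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

if __name__ == "__main__":
    # Hypothetical pairwise comparisons on Saaty's 1-9 scale:
    # biomass supply vs. road/grid accessibility vs. hazard risk.
    pairwise = [
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 3.0],
        [1 / 5, 1 / 3, 1.0],
    ]
    print([round(w, 3) for w in ahp_weights(pairwise)])
```

The resulting weights would then multiply the normalized factor layers in GIS map algebra to produce the suitability surface.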

Keywords: analytic hierarchy process, biomass energy, GIS, multi-criteria decision analysis, site suitability analysis

Procedia PDF Downloads 406
28648 A Genetic Algorithm Based Sleep-Wake up Protocol for Area Coverage in WSNs

Authors: Seyed Mahdi Jameii, Arash Nikdel, Seyed Mohsen Jameii

Abstract:

Energy efficiency is an important issue in the field of Wireless Sensor Networks (WSNs), so minimizing the energy consumption in this kind of network is an essential consideration. A sleep/wake scheduling mechanism is an efficient approach to handling this issue. In this paper, we propose a Genetic Algorithm-based Sleep-Wake up Area Coverage protocol called GA-SWAC. The proposed protocol puts the minimum number of nodes in active mode and adjusts the sensing radius of each active node to decrease energy consumption while maintaining the network's coverage. The proposed protocol is evaluated in simulation. The results demonstrate the efficiency of the proposed protocol in terms of coverage ratio, number of active nodes, and energy consumption.
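The coverage ratio that such a protocol must preserve can be estimated by grid sampling. The sketch below counts the fraction of sample points inside at least one active node's sensing disc; the field size, sample step, and node placements are arbitrary choices for illustration, not parameters from the paper.

```python
def coverage_ratio(active_nodes, field=(100, 100), step=5):
    """Fraction of grid sample points covered by at least one active
    node's sensing disc; active_nodes is a list of (x, y, radius)."""
    covered = total = 0
    for gx in range(0, field[0] + 1, step):
        for gy in range(0, field[1] + 1, step):
            total += 1
            if any((gx - x) ** 2 + (gy - y) ** 2 <= r * r
                   for x, y, r in active_nodes):
                covered += 1
    return covered / total

if __name__ == "__main__":
    # Four active nodes with a shared sensing radius of 30 units.
    nodes = [(25, 25, 30), (75, 25, 30), (25, 75, 30), (75, 75, 30)]
    print(round(coverage_ratio(nodes), 3))
```

In a GA-based scheduler like the one described, this ratio (or a penalized variant of it) would feed the fitness function alongside the active-node count and the chosen sensing radii.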

Keywords: wireless sensor networks, genetic algorithm, coverage, connectivity

Procedia PDF Downloads 498
28647 Resilience with Spontaneous Volunteers in Disasters-Coordination Using an It System

Authors: Leo Latasch, Mario Di Gennaro

Abstract:

Introduction: The goal of this project was to increase the resilience of the population as well as of rescue organizations and to make both quality and time-related improvements in handling crises. A helper network was created for this purpose. Methods: Social questions regarding the structure and purpose of helper networks were considered, specifically with regard to helper motivation, the level of commitment, and collaboration between the population and agencies. The exchange of information, the coordinated use of volunteers, and the distribution of available resources are ensured through defined communication and cooperation routines. Helper smartphones are also used to provide a picture of the situation on the ground. Results: The helper network was established and deployed based on the RESIBES information technology system. It consists of a service platform, a web portal, and a smartphone app. The service platform is the central element for collaboration between the various rescue organizations, as well as for persons, associations, and companies from the population offering voluntary aid. The platform was used for registering helpers and resources and then requesting and assigning them in case of a disaster. These services allow the population's resources to be organized. The service platform also allows for secure data exchange between services and external systems. Conclusions: The social and technical work priorities have allowed us to cover a full cycle of advance structural work, gaining an overview, damage management, evaluation, and feedback on experiences. This cycle allows experiences gained while handling a crisis to feed back into the cycle and improve preparations and management strategies.

Keywords: coordination, disaster, resilience, volunteers

Procedia PDF Downloads 122
28646 Synthesis and Characterization of Anti-Psychotic Drugs Based DNA Aptamers

Authors: Shringika Soni, Utkarsh Jain, Nidhi Chauhan

Abstract:

Aptamers are recently discovered artificial oligonucleotides, ~80-100 bp long, that have not only demonstrated applications in therapeutics but are also widely used in diagnostic and sensing applications to detect different biomarkers and drugs. Synthesizing aptamers against proteins or genomic templates is comparatively feasible in the laboratory, but aptamers against drugs or other chemical targets require major specification and proper optimization and validation. One has to optimize all selection, amplification, and characterization steps of the end product, which is extremely time-consuming. Therefore, we performed asymmetric PCR (polymerase chain reaction) for random oligonucleotide pool synthesis and further used the pool in Systematic Evolution of Ligands by Exponential Enrichment (SELEX) for the synthesis of anti-psychotic drug based aptamers. Anti-psychotic drugs are major tranquilizers used to control psychosis and support proper cognitive function. Despite their limited medical use, their misuse may lead to severe medical conditions such as addiction and can promote crime, with social and economic impacts. In this work, we have applied the in-vitro SELEX method for the synthesis of ssDNA aptamers against anti-psychotic drugs (in this case, the 'target'). The study was performed in three stages. The first stage included the synthesis of a random oligonucleotide pool via asymmetric PCR, whose end product was analyzed by electrophoresis and purified for further stages. The purified oligonucleotide pool was incubated in SELEX buffer, and partitioning was performed in the next stage to obtain target-specific aptamers. The isolated oligonucleotides were characterized and quantified after each round of partitioning, and significant results were obtained. After the repetitive partitioning and amplification steps of target-specific oligonucleotides, the final stage included sequencing of the end product.
We can confirm the specific sequence for anti-psychotic drugs, which will be further used in diagnostic applications in clinical and forensic settings.

Keywords: anti-psychotic drugs, aptamer, biosensor, ssDNA, SELEX

Procedia PDF Downloads 120