Search results for: Atomic data
23940 D-Lysine Assisted 1-Ethyl-3-(3-Dimethylaminopropyl)Carbodiimide / N-Hydroxy Succinimide Initiated Crosslinked Collagen Scaffold with Controlled Structural and Surface Properties
Authors: G. Krishnamoorthy, S. Anandhakumar
Abstract:
The effect of D-Lysine (D-Lys) on collagen cross-linking initiated by 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide (EDC)/N-hydroxysuccinimide (NHS) is evaluated using experimental and modelling tools. The results for the Coll-D-Lys-EDC/NHS scaffold indicate an increase in tensile strength (TS), percentage of elongation (% E), and denaturation temperature (Td), and a decrease in the decomposition rate compared to L-Lys-EDC/NHS. Scanning electron microscopy (SEM) and atomic force microscopy (AFM) analyses revealed a well-ordered scaffold structure with properly oriented and well-aligned fibers. D-Lys stabilizes the scaffold against collagenase degradation more effectively than L-Lys. The cell assay showed more than 98% fibroblast (NIH3T3) viability and improved cell adhesion and protein adsorption after 72 h of culture compared with the native scaffold. Cell attachment after 74 h was robust, with cytoskeletal analysis showing that the attached cells were aligned along the fibers, assuming a spindle-shaped appearance; gene expression analyses revealed no apparent alterations in mRNA levels, and cell proliferation was not adversely affected. D-Lysine (D-Lys) plays a pivotal role in the self-assembly and conformation of collagen fibrils. The D-Lys-assisted, EDC/NHS-initiated cross-linking induces the formation of a carboxamide through activation of the side-chain -COOH group, followed by aminolysis of the O-isoacylurea intermediates by the -NH2 groups, which are directly joined via an isopeptide bond. This leads to the formation of intra- and inter-helical cross-links. Modelling studies indicated that D-Lys binds to the collagen-like peptide (CLP) through multiple H-bonding and hydrophobic interactions. Orientational changes of collagenase on CLP-D-Lys are observed, which may decrease the peptide's accessibility to degradation and stabilize CLP against enzymatic action.
D-Lys has the lowest binding energy and improves fibrillar assembly and staggered alignment without undesired structural stiffness or aggregation. The proteolytic machinery is less well equipped to deal with the Coll-D-Lys scaffold than with Coll-L-Lys. The information derived from the present study could help in designing collagenolytically stable, heterochiral collagen-based scaffolds for biomedical applications.
Keywords: collagen, collagenase, collagen like peptide, D-lysine, heterochiral collagen scaffold
Procedia PDF Downloads 392
23939 Plasma Selenium Concentration and Polymorphism of Selenoprotein and Prostate Cancer
Authors: Yu-Mei Hsueh, Cheng-Shiuan Tsai, Chao-Yuan Huang
Abstract:
Prostate cancer (PC) is a malignant tumor originating in the prostate and the second most common cancer in men worldwide. The incidence of PC in Asian countries has continued to rise over the past few decades. As an antioxidant, selenium can slow prostate cancer tumor progression, but the association between plasma selenium levels and the risk of aggressive prostate cancer may be modified by different selenoprotein genotypes. The aim of this study is to determine the relationship between plasma selenium, selenoprotein polymorphisms, urinary total arsenic, and prostate cancer. Two hundred ninety-five pathologically confirmed PC cases and 295 cancer-free controls, individually matched to cases by age (± 5 years), were recruited from the Department of Urology of National Taiwan University Hospital, Taipei Municipal Wan Fang Hospital, and Taipei Medical University Hospital. Personal interviews and collection of urine and blood specimens were conducted by well-trained interviewers after participants' informed consent was obtained. Plasma selenium was measured by inductively coupled plasma mass spectrometry. Urinary arsenic concentration was detected using high-performance liquid chromatography linked to a hydride generator and atomic absorption spectrometry. The polymorphisms of SEPP1 rs3797310 and SEP15 rs5859 were determined using the polymerase chain reaction-restriction fragment length polymorphism method. Higher plasma selenium was associated with a lower odds ratio (OR) of PC, with a dose-response relationship. Prostate cancer patients with high plasma selenium had low tumor stage and grade. Participants carrying the SEPP1 rs3797310 CT+TT genotype, compared to those with the CC genotype, had a lower OR of PC in the crude model; this relationship disappeared after confounders were adjusted. Prostate cancer patients with high urinary total arsenic concentration had high tumor stage and grade.
Urinary total arsenic concentration was significantly and positively related to plasma selenium and prostate-specific antigen concentrations. Participants with lower plasma selenium and higher urinary total arsenic concentrations, compared to those with higher plasma selenium and lower urinary total arsenic concentrations, had a higher OR of PC, with a dose-response relationship.
Keywords: prostate cancer, plasma selenium concentration, urinary arsenic concentration, prostate specific antigen
Procedia PDF Downloads 472
23938 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning
Authors: Arun Sanjel, Greg Speegle
Abstract:
Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing them. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can be time-consuming and error-prone. Automated conversion of a sequential program to a DISC program would consequently improve productivity significantly. However, synthesizing a user's intended program from an input specification is complex, with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique to identify sequential components and translate them to equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.
Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC
Procedia PDF Downloads 107
23937 A Novel Technological Approach to Maintaining the Cold Chain during Transportation
Authors: Philip J. Purnell
Abstract:
Innovators propose to use the Internet of Things to solve the problem of maintaining the cold chain during the transport of biopharmaceutical products. Sending a data logger with refrigerated goods only informs the recipient, on arrival, that the goods either breached the cold chain and are therefore potentially spoiled, or did not breach it and are therefore assumed to be in good condition. Connecting the data logger to the Internet of Things means that the supply chain manager is informed in real time of the exact location and the precise temperature of the material at any point on earth. Through a simple online interface, the supply chain manager can watch the progress of their material on a Google map together with accurate and, crucially, real-time temperature readings. The data logger will also send alarms to the supply chain manager if a cold chain breach becomes imminent, allowing them time to contact the transporter and restore the cold chain before the material is affected. This development is expected to save billions of dollars in wasted biologics that currently arrive either spoiled or in an unreliable condition.
Keywords: internet of things, cold chain, data logger, transportation
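The "alarm before the breach" idea above amounts to extrapolating the recent temperature trend and warning when the limit would be crossed within a look-ahead window. A minimal sketch follows; the 8 °C limit, the 30-minute horizon, and the `Reading` record are all hypothetical, not part of the product described:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    minutes: float   # time since shipment start
    temp_c: float    # logged temperature in degrees Celsius

def breach_imminent(readings, limit_c=8.0, horizon_min=30.0):
    """Extrapolate the warming rate from the last two readings and
    alarm if the limit would be crossed within the horizon."""
    if len(readings) < 2:
        return False
    a, b = readings[-2], readings[-1]
    rate = (b.temp_c - a.temp_c) / (b.minutes - a.minutes)  # deg C / min
    if rate <= 0:
        return False  # cooling or stable: no imminent breach
    minutes_to_limit = (limit_c - b.temp_c) / rate
    return minutes_to_limit <= horizon_min

log = [Reading(0, 4.0), Reading(10, 4.6), Reading(20, 5.8)]
print(breach_imminent(log))  # warming fast enough to breach within 30 min
```

A real deployment would of course run this server-side against the streamed IoT readings rather than on a local list.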
Procedia PDF Downloads 442
23936 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana
Authors: Lidon Lashley
Abstract:
This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but remains firmly held in the tenets of the Medical Model of Disability, which shapes the experiences of children with Special Education Needs and/or Disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study. Qualitative data were gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's postmodern approach to grounded theory analysis, called situational analysis. The data suggest that it is possible, though challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of the social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms.
Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booth and Ainscow's index
Procedia PDF Downloads 162
23935 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes
Authors: Ritwik Dutta, Marylin Wolf
Abstract:
This paper describes the trade-offs and the from-scratch design of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed automatically via sensor APIs, or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on the GitHub repository system, the project is free to use for academic purposes of learning and experimenting, or for practical purposes by building on it.
Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver
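The daily-snapshot step described above (metrics formatted against target goals) might look roughly like the following. In the actual system the `profile` and `day` documents would come from a PyMongo query against the Mongo database; here they are hard-coded, hypothetical JSON-style documents:

```python
# Hypothetical user profile and one day of health data,
# shaped like the JSON documents the abstract describes.
profile = {
    "user": "alice",
    "goals": {"steps": 8000, "sleep_hours": 8.0},
}
day = {"date": "2021-03-01", "steps": 9100, "sleep_hours": 6.5}

def daily_snapshot(profile, day):
    """Format one day's metrics against the user's target goals."""
    lines = [f"Snapshot for {profile['user']} on {day['date']}"]
    for metric, target in profile["goals"].items():
        value = day.get(metric)
        status = "met" if value is not None and value >= target else "below"
        lines.append(f"  {metric}: {value} / {target} ({status})")
    return "\n".join(lines)

print(daily_snapshot(profile, day))
```

Because goals are stored per user, adding a custom metric is just another key in `goals` plus the matching field in the day's document.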
Procedia PDF Downloads 390
23934 Building an Integrated Relational Database from Swiss Nutrition National Survey and Swiss Health Datasets for Data Mining Purposes
Authors: Ilona Mewes, Helena Jenzer, Farshideh Einsele
Abstract:
Objective: The objective of the study was to integrate two large databases, from the Swiss nutrition national survey (menuCH) and the Swiss health national survey 2012, for data mining purposes. Each database includes demographic base data. An integrated Swiss database is built to later discover critical food consumption patterns linked with lifestyle diseases known to be strongly tied to food consumption. Design: The Swiss nutrition national survey (menuCH), with approx. 2000 respondents from two different surveys, one by phone and the other by questionnaire, along with the Swiss health national survey 2012, with 21500 respondents, were pre-processed, cleaned, and finally integrated into a unique relational database. Results: The result of this study is an integrated relational database from the Swiss nutritional and health databases.
Keywords: health informatics, data mining, nutritional and health databases, nutritional and chronic diseases databases
Procedia PDF Downloads 112
23933 Clustering Performance Analysis Using New Correlation-Based Cluster Validity Indices
Authors: Nathakhun Wiroonsri
Abstract:
There are various cluster validity measures used for evaluating clustering results. One of the main objectives of using these measures is to seek the optimal, unknown number of clusters. Some measures work well for clusters with different densities, sizes, and shapes. Yet, one weakness those validity measures share is that they sometimes provide only one clear optimal number of clusters. That number is actually unknown, and there might be more than one potential sub-optimal option that a user may wish to choose based on different applications. We develop two new cluster validity indices based on the correlation between the actual distance between a pair of data points and the distance between the centroids of the clusters in which the two points are located. Our proposed indices consistently yield several peaks at different numbers of clusters, which overcomes the weakness stated previously. Furthermore, the introduced correlation can also be used for evaluating the quality of a selected clustering result. Several experiments in different scenarios, including the well-known iris data set and a real-world marketing application, have been conducted to compare the proposed validity indices with several well-known ones.
Keywords: clustering algorithm, cluster validity measure, correlation, data partitions, iris data set, marketing, pattern recognition
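The core correlation described above can be sketched directly: for every pair of points, compare their actual distance with the distance between the centroids of the clusters they belong to. This is only the underlying quantity, not the authors' full index definitions, and the 2-D toy data are invented:

```python
import math

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def correlation_validity(points, labels):
    """Correlate actual pairwise distances with the distances between
    the centroids of the clusters the two points are located in."""
    centroids = {}
    for c in set(labels):
        members = [p for p, l in zip(points, labels) if l == c]
        centroids[c] = (sum(p[0] for p in members) / len(members),
                        sum(p[1] for p in members) / len(members))
    actual, centroidal = [], []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            actual.append(dist(points[i], points[j]))
            centroidal.append(dist(centroids[labels[i]], centroids[labels[j]]))
    return pearson(actual, centroidal)

pts = [(0, 0), (0, 1), (10, 10), (10, 11)]
print(correlation_validity(pts, [0, 0, 1, 1]))  # near 1 for a good partition
```

A good partition makes the centroid distance a faithful proxy for the actual distance, so the correlation approaches 1; a bad partition breaks that proxy and the correlation drops.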
Procedia PDF Downloads 103
23932 Analyzing Competitive Advantage of Internet of Things and Data Analytics in Smart City Context
Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue
Abstract:
The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people’s behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives as part of the normal design, construction, and operation of cities provides a unique opportunity to improve connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked together, IoT solutions can only create value if the data generated by the IoT devices are analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today’s marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise them, and which data to use to differentiate themselves from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company’s competitive advantage through smart city solutions.
The results of the research provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges the factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of them, can create a competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.
Keywords: internet of things, data analytics, smart cities, competitive advantage
Procedia PDF Downloads 94
23931 Social Data-Based Users Profiles' Enrichment
Authors: Amel Hannech, Mehdi Adda, Hamid Mcheick
Abstract:
In this paper, we propose a generic model of the user profile integrating several elements that may positively impact the search process. We exploit the classical behavior of users and integrate a process that delimits their search activities into several search sessions enriched with contextual and temporal information, which reflects the current interests of these users in every period of time and allows data freshness to be inferred. We argue that the annotation of resources gives more transparency on users' needs. It also strengthens social links among resources and users, and can thus increase the scope of the user profile. Based on this idea, we integrate the social tagging practice in order to exploit users' social behavior to enrich their profiles. These profiles are then integrated into a recommendation system in order to predict items of personal interest to users, assisting them in their searches and further enriching their profiles. Through this recommendation, we provide users with new search experiences.
Keywords: user profiles, topical ontology, contextual information, folksonomies, tags' clusters, data freshness, association rules, data recommendation
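A toy illustration of tag-based, profile-driven recommendation in the spirit described above: resources are scored by how strongly their tags overlap the tags accumulated in the user's recent sessions. The tag weights, resources, and scoring rule are invented for the example and are not the authors' model:

```python
# Hypothetical profile: tag -> frequency across recent search sessions
user_tags = {"python": 3, "databases": 2, "gis": 1}

# Hypothetical annotated resources: resource id -> set of social tags
resources = {
    "doc1": {"python", "databases"},
    "doc2": {"gis", "maps"},
    "doc3": {"cooking"},
}

def recommend(user_tags, resources, top_n=2):
    """Rank resources by total weight of shared tags; drop zero-overlap items."""
    scores = {
        rid: sum(user_tags.get(t, 0) for t in tags)
        for rid, tags in resources.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [r for r in ranked if scores[r] > 0][:top_n]

print(recommend(user_tags, resources))  # doc1 outranks doc2; doc3 is filtered out
```

Weighting tags by session recency, as the profile model suggests, would only change how `user_tags` is built, not the scoring loop.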
Procedia PDF Downloads 265
23930 Impact of External Temperature on the Speleothem Growth in the Moravian Karst
Authors: Frantisek Odvarka
Abstract:
Based on data from the Moravian Karst, the influence of selected meteorological factors on calcite speleothem growth was evaluated. External temperature was determined to be one of the main factors influencing speleothem growth in the Moravian Karst. This factor significantly influences the CO₂ concentration in the soil/epikarst and in the cave atmosphere, and significantly contributes to changes in the CO₂ partial pressure difference between the soil/epikarst and the cave atmosphere, which determines the drip water supersaturation with respect to calcite and the quantity of calcite precipitated in the Moravian Karst cave environment. External air temperatures and cave air temperatures were measured using a COMET S3120 data logger, which can measure temperatures in the range from -30 to +80 °C with an accuracy of ± 0.4 °C. CO₂ concentrations in the cave and soils were measured with an FT A600 CO₂H Ahlborn probe (range 0 to 10,000 ppmv, accuracy 1 ppmv), which was connected to an ALMEMO 2290-4 V5 Ahlborn data logger. The soil temperature was measured with an FHA646E1 Ahlborn probe (temperature range -20 to 70 °C, accuracy ± 0.4 °C) connected to an ALMEMO 2290-4 V5 Ahlborn data logger. The airflow velocities into and out of the cave were monitored by an FVA395 TH4 thermo-anemometer (speed range 0.05 to 2 m s⁻¹, accuracy ± 0.04 m s⁻¹), which was connected to an ALMEMO 2590-4 V5 Ahlborn data logger for recording. The flow was measured at the lower and upper entrances of the Imperial Cave. The data were analyzed in MS Office Excel 2019 and PHREEQC.
Keywords: speleothem growth, carbon dioxide partial pressure, Moravian Karst, external temperature
Procedia PDF Downloads 144
23929 Using Data Mining in Automotive Safety
Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler
Abstract:
Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupant in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.
Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact
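A missing-data step of the general kind mentioned above might, for illustration, drop parameters that are missing too often and median-impute the remaining gaps. The threshold, parameter names, and values below are hypothetical and not taken from the study:

```python
def handle_missing(rows, max_missing=0.5):
    """Drop parameters missing in more than max_missing of the tests,
    then fill remaining gaps with the parameter's median value."""
    cols = rows[0].keys()
    n = len(rows)
    kept = [c for c in cols
            if sum(r[c] is None for r in rows) / n <= max_missing]
    medians = {}
    for c in kept:
        vals = sorted(r[c] for r in rows if r[c] is not None)
        medians[c] = vals[len(vals) // 2]
    return [{c: (r[c] if r[c] is not None else medians[c]) for c in kept}
            for r in rows]

# Hypothetical sled-test records with gaps (None = not recorded).
tests = [
    {"head_accel": 52.0, "chest_defl": None, "belt_load": 4.1, "knee_load": None},
    {"head_accel": 48.0, "chest_defl": 31.0, "belt_load": None, "knee_load": None},
    {"head_accel": None, "chest_defl": 29.0, "belt_load": 4.4, "knee_load": 3.0},
]
print(handle_missing(tests))  # knee_load dropped (missing in 2 of 3 tests)
```

Validation on real data sets, as the abstract describes, would then compare imputed values against held-out measurements before trusting the procedure.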
Procedia PDF Downloads 382
23928 Comparative Study of Greenhouse Locations through Satellite Images and Geographic Information System: Methodological Evaluation in Venezuela
Authors: Maria A. Castillo H., Andrés R. Leandro C.
Abstract:
During the last decades, agricultural productivity in Latin America has increased with precision agriculture and more efficient agricultural technologies. The use of automated systems, satellite images, geographic information systems, tools for data analysis, and artificial intelligence has contributed to making more effective strategic decisions. Twenty years ago, the state of Mérida, located in the Venezuelan Andes, reported the largest area covered by greenhouses in the country, where certified seeds of potatoes, vegetables, ornamentals, and flowers were produced for export and for consumption in the central region of the country. In recent years, it is estimated that production under greenhouses has changed and the area covered has decreased due to different factors, but there are few historical statistical data of sufficient quantity and quality to support this estimate or to be used for analysis and decision making. The objective of this study is to compare data collected about the geoposition, use, and covered areas of the greenhouses in 2007 with data available in 2021, as support for the analysis of the current situation of horticultural production in the main municipalities of the state of Mérida. The document presents the development of the work in its diagnosis, integration of geographic coordinates into GIS, and data analysis phases. As a result, an evaluation of the process is made, a dashboard is presented with the most relevant data along with the geographical coordinates integrated into GIS, and an analysis of the information obtained is made. Finally, some recommendations for action are added, and work that would expand the information obtained and its geographical traceability over time is proposed.
This study helps provide greater certainty in the supporting data for the evaluation of social, environmental, and economic sustainability indicators, and supports better decisions in line with the sustainable development goals in the area under review. At the same time, the methodology provides improvements to the agricultural data collection process that can be extended to other study areas and crops.
Keywords: greenhouses, geographic information system, protected agriculture, data analysis, Venezuela
Procedia PDF Downloads 93
23927 Optical Characterization of Transition Metal Ion Doped ZnO Microspheres Synthesized via Laser Ablation in Air
Authors: Parvathy Anitha, Nilesh J. Vasa, M. S. Ramachandra Rao
Abstract:
ZnO is a semiconducting material with a direct wide band gap of 3.37 eV and a large exciton binding energy of 60 meV at room temperature. Microspheres with high sphericity and symmetry exhibit unique functionalities, which make them excellent omnidirectional optical resonators. Hence there is growing interest in the fabrication of single-crystalline semiconductor microspheres, especially magnetic ZnO microspheres, as ZnO is a promising material for semiconductor device applications. Also, ZnO is non-toxic and biocompatible, implying it is a potential material for biomedical applications. Room-temperature photoluminescence (PL) spectra of the fabricated ZnO microspheres were measured at an excitation wavelength of 325 nm. The ultraviolet (UV) luminescence observed is attributed to the room-temperature free-exciton-related near-band-edge (NBE) emission in ZnO. Besides the NBE luminescence, weak and broad visible luminescence (~560 nm) was also observed. This broad emission band in the visible range is associated with oxygen vacancies related to structural defects. In transition metal (TM) ion-doped ZnO, emissions from the 3d levels of TM ions will modify the inherent characteristic emissions of ZnO. A micron-sized ZnO crystal generally has a wurtzite structure with a natural hexagonal cross section, which can serve as a whispering gallery mode (WGM) lasing microcavity due to its high refractive index (~2.2). But hexagonal cavities suffer more optical loss at their corners than spherical structures; hence spheres may be a better candidate to achieve effective light confinement. In our study, highly smooth spherical microparticles with different diameters, ranging from ~4 to 6 μm, were grown on different substrates. SEM (scanning electron microscopy) and AFM (atomic force microscopy) images show the presence of uniform, smooth-surfaced spheres.
Raman scattering measurements from the fabricated samples at 488 nm light excitation provide convincing support for the wurtzite structure of the prepared ZnO microspheres. WGM lasing studies of TM-doped ZnO microparticles are in progress.
Keywords: laser ablation, microcavity, photoluminescence, ZnO microsphere
Procedia PDF Downloads 217
23926 Modelling Consistency and Change of Social Attitudes in 7 Years of Longitudinal Data
Authors: Paul Campbell, Nicholas Biddle
Abstract:
There is a complex, endogenous relationship between individual circumstances, attitudes, and behaviour. This study uses longitudinal panel data to assess changes in social and political attitudes over a 7-year period. Attitudes are captured with the question 'What is the most important issue facing Australia today?', collected at multiple time points in a longitudinal survey of 2200 Australians. Consistency of attitudes, and the factors predicting change over time, are assessed. The consistency of responses has methodological implications for data collection, specifically how often such questions ought to be asked of a population. When change in attitude is observed, this study assesses the extent to which individual demographic characteristics, personality traits, and broader societal events predict that change.
Keywords: attitudes, longitudinal survey analysis, personality, social values
Procedia PDF Downloads 133
23925 Data Protection and Regulation Compliance on Handling Physical Child Abuse Scenarios: A Scoping Review
Authors: Ana Mafalda Silva, Rebeca Fontes, Ana Paula Vaz, Carla Carreira, Ana Corte-Real
Abstract:
Decades of research on the topic of interpersonal violence against minors highlight five main conclusions: 1) it causes harmful effects on children's development and health; 2) it is prevalent; 3) it violates children's rights; 4) it can be prevented; and 5) parents are the main aggressors. The child abuse scenario is identified through clinical observation, administrative data, and self-reports. The most used instruments are self-reports; however, there are no valid and reliable self-report instruments for minors, and existing ones consist of a retrospective interpretation of the situation by the victim, already in adulthood, and/or by her parents. Clinical observation and collection of information, namely from the orofacial region, are essential in the early identification of these situations. The management of medical data, as personal data, must comply with the General Data Protection Regulation (GDPR) in Europe and with the General Law of Data Protection (LGPD) in Brazil. This review aims to answer the question: in a situation of medical assistance to minors where interpersonal violence due to mistreatment is suspected, is it necessary for the guardians to provide consent for the recording and sharing of personal data, namely medical data? A scoping review was carried out based on a search of the Web of Science and PubMed search engines. Four papers and two documents from the grey literature were selected. As found, the process of identifying and signaling child abuse by the health professional, and the necessary early intervention in defense of the minor as a victim of abuse, comply with the guidelines expressed in the GDPR and LGPD.
Thus, notification in maltreatment scenarios by health professionals should be a priority, and fear or anxiety of legal repercussions should not stand in the way of collecting and processing the data necessary for the signaling procedure that safeguards and promotes the welfare of children living with abuse.
Keywords: child abuse, disease notifications, ethics, healthcare assistance
Procedia PDF Downloads 95
23924 A Generic Middleware to Instantly Sync Intensive Writes of Heterogeneous Massive Data via Internet
Authors: Haitao Yang, Zhenjiang Ruan, Fei Xu, Lanting Xia
Abstract:
Industry data centers often need to sync data changes reliably and instantly from a large number of heterogeneous, autonomous relational databases accessed via the not-so-reliable Internet, for which a practical universal sync middleware of low maintenance and operation cost is much wanted; but developing such a product and adapting it to various scenarios is a sophisticated and continuous practice. The authors have been devising, applying, and optimizing a generic sync middleware system named GSMS since 2006, holding to the principles, or advantages, that the middleware must be SyncML-compliant and transparent to data application layer logic, need not refer to implementation details of the databases synced, must not rely on the host operating systems deployed, and must be lightweight in construction and hence of low cost. A series of ultimate experiments on GSMS sync performance was conducted, with a persuasive example of a source relational database that underwent a broad range of write loads, say, from one thousand to one million intensive writes within a few minutes. The tests proved that GSMS has achieved an instant sync level of well below a fraction of a millisecond per record synced, and GSMS' smooth performance under ultimate write loads also showed it is feasible and competent.
Keywords: heterogeneous massive data, instantly sync intensive writes, Internet generic middleware design, optimization
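The reliable-sync idea above generally comes down to replaying an ordered change log against a replica while tracking a high-water mark, so that changes re-delivered over a flaky link are applied at most once. A minimal sketch follows; the record layout and table names are invented for the example and say nothing about GSMS' actual SyncML-based design:

```python
# Hypothetical change record: (sequence, table, primary_key, op, row)
source_log = [
    (1, "orders", 101, "insert", {"id": 101, "qty": 2}),
    (2, "orders", 101, "update", {"id": 101, "qty": 3}),
    (3, "orders", 102, "insert", {"id": 102, "qty": 1}),
    (4, "orders", 101, "delete", None),
]

def apply_changes(target, log, last_seq):
    """Replay log entries newer than last_seq against the target;
    return the new high-water mark so a retry resumes, not repeats."""
    for seq, table, pk, op, row in log:
        if seq <= last_seq:
            continue  # already applied: safe under re-delivery
        tbl = target.setdefault(table, {})
        if op == "delete":
            tbl.pop(pk, None)
        else:
            tbl[pk] = row  # insert and update converge to "upsert"
        last_seq = seq
    return last_seq

replica = {}
mark = apply_changes(replica, source_log, 0)
print(replica, mark)  # order 101 inserted, updated, then deleted; 102 survives
```

Re-running `apply_changes(replica, source_log, mark)` is a no-op, which is what makes retries over an unreliable Internet link harmless.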
Procedia PDF Downloads 120
23923 Building Transparent Supply Chains through Digital Tracing
Authors: Penina Orenstein
Abstract:
In today’s world, particularly with COVID-19 a constant worldwide threat, organizations need greater visibility over their supply chains more than ever before, in order to find areas for improvement and greater efficiency, reduce the chances of disruption and stay competitive. The concept of supply chain mapping is one where every process and route is mapped in detail between each vendor and supplier. The simplest method of mapping involves sourcing publicly available data including news and financial information concerning relationships between suppliers. An additional layer of information would be disclosed by large, direct suppliers about their production and logistics sites. While this method has the advantage of not requiring any input from suppliers, it also doesn’t allow for much transparency beyond the first supplier tier and may generate irrelevant data—noise—that must be filtered out to find the actionable data. The primary goal of this research is to build data maps of supply chains by focusing on a layered approach. Using these maps, the secondary goal is to address the question as to whether the supply chain is re-engineered to make improvements, for example, to lower the carbon footprint. Using a drill-down approach, the end result is a comprehensive map detailing the linkages between tier-one, tier-two, and tier-three suppliers super-imposed on a geographical map. The driving force behind this idea is to be able to trace individual parts to the exact site where they’re manufactured. In this way, companies can ensure sustainability practices from the production of raw materials through the finished goods. The approach allows companies to identify and anticipate vulnerabilities in their supply chain. It unlocks predictive analytics capabilities and enables them to act proactively. 
The research is particularly compelling because it unites network science theory with empirical data and presents the results in a visual, intuitive manner. Keywords: data mining, supply chain, empirical research, data mapping
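The drill-down approach described above can be sketched as a breadth-first traversal over a supplier-relationship graph (the company names and tiers below are invented for illustration): each supplier is labelled with its tier relative to the focal company, out to tier three.

```python
from collections import deque

def map_tiers(graph, focal, max_tier=3):
    """Return {company: tier} for suppliers within max_tier of the focal firm."""
    tiers = {focal: 0}
    queue = deque([focal])
    while queue:
        company = queue.popleft()
        if tiers[company] == max_tier:
            continue                       # stop drilling past the last tier
        for supplier in graph.get(company, []):
            if supplier not in tiers:      # keep the shallowest tier found
                tiers[supplier] = tiers[company] + 1
                queue.append(supplier)
    return tiers

supply_graph = {
    "OEM": ["AssemblerA", "AssemblerB"],
    "AssemblerA": ["PartMakerX"],
    "PartMakerX": ["RawMaterialCo"],
}
tiers = map_tiers(supply_graph, "OEM")
assert tiers["RawMaterialCo"] == 3
```

Superimposing the resulting tier labels on site coordinates would give the geographical map the abstract describes.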
Procedia PDF Downloads 174
23922 Synoptic Analysis of a Heavy Flood in the Province of Sistan-Va-Balouchestan: Iran January 2020
Authors: N. Pegahfar, P. Ghafarian
Abstract:
In this research, the synoptic weather conditions during the heavy flood of 10-12 January 2020 in the Sistan-va-Balouchestan Province of Iran are analyzed. To this aim, reanalysis data from the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR), NCEP Global Forecasting System (GFS) analysis data, measured data from a surface station, and satellite images from the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT) have been used for 9 to 12 January 2020. Atmospheric parameters in both the lower and the upper troposphere have been examined, including absolute vorticity, wind velocity, temperature, geopotential height, relative humidity, and precipitation. Results indicated that both lower-level and upper-level currents were strong. In addition, the transport of a large amount of moisture from the Oman Sea and the Red Sea to the south and southeast of Iran (Sistan-va-Balouchestan Province) led to the vast and unexpected precipitation and then a heavy flood. Keywords: Sistan-va-Balouchestan Province, heavy flood, synoptic, analysis data
Procedia PDF Downloads 102
23921 The Evaluation of Occupational Exposure to Chromium in Welders of Stainless Steels
Authors: L. Musak, J. Valachova, T. Vasicko, O. Osina
Abstract:
Introduction: Stainless steel resists electrochemical corrosion through passivation. Welders are heavily exposed to welding fumes of the toxic metals added to this steel. The content of chromium (Cr) in the steel was above 11.5%, with Ni and Mo from 2 to 6.5%. The aim of the study was the evaluation of occupational exposure to Cr, chromosome analysis, and assessment of the individual-susceptibility polymorphism of the gene CCND1 c.870 G>A. Materials and Methods: The exposed group consisted of 117 welders of stainless steels, with an average age of 38.43 years and an average exposure time of 7.14 years; smokers represented 40.17%. The control group consisted of 123 non-exposed workers with an average age of 39.74 years and an employment time of 16.67 years; smokers accounted for 22.76%. Analysis of Cr in blood and urine was performed by atomic absorption spectrophotometry (AAS Varian SpectrAA 30P) with electrothermal decomposition of the sample in a graphite furnace. Chromosomal aberrations (CA) were evaluated by cytogenetic analysis of peripheral blood lymphocytes, and the gene polymorphism was determined by PCR-RFLP using appropriate primers and restriction enzymes. The Mann-Whitney U-test was used for statistical analysis. Results: The mean Cr level in the exposed group was 0.095 mmol/l (min 0.019, max 0.504); no value exceeded the average normal value. The average Cr in urine was 7.9 mmol/mol creatinine (min 0.026, max 19.26). The total number of CA was 1.86% compared to 1.70% in controls (CTA-type 0.90% vs 0.80%; CSA-type 0.96% vs 0.90%). In total CA, a statistical difference was observed between smokers and non-smokers of the exposed group (S 1.57% vs. NS 2.04%, P<0.05). For the CCND1 gene polymorphism, total CA increased from the wild-type (WT) allele via the heterozygous to the VAR genotype (1.44% < 1.82% < 2.13%).
A statistically higher incidence of CTA-type aberrations in variant genotypes was observed between the exposed and control groups (1.22% vs. 0.59%, P<0.05). Discussion and conclusions: The workplace is usually a major source of exposure to harmful factors, and workers' health must be checked consistently and frequently. In assessing the risk of adverse effects of metals, it is important to consider their persistence, behavior, and bioavailability. Prolonged exposure to carcinogens may not manifest as symptoms of poisoning, but delayed effects may occur, resulting in a higher incidence of malignant tumors. Keywords: genotoxicity, chromium, stainless steels, welders
Procedia PDF Downloads 369
23920 Role of Machine Learning in Internet of Things Enabled Smart Cities
Authors: Amit Prakash Singh, Shyamli Singh, Chavi Srivastav
Abstract:
This paper presents the idea of the Internet of Things (IoT) for the infrastructure of smart cities. The Internet of Things has been visualized as a communication prototype that incorporates a myriad of digital services. The various components of the smart city shall be implemented using microprocessors, microcontrollers, and sensors for network communication and protocols. IoT-enabled systems have been devised to support the smart city vision, whose aim is to exploit currently available advanced communication technologies to support value-added services for the functioning of the city. Due to the volume, variety, and velocity of the data, analysis requires Big Data concepts. This paper presents the various techniques used to analyze big data using machine learning. Keywords: IoT, smart city, embedded systems, sustainable environment
Procedia PDF Downloads 575
23919 Machine Learning Classification of Fused Sentinel-1 and Sentinel-2 Image Data Towards Mapping Fruit Plantations in Highly Heterogeneous Landscapes
Authors: Yingisani Chabalala, Elhadi Adam, Khalid Adem Ali
Abstract:
Mapping smallholder fruit plantations using optical data is challenging due to morphological landscape heterogeneity and overlapping spectral signatures among crop types. Furthermore, cloud cover limits the use of optical sensing, especially in subtropical climates where it is persistent. This research assessed the effectiveness of Sentinel-1 (S1) and Sentinel-2 (S2) data for mapping fruit trees and co-existing land-use types by using support vector machine (SVM) and random forest (RF) classifiers independently; these classifiers were also applied to fused data from the two sensors. Feature ranks were extracted using the RF mean decrease in accuracy (MDA) and forward variable selection (FVS) to identify optimal spectral windows for classifying fruit trees. Based on RF MDA and FVS, the SVM classifier achieved relatively high classification accuracy, with overall accuracy (OA) = 91.6% and kappa coefficient = 0.91 when applied to the fused satellite data. Applying SVM to S1, S2, the S2 selected variables, and the S1S2 fusion independently produced OA = 27.64% (kappa = 0.13), OA = 87% (kappa = 0.87), OA = 69.33% (kappa = 0.69), and OA = 87.01% (kappa = 0.87), respectively. Results also indicated that the optimal spectral bands for fruit-tree mapping are green (B3) and SWIR_2 (B10) for S2, and the vertical-horizontal (VH) polarization band for S1. Including the textural metrics from the VV channel improved discrimination of crops and co-existing land-use cover types. The fusion approach proved robust and well suited for accurate smallholder fruit plantation mapping. Keywords: smallholder agriculture, fruit trees, data fusion, precision agriculture
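The fusion-plus-forward-variable-selection idea can be sketched on synthetic data (this is not the study's pipeline; the toy "bands", class shifts, and the nearest-centroid stand-in classifier are our assumptions): S1 and S2 feature columns are stacked, and FVS greedily adds the band that most improves training accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
labels = rng.integers(0, 2, n)
# four synthetic "S2 bands" (band 0 informative) and two "S1 bands" (band 5 informative)
s2 = rng.normal(0, 1, (n, 4)) + labels[:, None] * np.array([3.0, 0, 0, 0])
s1 = rng.normal(0, 1, (n, 2)) + labels[:, None] * np.array([0, 1.5])
fused = np.column_stack([s2, s1])        # S1S2 fusion: 6 bands total

def accuracy(X, y):
    """Training accuracy of a simple nearest-centroid classifier."""
    c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
    pred = (np.linalg.norm(X - c1, axis=1) <
            np.linalg.norm(X - c0, axis=1)).astype(int)
    return (pred == y).mean()

selected = []
for _ in range(2):                        # greedily pick the two best bands
    best = max((b for b in range(fused.shape[1]) if b not in selected),
               key=lambda b: accuracy(fused[:, selected + [b]], labels))
    selected.append(best)
assert selected[0] == 0                   # the strongly informative band wins first
```

The study itself ranks features with RF mean decrease in accuracy and uses SVM/RF classifiers; the sketch only illustrates the greedy selection loop over fused columns.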
Procedia PDF Downloads 54
23918 Application Research of Stilbene Crystal for the Measurement of Accelerator Neutron Sources
Authors: Zhao Kuo, Chen Liang, Zhang Zhongbing, Ruan Jinlu, He Shiyi, Xu Mengxuan
Abstract:
Stilbene (C₁₄H₁₂) is well known as one of the most useful organic scintillators for the pulse shape discrimination (PSD) technique owing to its good scintillation properties. An on-line acquisition system, built from several CAMAC standard plug-ins, NIM plug-ins, and a neutron/γ discrimination plug-in (model 2160A), and an off-line acquisition system, based on a digital oscilloscope with a high sampling rate, were developed with stilbene crystals coupled to photomultiplier tubes (PMTs) as detectors for accelerator neutron source measurements carried out at the China Institute of Atomic Energy. Pulse amplitude spectra and charge amplitude spectra were recorded in real time after good neutron/γ discrimination, whose best PSD figures of merit (FoMs) were 1.756 for the D-D accelerator neutron source and 1.393 for the D-T accelerator neutron source. The probability of neutron events among total events was 80%, and the neutron detection efficiency was 5.21% for the D-D source; the corresponding values were 50% and 1.44% for the D-T source after subtracting the scattering background observed by the on-line acquisition system. Pulse waveforms were acquired by the off-line acquisition system at random while the on-line system was working. The PSD FoMs obtained by the off-line system, after off-line waveform-digitization processing with the charge integration method on just 1000 pulses, were 2.158 for the D-D source and 1.802 for the D-T source. In addition, the neutron-event probabilities obtained by the off-line system matched those of the on-line system very well. The pulse information recorded by the off-line system can be reused to adjust the parameters or methods of PSD research and to obtain neutron charge amplitude spectra or pulse amplitude spectra through digital analysis with a limited number of pulses.
The off-line acquisition system showed equivalent or better measurement performance compared with the on-line system using a limited number of pulses, which indicates a feasible method, based on stilbene crystal detectors, for measuring pulsed neutron sources such as accelerator neutron sources that emit many neutrons in a short time. Keywords: stilbene crystal, accelerator neutron source, neutron/γ discrimination, figure-of-merit, CAMAC, waveform digitization
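The charge integration method mentioned above can be sketched as follows (the pulses, gate position, and threshold here are synthetic illustrations, not the authors' processing chain): each digitized pulse is tagged by the ratio of charge in its slow tail to its total charge; neutron pulses in stilbene carry a larger tail fraction than gamma pulses.

```python
def tail_fraction(samples, gate_start):
    """PSD parameter for one pulse: tail charge / total charge."""
    total = sum(samples)
    tail = sum(samples[gate_start:])
    return tail / total if total else 0.0

def classify(samples, gate_start=5, threshold=0.15):
    """Label a pulse by comparing its tail fraction to a threshold."""
    return "neutron" if tail_fraction(samples, gate_start) > threshold else "gamma"

gamma_pulse = [0, 5, 40, 20, 8, 3, 1, 1, 0, 0]      # fast decay, small tail
neutron_pulse = [0, 5, 40, 22, 12, 8, 6, 5, 4, 3]   # slower-decaying tail
assert classify(gamma_pulse) == "gamma"
assert classify(neutron_pulse) == "neutron"
```

In practice the gate position and threshold are tuned so that the FoM of the resulting tail-fraction distributions is maximized, which is exactly the kind of re-tuning the reusable off-line pulse records enable.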
Procedia PDF Downloads 187
23917 A Tactic for a Cosmopolitan City Comparison through a Data-Driven Approach: Case of Climate City Networking
Authors: Sombol Mokhles
Abstract:
Tackling climate change requires expanding networking opportunities between a diverse range of cities to accelerate climate action. Existing climate city networks have limitations in actively engaging "ordinary" cities in networking processes, as they encourage a few powerful cities to be followed by the many "ordinary" ones. To reimagine networking opportunities between cities beyond global cities, this paper incorporates "cosmopolitan comparison" to expand our knowledge of a diverse range of cities using a data-driven approach. Through a cosmopolitan perspective, a framework is presented for how to utilise large data sets to expand knowledge of cities beyond global cities and to reimagine existing hierarchical networking practices. The contribution of this framework extends beyond urban climate governance to other fields that strive for a more inclusive and cosmopolitan comparison attentive to the differences across cities. Keywords: cosmopolitan city comparison, data-driven approach, climate city networking, urban climate governance
Procedia PDF Downloads 111
23916 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data
Authors: K. Sathishkumar, V. Thiagarasu
Abstract:
Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world use this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians understand pathophysiological mechanisms, make diagnoses and prognoses, and choose treatment plans. DNA microarray technology has made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenge of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. Existing algorithms such as the Support Vector Machine (SVM), the K-means algorithm, and evolutionary algorithms are analyzed thoroughly to identify their advantages and limitations, and their performance is evaluated to determine the best approach.
To further improve the classification performance of the best approach in terms of accuracy, convergence behavior, and processing time, a hybrid clustering-based optimization approach is proposed. Keywords: microarray technology, gene expression data, clustering, gene selection
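The clustering step discussed above can be illustrated with a minimal K-means sketch (the two-condition expression profiles below are toy values, not microarray data): genes with similar expression vectors are grouped around the nearest centroid.

```python
import random

def kmeans(points, k, iters=20, seed=1):
    """Plain K-means: assign each point to its nearest centroid,
    then recompute centroids, for a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: sum((a - b) ** 2
                    for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        centroids = [tuple(sum(col) / len(cl) for col in zip(*cl))
                     if cl else centroids[i] for i, cl in enumerate(clusters)]
    return clusters

# two well-separated groups of gene expression profiles (2 conditions each)
genes = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (4.9, 5.2), (5.1, 4.8)]
clusters = kmeans(genes, 2)
assert sorted(len(c) for c in clusters) == [3, 3]
```

The hybrid approach the abstract proposes would wrap an optimizer around such a base clustering to improve accuracy and convergence; the sketch shows only the base algorithm.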
Procedia PDF Downloads 323
23915 A Theoretical Model for Pattern Extraction in Large Datasets
Authors: Muhammad Usman
Abstract:
Pattern extraction has been used in the past to extract hidden and interesting patterns from large datasets. Recently, these techniques have been advanced by providing multi-level mining, effective dimension reduction, and advanced evaluation and visualization support. This paper reviews the current techniques in the literature on the basis of these parameters. The review suggests that most techniques which provide multi-level mining and dimension reduction do not handle mixed-type data during the process, and that patterns are not extracted using advanced algorithms for large datasets. Moreover, pattern evaluation is not done using advanced measures suited to high-dimensional data, and techniques which provide visualization support are unable to handle a large number of rules in a small space. We present a theoretical model to handle these issues; its implementation is beyond the scope of this paper. Keywords: association rule mining, data mining, data warehouses, visualization of association rules
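The association-rule-mining primitive underlying the surveyed techniques can be sketched as one level of Apriori-style frequent itemset counting (the transactions below are toy data): itemsets of a given size are counted across transactions and kept if their support clears a threshold.

```python
from itertools import combinations

def frequent_itemsets(transactions, size, min_support):
    """Count itemsets of the given size and keep those whose support
    (fraction of transactions containing them) meets min_support."""
    counts = {}
    for t in transactions:
        for combo in combinations(sorted(t), size):
            counts[combo] = counts.get(combo, 0) + 1
    n = len(transactions)
    return {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}

baskets = [{"milk", "bread"}, {"milk", "bread", "eggs"},
           {"bread", "eggs"}, {"milk", "bread"}]
pairs = frequent_itemsets(baskets, 2, min_support=0.5)
assert pairs[("bread", "milk")] == 0.75
```

Multi-level mining, as reviewed in the paper, repeats this counting at different concept hierarchies; rule evaluation then derives measures such as confidence from these supports.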
Procedia PDF Downloads 223
23914 Synthesis of Deformed Nuclei 260Rf, 261Rf and 262Rf in the Decay of 266Rf* Formed via Different Fusion Reactions: Entrance Channel Effects
Authors: Niyti, Aman Deep, Rajesh Kharab, Sahila Chopra, Raj. K. Gupta
Abstract:
Relatively long-lived transactinide elements (i.e., elements with atomic number Z≥104) up to Z = 108 have been produced in nuclear reactions between low-Z projectiles (C to Al) and actinide targets, with cross sections observed to decrease steeply with increasing Z. Recently, production cross sections of several picobarns have been reported for comparatively neutron-rich nuclides of elements 112 through 118 produced via hot fusion reactions with 48Ca and actinide targets; some of these heavy nuclides are reported to have lifetimes on the order of seconds or longer. The relatively high cross sections in these hot fusion reactions are not fully understood, which has renewed interest in systematic studies of heavy-ion reactions with actinide targets. The main aim of this work is to understand the dynamics of the hot fusion reactions 18O+248Cm and 22Ne+244Pu (carried out at RIKEN and TASCA, respectively) using the collective clusterization technique, by studying the decay of the compound nucleus 266Rf∗ into the 4n, 5n, and 6n neutron evaporation channels. Here we extend our earlier study of the excitation functions (EFs) of 266Rf∗, formed in the fusion reaction 18O+248Cm and based on the Dynamical Cluster-decay Model (DCM) with the pocket formula for the nuclear proximity potential, to other nuclear interaction potentials derived from the Skyrme energy density formalism (SEDF) within the semiclassical extended Thomas-Fermi (ETF) approach, and we also study entrance channel effects by considering the synthesis of 266Rf* in the 22Ne+244Pu reaction. The Skyrme forces used are the old force SIII and the newer forces GSkI and KDE0(v1). The EFs for the production of the 260Rf, 261Rf, and 262Rf isotopes via the 6n, 5n, and 4n decay channels of the 266Rf∗ compound nucleus are studied at Elab = 88.2 to 125 MeV, including quadrupole deformations β2i and 'hot-optimum' orientations θi.
The calculations are made within the DCM, where the neck length ∆R, which represents the relative separation distance between the two fragments and/or clusters Ai and assimilates the neck formation effects, is the only parameter. Keywords: entrance channel effects, fusion reactions, Skyrme force, superheavy nucleus
Procedia PDF Downloads 253
23913 Design of Data Management Software System Supporting Rendezvous and Docking with Various Spaceships
Authors: Zhan Panpan, Lu Lan, Sun Yong, He Xiongwen, Yan Dong, Gu Ming
Abstract:
The docking-network function of the two spacecraft, i.e., communication with and control of a docking target by various spaceships, is realized in the space lab data management system. In order to solve the problems of complex data communication modes between the space lab and various spaceships, and of poor software reuse caused by non-standard protocols, a data management software system supporting rendezvous and docking with various spaceships has been designed. The software system is based on the CCSDS Spacecraft Onboard Interface Services (SOIS). It consists of a Software Driver Layer, a Middleware Layer, and an Application Layer. The Software Driver Layer hides the various device interfaces behind a uniform device-driver framework. The Middleware Layer is divided into three layers: the transfer layer, the application support layer, and the system business layer. Communication over the space lab platform bus and the docking bus is realized in the transfer layer. The application support layer provides inter-task communication and unified time management for the software system. The data management functions are realized in the system business layer, which contains the telemetry management service, telecontrol management service, flight status management service, rendezvous and docking management service, and so on. The Application Layer accomplishes the tasks defined for the space lab data management system using the standard interfaces supplied by the Middleware Layer. On the basis of the layered architecture, the rendezvous and docking tasks and the rendezvous and docking management service are independent within the software system; the rendezvous and docking tasks are activated and executed according to the different spaceships. In this way, the communication management functions in the independent flight mode, the combination mode with the manned spaceship, and the combination mode with the cargo spaceship are achieved separately.
The software architecture defines standard application interfaces for the services in each layer. Different requirements of the space lab can be supported by using the standard services of each layer, which effectively improves the scalability and flexibility of the data management software; the system can also dynamically expand the number of visiting spaceships and adapt to their protocols. The software system has been applied in the data management subsystem of the space lab and has been verified in flight. The results of this work can provide a basis for the design of the data management system of a future space station. Keywords: space lab, rendezvous and docking, data management, software system
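The layered design described above can be sketched as a service registry behind a uniform dispatch interface (the class and service names below are ours, not the flight software's): a new visiting spaceship type registers only its own docking handler, leaving the other layers untouched.

```python
class MiddlewareBus:
    """Stand-in for the middleware's transfer layer: routes messages
    to named business-layer services through one uniform interface."""
    def __init__(self):
        self.services = {}

    def register(self, name, handler):
        self.services[name] = handler      # add a service without touching others

    def dispatch(self, name, message):
        return self.services[name](message)

bus = MiddlewareBus()
bus.register("telemetry", lambda m: f"TM:{m}")
# a new spaceship type only needs to register its own docking service
bus.register("docking/cargo", lambda m: f"CARGO-DOCK:{m}")
assert bus.dispatch("docking/cargo", "align") == "CARGO-DOCK:align"
```

This is the design choice the abstract credits for scalability: the standard per-layer interface lets docking tasks for manned and cargo spaceships coexist without changing the driver or transfer layers.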
Procedia PDF Downloads 368
23912 The Wear Recognition on Guide Surface Based on the Feature of Radar Graph
Authors: Youhang Zhou, Weimin Zeng, Qi Xie
Abstract:
In order to solve the wear recognition problem for machine tool guide surfaces, a new recognition method based on the radar-graph barycentre feature is presented in this paper. Firstly, the gray mean value, skewness, projection variance, flatness, and kurtosis features of the guide surface image data are defined as primary characteristics. Secondly, data visualization technology based on the radar graph is used: the visual barycentre graphical feature is derived from the radar plot of the multi-dimensional data. Thirdly, a classifier based on support vector machine technology is used; the radar-graph barycentre feature and the original wear features are fed into the classifier separately for classification and comparative analysis of the results. The calculations and experimental results show that the method based on the radar-graph barycentre feature can detect guide surface wear effectively. Keywords: guide surface, wear defects, feature extraction, data visualization
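The radar-graph barycentre feature can be sketched as follows (the feature values below are invented; the paper's exact normalization is not given): the five image features are laid out as spokes of a radar plot, and the centroid of the resulting points is taken as a compact 2-D descriptor.

```python
import math

def radar_barycentre(features):
    """Map an n-dim feature vector to the (x, y) centroid of its radar plot,
    with spoke i at angle 2*pi*i/n and radius equal to the feature value."""
    n = len(features)
    pts = [(v * math.cos(2 * math.pi * i / n),
            v * math.sin(2 * math.pi * i / n)) for i, v in enumerate(features)]
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

# gray mean, skewness, projection variance, flatness, kurtosis (normalized, toy)
worn = [0.9, 0.7, 0.8, 0.6, 0.7]
unworn = [0.3, 0.2, 0.3, 0.2, 0.2]
assert radar_barycentre(worn) != radar_barycentre(unworn)
```

The resulting 2-D barycentre, rather than the raw five-dimensional vector, is what would be fed to the SVM classifier in the comparison the abstract describes.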
Procedia PDF Downloads 519
23911 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental readings and aggregates the data to a designated destination called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption due to the nodes' limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute a minimum-latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, two interference models, the graph model and the more realistic physical interference model known as the Signal-to-Interference-plus-Noise Ratio (SINR) model, have been adopted with different power models (uniform power and non-uniform power, with or without power control) and different antenna models (omni-directional and directional). In this survey article, as the problem has proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in the various models on the basis of latency as the performance measure. Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional
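The scheduling objective can be sketched under the graph interference model with a simple BFS-tree heuristic (the topology is invented, and real MLAS algorithms handle cross-link interference far more carefully): each node is assigned the earliest timeslot after all its children have reported, and siblings of one parent take distinct slots so the parent receives without collision.

```python
from collections import deque

def schedule(tree, sink):
    """Assign a transmission timeslot to every non-sink node of an
    aggregation tree, processing leaves first."""
    order, parent = [], {sink: None}
    queue = deque([sink])
    while queue:                          # BFS from the sink to order nodes
        u = queue.popleft()
        order.append(u)
        for v in tree.get(u, []):
            parent[v] = u
            queue.append(v)
    slot = {}
    for u in reversed(order):             # children are scheduled before parents
        if u == sink:
            continue
        ready = max((slot[c] for c in tree.get(u, [])), default=0)
        taken = {slot[s] for s in tree.get(parent[u], []) if s in slot}
        t = ready + 1
        while t in taken:                 # avoid colliding with a sibling
            t += 1
        slot[u] = t
    return slot

tree = {"sink": ["a", "b"], "a": ["c", "d"]}
slots = schedule(tree, "sink")
assert slots["c"] != slots["d"]           # siblings use distinct slots
```

The latency of the schedule is max(slot.values()); the approximation algorithms surveyed in the article bound this latency against the optimum under the respective interference, power, and antenna models.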
Procedia PDF Downloads 229