Search results for: scientific database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3507


3027 Wave Pressure Metering with the Specific Instrument and Measure Description Determined by the Shape and Surface of the Instrument including the Number of Sensors and Angle between Them

Authors: Branimir Jurun, Elza Jurun

Abstract:

The focus of this paper is the description and functioning of an instrument for wave pressure metering. An essential component of this paper is the proposal of a metering unit for direct wave pressure measurement, determined by the shape and surface of the instrument, including the number of sensors and the angle between them. Namely, the instruments applied so far determine the wave pressure on a particular area indirectly, by means of wave height, length, direction, period and other components. This instrument allows direct measurement, i.e., measurement without additional calculation, of the wave pressure expressed in a standardized unit of measure. To that end, the instrument has a standardized form, surface, number of sensors and angle between them. In addition, it is built so that it follows the wave and always stays on the water surface. The quality of the database recorded by the instrument is made possible by an Arduino chip. This chip is programmed to receive two readings from each sensor every second. From these data, a unique representative value is estimated in a pre-defined manner. By this procedure, all relevant wave pressure measurement results are registered directly and immediately. The final goal of establishing such a rich database is a comprehensive statistical analysis that ranges from multi-criteria analysis across different modeling and parameter testing to hypothesis testing related to the widest variety of man-made activities, such as beach nourishment, security cages for aquaculture, and bridge construction.
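
As an illustration of the acquisition scheme described above, the following is a minimal Python sketch of the per-second aggregation step. The two-readings-per-sensor-per-second rate comes from the abstract; the trimmed-mean aggregation and all names are hypothetical, since the paper's pre-defined estimation manner is not specified.

```python
# Hypothetical sketch of the per-second aggregation described in the abstract:
# each sensor delivers two readings per second; a single representative
# value is then estimated in a pre-defined manner (here: a trimmed mean,
# chosen only for illustration -- the paper does not specify the formula).

def representative_value(readings_per_sensor):
    """readings_per_sensor: list of (r1, r2) pairs, one pair per sensor."""
    samples = [r for pair in readings_per_sensor for r in pair]
    samples.sort()
    trimmed = samples[1:-1] if len(samples) > 2 else samples  # drop extremes
    return sum(trimmed) / len(trimmed)

# Example: four sensors, two pressure readings each (arbitrary units).
second_1 = [(10.2, 10.4), (9.9, 10.1), (10.6, 10.3), (10.0, 10.2)]
print(representative_value(second_1))
```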

Keywords: instrument, metering, water, waves

Procedia PDF Downloads 241
3026 Economic Policy to Stimulate Industrial Development in Georgia

Authors: Gulnaz Erkomaishvili

Abstract:

The article analyzes the current level of industrial production in Georgia, presents the export-import of industrial products and evaluates the results of the activities of institutions implementing industrial policy. The research showed that the level of industrial development in the country and its export potential are quite low. The article concludes that, in the modern phase of industrial development, the country should choose a model focused on technological development and maximum growth of export potential. Objectives: The aim of the research is to develop an economic policy that promotes the development of industry and to look for ways to implement it effectively. Methodologies: This paper uses general and specific methods, in particular analysis, synthesis, induction, deduction, scientific abstraction, comparative and statistical methods, as well as expert evaluation. In-depth interviews with experts were conducted to determine quantitative and qualitative indicators; publications of the National Statistics Office of Georgia were used to determine the regularity between analytical and statistical estimations. Theoretical and applied research of international organizations and academic economists was also used. Contributions: Based on the identified challenges in the area of industry, recommendations for the implementation of an active industrial policy in the short and long term were developed. In particular: the government should prioritize industrial development; pay special attention to the processing industry sectors that Georgia has the potential to develop; support the development of scientific fields; determine certain benefits for investors who invest in industrial production; partner with the private sector in the fight against bureaucracy, corruption and crime, creating favorable business conditions for entrepreneurs; and implement coordination between education, science and production. Much attention should be paid to basic scientific research, which does not require purely commercial returns in the short term; science should become a real productive force. Special importance should be given to creating an environment that supports the expansion of export-oriented production and to overcoming barriers to entry into export markets.

Keywords: industry, sectoral structure of industry, export-import of industrial products, industrial policy

Procedia PDF Downloads 77
3025 Locally Crafted Sustainability: A Scoping Review for Nesting Social-Ecological and Socio-Technical Systems Towards Action Research in Agriculture

Authors: Marcia Figueira

Abstract:

Context: Positivist transformations in agriculture were responsible for top-down, often coercive, mechanisms of uniform modernization that weathered local diversities and agency. New development pathways now need to shift towards comprehensive integrations of knowledge (scientific, indigenous, and local) and to be sustained by political interventions, bottom-up change, and social learning if climate goals are to be met, both in mitigation and adaptation. Objectives: The objectives of this research are to understand how the characterization of social-ecological and socio-technical systems can be nested to bridge scientific research and knowledge into a local context and knowledge system, and, with it, stem sustainable innovation. Methods: To do so, we conducted a scoping review exploring theoretical and empirical works linked to Ostrom’s Social-Ecological Systems framework and Geels’ multi-level perspective on socio-technical systems transformations in the context of agriculture. Results: As a result, we were able to identify key variables and connections to 1) understand the rules in use and the community attributes influencing resource management, and 2) understand how they are and have been shaped by, and are shaping, systems innovations. Conclusion: Based on these results, we discuss how to leverage action research for mutual learning toward a replicable but highly place-based agriculture transformation frame.

Keywords: agriculture systems innovations, social-ecological systems, socio-technical systems, action research

Procedia PDF Downloads 69
3024 Multi-Objective Optimization for the Green Vehicle Routing Problem: Approach to Case Study of the Newspaper Distribution Problem

Authors: Julio C. Ferreira, Maria T. A. Steiner

Abstract:

The aim of this work is to present a solution procedure, referred to here as Multi-objective Optimization for the Green Vehicle Routing Problem (MOOGVRP), and to apply it to a case study. The proposed methodology consists of three stages to resolve Scenario A. Stage 1 consists of the treatment of the data; Stage 2 consists of applying mathematical models of the capacitated p-Median Problem (with the objectives of minimizing distances and homogenizing demands between groups) and the Asymmetric Traveling Salesman Problem (with the objectives of minimizing distances and minimizing time). The weighted method was used as the multi-objective procedure. In Stage 3, an analysis of the results is conducted, taking into consideration the environmental aspects related to the case study, more specifically with regard to fuel consumption and air pollutant emission. This methodology was applied to a (partial) database that addresses newspaper distribution in the municipality of Curitiba, Paraná State, Brazil. The preliminary findings for Scenario A showed that it was possible to improve the distribution of the load and to reduce the mileage and greenhouse gas emissions by 17.32% and the journey time by 22.58% in comparison with the current scenario. The intention for future work is to use other multi-objective techniques and an expanded version of the database, and to explore the triple bottom line of sustainability.
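
A minimal sketch of the multi-objective weighting step. The abstract's "weighted method" is assumed here to be a weighted sum of normalized objectives (distance and time); the weights, reference values and candidate routes are invented for illustration, and the actual models (capacitated p-median, asymmetric TSP) are solved with dedicated formulations.

```python
# Minimal sketch of a weighted-sum combination of two route objectives
# (total distance and total time). Weights, normalization constants and
# candidate routes are hypothetical stand-ins for the paper's models.

def weighted_score(distance_km, time_min, w_dist=0.5, w_time=0.5):
    # Normalize by assumed reference values so the objectives are comparable.
    return w_dist * (distance_km / 100.0) + w_time * (time_min / 120.0)

candidates = {"route_a": (82.0, 95.0), "route_b": (75.0, 118.0)}
best = min(candidates, key=lambda r: weighted_score(*candidates[r]))
print(best)
```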

Keywords: Asymmetric Traveling Salesman Problem, Green Vehicle Routing Problem, Multi-objective Optimization, p-Median Capacitated Problem

Procedia PDF Downloads 91
3023 Quantitative and Qualitative Analysis of Randomized Controlled Trials in Physiotherapy from India

Authors: K. Hariohm, V. Prakash, J. Saravana Kumar

Abstract:

Introduction and Rationale: The increased scope of physiotherapy (PT) practice has also contributed to research in the field of PT. It is essential to determine the production and quality of clinical trials from India, since this may reflect the scientific growth of the profession. These trends can be taken as a baseline to measure our performance and can also be used as a guideline for future trials. Objective: To quantify and qualitatively analyze the RCTs from India from 2000 to May 2013, and to classify the data for the information process. Methods: Studies were searched in the Medline database using the key terms “India”, “Indian”, “Physiotherapy”. Only clinical trials with PT authors were included. Trials outside the scope of PT practice and trials on animals were excluded. Retrieved valid articles were analyzed for year of publication, type of participants, area of study, PEDro score, outcome measure domains of impairment, activity and participation, a priori sample size calculation, region, and explanation of the intervention. Results: 45 valid articles were retrieved from 2000 to May 2013. The majority of articles were done on symptomatic participants (81%). The most frequently repeated conditions were low back pain (n=7) and diabetes (n=4). PEDro scores had a mode of 5, an upper limit of 8 and a lower limit of 4. 97.2% of studies measured the outcome at the impairment level, 34% at the activity level, and 27.8% at the participation level. 29.7% of studies did an a priori sample size calculation. The correlation between year trend and PEDro score was found to be non-significant (p>.05). Individual PEDro item analysis showed: randomization (100%), concealment (33%), baseline (76%), blinding of subject, therapist and assessor (9.1%, 0%, 10%), follow-up (89%), intention-to-treat analysis (15%), between-group statistics (100%), and measures of variance (88%). Conclusion: The trend shows an upward slope in terms of RCTs published from India, which is a good indicator. The qualitative analysis showed some gaps in clinical trial design, which can be expected to be addressed by future researchers.
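
The year-trend test reported above can be reproduced schematically in Python with a Pearson correlation between publication year and PEDro score; the scores below are made up for illustration, and a p-value above .05 would match the reported non-significant result.

```python
# Sketch of the year-trend test: Pearson correlation between publication
# year and PEDro score. The scores are invented placeholders.
from scipy.stats import pearsonr

years = [2001, 2003, 2005, 2007, 2009, 2011, 2013]
pedro = [4,    5,    5,    6,    5,    4,    6]

r, p = pearsonr(years, pedro)
print(f"r = {r:.2f}, p = {p:.3f}")  # p > .05 would match the reported result
```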

Keywords: RCT, PEDro, physical therapy, rehabilitation

Procedia PDF Downloads 321
3022 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing

Procedia PDF Downloads 240
3021 SPBAC: A Semantic Policy-Based Access Control for Database Query

Authors: Aaron Zhang, Alimire Kahaer, Gerald Weber, Nalin Arachchilage

Abstract:

Access control is an essential safeguard for the security of enterprise data; it controls users’ access to information resources and ensures the confidentiality and integrity of information resources [1]. Research shows that the more common types of access control have shortcomings [2]. In this direction, to improve on existing access control, we have studied current technologies in the field of data security, investigated previous data access control policies and their problems in depth, identified existing deficiencies, and proposed a new extension structure, SPBAC. The SPBAC extension proposed in this paper aims to combine Policy-Based Access Control (PBAC) with semantics to provide logically connected, real-time data access functionality by establishing associations between enterprise data through semantics. Our design combines policies with linked data through semantics to create a “semantic link”, so that access control is no longer per-database: users in each role are granted access based on an instance policy. We improve the SPBAC implementation by constructing policies and defined attributes through the XACML specification, which is designed to extend the original XACML model. While providing the relevant design solutions, this paper intends to continue studying the feasibility and subsequent implementation of the related work at a later stage.
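
As a loose illustration of instance-level, policy-based access control in the spirit of SPBAC, here is a hedged Python sketch; the attribute names and two-part policy shape are hypothetical simplifications, and the paper itself defines its policies through an extended XACML model rather than code like this.

```python
# Illustrative sketch only: an instance-level, attribute-based access check,
# where a policy is evaluated per data instance rather than per database.
# Attribute names and the policy shape are hypothetical.

def evaluate(policy, subject, instance):
    return all(subject.get(k) == v for k, v in policy["subject"].items()) \
       and all(instance.get(k) == v for k, v in policy["instance"].items())

policy = {"subject": {"role": "analyst"}, "instance": {"classification": "internal"}}
print(evaluate(policy, {"role": "analyst"}, {"classification": "internal"}))  # True
print(evaluate(policy, {"role": "intern"},  {"classification": "internal"}))  # False
```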

Keywords: access control, semantic policy-based access control, semantic link, access control model, instance policy, XACML

Procedia PDF Downloads 66
3020 Hindi Speech Synthesis by Concatenation of Recognized Hand Written Devnagri Script Using Support Vector Machines Classifier

Authors: Saurabh Farkya, Govinda Surampudi

Abstract:

Optical Character Recognition (OCR) is one of the current major research areas. This paper is focused on the recognition of Devanagari script and its sound generation, and consists of two parts: first, optical character recognition of handwritten Devanagari script; second, speech synthesis of the recognized text. The paper presents an implementation of support vector machines for the purpose of Devanagari script recognition. The support vector machine was trained with multi-domain features: transform-domain and spatial-domain (or structural-domain) features. The transform domain includes the wavelet features of the character. The structural domain consists of distance profile features and gradient features. The segmentation of the text document has been done at three levels: line segmentation, word segmentation, and character segmentation. The pre-processing of the characters has been done with the help of various operations: Otsu's thresholding algorithm, erosion, dilation, filtering and thinning techniques. The algorithm was tested on a self-prepared database, a collection of various handwriting samples. Further, Unicode was used to convert the recognized Devanagari text into an understandable computer document. The document so obtained is an array of codes, which was used to generate digitized text and to synthesize Hindi speech. Phonemes from the self-prepared database were used to generate the speech of the scanned document using the concatenation technique.
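
A minimal sketch of the SVM classification stage, assuming feature vectors (wavelet, distance-profile and gradient features) have already been extracted for each character; the random data are stand-ins, and the paper used LIBSVM on a self-prepared handwriting database.

```python
# Sketch of the SVM stage on stand-in data: 200 "characters" described by
# 64-dimensional feature vectors and 10 hypothetical character classes.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 64))     # pre-extracted feature vectors
y_train = rng.integers(0, 10, size=200)  # hypothetical character labels

clf = SVC(kernel="rbf").fit(X_train, y_train)
print(clf.predict(X_train[:5]))          # predicted class labels
```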

Keywords: Optical Character Recognition (OCR), Text to Speech (TTS), Support Vector Machines (SVM), Library of Support Vector Machines (LIBSVM)

Procedia PDF Downloads 472
3019 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction

Authors: Mohammad Ghahramani, Fahimeh Saei Manesh

Abstract:

Winning a soccer game is based on thorough and deep analysis of the ongoing match. On the other hand, giant gambling companies are in vital need of such analysis to reduce their losses against their customers. In this research work, we perform deep, real-time analysis on every soccer match around the world; our work is distinguished from others by its focus on particular seasons, teams and partial analytics. Our contributions are presented in the platform called “Analyst Masters.” First, we introduce various sources of information available for soccer analysis for teams around the world that helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is to introduce our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, before and in-play, are considered in image format versus time, including the half-time. Local Binary Patterns (LBP) are then employed to extract features from the image. Our analyses reveal incredibly interesting features and rules once a soccer match has reached enough stability. For example, our “8-minute rule” implies that if 'Team A' scores a goal and can maintain the result for at least 8 minutes, then the match will end in their favor in a stable match. We could also make accurate pre-match predictions of whether fewer or more than 2.5 goals would be scored. We benefit from Gradient Boosted Trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide whether the match is stable. A stable match is then passed to a post-processing stage to check its properties, such as bettors’ and punters’ behavior and its statistical data, before issuing the prediction. The proposed method was trained using 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches. Our database of 240,000 matches shows that one can earn over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market. Top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches from 2012 and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
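
A hedged sketch of the LBP feature-extraction step, assuming the match statistics have already been rendered as a grayscale statistics-versus-time image; the random array stands in for that rendering, and the LBP parameters are illustrative only.

```python
# Sketch of LBP feature extraction from a stand-in statistics-vs-time image.
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(0)
match_image = rng.random((90, 120))                  # stand-in rendered image

lbp = local_binary_pattern(match_image, P=8, R=1.0, method="uniform")
hist, _ = np.histogram(lbp, bins=10, range=(0, 10))  # feature vector per match
print(hist / hist.sum())
```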

Keywords: soccer, analytics, machine learning, database

Procedia PDF Downloads 215
3018 Web-Based Tools to Increase Public Understanding of Nuclear Technology and Food Irradiation

Authors: Denise Levy, Anna Lucia C. H. Villavicencio

Abstract:

Food irradiation is a processing and preservation technique used to eliminate insects and parasites and to reduce disease-causing microorganisms. Moreover, the process helps to inhibit sprouting and delay ripening, extending the shelf-life of fresh fruits and vegetables. Nevertheless, most Brazilian consumers seem to misunderstand the difference between irradiated food and radioactive food, and the general public has major concerns about negative health effects and environmental contamination. Society's judgment and decision making are directly linked to perceived benefits and risks. The web-based project entitled ‘Scientific information about food irradiation: Internet as a tool to approach science and society’ was created by the Nuclear and Energetic Research Institute (IPEN) in order to offer an interdisciplinary approach to science education, integrating economic, ethical, social and political aspects of food irradiation. This project takes into account that misinformation and unfounded preconceived ideas weigh heavily on the acceptance of irradiated food and on the purchase intentions of Brazilian consumers. Taking advantage of the potential value of the Internet to enhance communication and education among the general public, a research study was carried out regarding the possibilities and trends of information and communication technologies among the Brazilian population. The content includes concepts, definitions and frequently asked questions (FAQ) about processes, safety, advantages, limitations and the possibilities of food irradiation, including health issues, as well as its impacts on the environment. The project comprises eight self-instructional interactive web courses, situating scientific content in relevant social contexts in order to encourage self-learning and further reflection. Communication is a must to improve public understanding of science. The use of information technology for quality scientific dissemination shall contribute greatly to providing information throughout the country, spreading it to as many people as possible, minimizing geographic distances and stimulating communication and development.

Keywords: food irradiation, multimedia learning tools, nuclear science, society and education

Procedia PDF Downloads 229
3017 Door Fan Test in New CED at Portopalo Test Site

Authors: F. Noto, M. Castro, R. Garraffo, An. Mirabella, A. Rizzo, G. Cuttone

Abstract:

The door fan test is a procedure for verifying the tightness of a room; it is necessary following the installation of saturation extinguishing systems and is made mandatory by the UNI 15004-1:2019 standard whenever a gas extinguishing system is designed and installed. The door fan test was carried out at the Portopalo di Capo Passero site of the Southern National Laboratories and showed that the data processing center (CED) is perfectly up to standard, passing the test in an excellent way. The Southern National Laboratories constitute a solid research reality, well established in the international scientific panorama. The CED at the Portopalo site has been expanded, so the extinguishing system was extended according to a detailed design. After checking the correctness of the design, in order to verify the absence of air leaks, we carried out the door fan test. The activities of the LNS are mainly aimed at basic research in the fields of nuclear physics and nuclear and particle astrophysics. The Portopalo site will host some of the largest submarine wired scientific research infrastructures built in Europe and in the world, such as KM3NeT and EMSO ERIC; in particular, the site research laboratory in Portopalo will host the power supply and data acquisition systems of the underwater infrastructures, and a technological backbone will be created, unique in the Mediterranean, capable of allowing the connection, at abyssal depths, of dozens of real-time surveying and research structures of the deep marine environment.

Keywords: KM3NeT, fire protection, door fan test, CED

Procedia PDF Downloads 76
3016 Relationship Between Health Coverage and Emergency Disease Burden

Authors: Karim Hajjar, Luis Lillo, Diego Martinez, Manuel Hermosilla, Nicholas Risko

Abstract:

Objectives: This study examines the relationship between universal health coverage (UHC) and the burden of emergency diseases at a global level. Methods: Data on Disability-Adjusted Life Years (DALYs) from emergency conditions were extracted from the Institute for Health Metrics and Evaluation (IHME) database for the years 2015 and 2019. Data on UHC, measured using two variables, 1) coverage of essential health services and 2) proportion of the population spending more than 10% of household income on out-of-pocket health care expenditure, were extracted from the World Bank database for years preceding our outcome of interest. Linear regression was performed, analyzing the effect of the UHC variables on the DALYs of emergency diseases while controlling for other variables. Results: A total of 133 countries were included. 44.4% of the analyzed countries had a coverage of essential health services index of at least 70/100, and 35.3% had at least 10% of their population spending more than 10% of their household income on healthcare. For every point increase in the coverage of essential health services index, there was a 13-point reduction in DALYs from emergency medical diseases (95% CI -16, -11). Conversely, for every percent decrease in the population with large household expenditure on healthcare, there was a non-significant 0.48-point increase in DALYs from emergency medical diseases (95% CI -5.6, 4.7). Conclusions: After adjusting for multiple variables, an increase in coverage of essential health services was significantly associated with improvement in DALYs for emergency conditions. There was, however, no association between catastrophic health expenditure and DALYs.
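
A schematic version of the regression described above, with fabricated data: DALYs from emergency conditions regressed on the essential-services coverage index plus a hypothetical control, so that the slope on coverage plays the role of the reported 13-point reduction.

```python
# Schematic regression with synthetic data; real analysis would use the
# IHME and World Bank variables and additional controls.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 133                                    # countries, as in the study
coverage = rng.uniform(30, 90, n)          # UHC service coverage index
gdp_pc   = rng.uniform(1, 60, n)           # a hypothetical control
dalys    = 2000 - 13 * coverage + 5 * gdp_pc + rng.normal(0, 100, n)

X = sm.add_constant(np.column_stack([coverage, gdp_pc]))
model = sm.OLS(dalys, X).fit()
print(model.params)  # slope on coverage is near -13 here by construction
```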

Keywords: emergency medicine, universal healthcare, global health, health economics

Procedia PDF Downloads 72
3015 Improve B-Tree Index’s Performance Using Lock-Free Hash Table

Authors: Zhanfeng Ma, Zhiping Xiong, Hu Yin, Zhengwei She, Aditya P. Gurajada, Tianlun Chen, Ying Li

Abstract:

Many RDBMS vendors use a B-tree index to achieve high performance for point queries and range queries, and some of them also employ a hash index to further enhance performance, as a hash table is more efficient for point queries. However, there are extra overheads in maintaining a separate hash index: for example, the hash mapping for all data records must always be maintained, which results in more memory consumption; and locking, logging and other mechanisms are needed to guarantee ACID properties, which affects the concurrency and scalability of the system. To relieve these overheads, the Hash Cached B-tree (HCB) index is proposed in this paper, consisting of a standard disk-based B-tree index and an additional in-memory lock-free hash table. Initially, only the B-tree index is constructed for all data records; the hash table is built on the fly based on the runtime workload, and only data records accessed by point queries are indexed in the hash table, which helps reduce the memory footprint. Changes to the hash table are made using compare-and-swap (CAS) without locking or logging, which helps improve concurrency and avoid contention. The hash table is also optimized to be cache conscious. The HCB index is implemented in the SAP ASE database; compared with the standard B-tree index, early experiments and customer adoptions show significant performance improvement. This paper provides an overview of the design of the HCB index and reports the experimental results.
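
A conceptual, single-threaded Python sketch of the HCB lookup path described above: point queries consult the in-memory hash table first and fall back to the B-tree on a miss, caching the key on the way back. The real structure is lock-free and CAS-based inside SAP ASE; this sketch shows only the control flow, not the concurrency machinery.

```python
# Conceptual lookup flow of a hash-cached B-tree index (control flow only;
# the paper's structure updates the cache with CAS and no locks/logging).

class HCBIndex:
    def __init__(self, btree_lookup):
        self.hash_cache = {}              # built on the fly from the workload
        self.btree_lookup = btree_lookup  # stands in for the disk-based B-tree

    def point_query(self, key):
        if key in self.hash_cache:        # fast path
            return self.hash_cache[key]
        row = self.btree_lookup(key)      # slow path: B-tree traversal
        if row is not None:
            self.hash_cache[key] = row    # cache only point-queried records
        return row

index = HCBIndex(btree_lookup={101: "row-101", 202: "row-202"}.get)
print(index.point_query(101), index.point_query(101))  # miss, then hit
```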

Keywords: B-tree, compare-and-swap, lock-free hash table, point queries, range queries, SAP ASE database

Procedia PDF Downloads 266
3014 Studying the Effectiveness of Using Narrative Animation on Students’ Understanding of Complex Scientific Concepts

Authors: Atoum Abdullah

Abstract:

The purpose of this research is to determine the extent to which computer animation and narration affect students' understanding of complex scientific concepts and improve their exam performance, compared to traditional lectures that include PowerPoint slides with text and static images. A mixed-method design was used for data collection, including quantitative and qualitative data. Quantitative data were collected using a pre- and post-test method and a close-ended questionnaire. Qualitative data were collected through an open-ended questionnaire. A pre- and post-test strategy was used to measure the level of students' understanding with and without the use of animation. The test included multiple-choice questions to test factual knowledge, open-ended questions to test conceptual knowledge, and label-the-diagram questions to test application knowledge. The results showed that students, on average, performed significantly higher on the post-test than on the pre-test in all areas of acquired knowledge. However, the increase in the post-test score with respect to the acquisition of conceptual and application knowledge was higher than the increase with respect to the acquisition of factual knowledge. This result demonstrates that animation is more beneficial for acquiring deeper, conceptual and cognitive knowledge than for acquiring factual knowledge alone.

Keywords: animation, narration, science, teaching

Procedia PDF Downloads 153
3013 A Comparative Study of the Impact of Membership in International Climate Change Treaties and the Environmental Kuznets Curve (EKC) in Line with Sustainable Development Theories

Authors: Mojtaba Taheri, Saied Reza Ameli

Abstract:

In this research, we have calculated the effect of membership in international climate change treaties for 20 developed countries selected on the basis of the Human Development Index (HDI) and compared this effect with the process of pollutant reduction described by the Environmental Kuznets Curve (EKC) theory. For this purpose, real GDP per capita at constant 2010 prices is taken from the World Development Indicators (WDI) database. The Ecological Footprint (ECOFP) is the amount of biologically productive land needed to meet human needs and absorb carbon dioxide emissions; it is measured in global hectares (gha), and the data are retrieved from the Global Ecological Footprint (2021) database. We proceed step by step, performing several series of targeted statistical regressions and examining the effects of different control variables. Energy Consumption Structure (ECS) is measured as the share of fossil fuel consumption in total energy consumption and is extracted from the United States Energy Information Administration (EIA) (2021) database. Energy Production (EP) refers to the total production of primary energy by all energy-producing enterprises in one country at a specific time; it is a comprehensive indicator that shows the energy production capacity of the country, and the data, like those for the Energy Consumption Structure, are obtained from the EIA (2021). Financial development (FND) is defined as the ratio of private credit to GDP, and to some extent is based on stock market value, also as a ratio to GDP; it is taken from the WDI (2021). Trade Openness (TRD) is the sum of exports and imports of goods and services measured as a share of GDP; we use the WDI data (2021). Urbanization (URB) is defined as the share of the urban population in the total population; for these data we also used the WDI (2021). The descriptive statistics of all the investigated variables are presented in the results section. Among the theories of sustainable development considered, the Environmental Kuznets Curve (EKC) is the most significant over the study period. In this research, we use more than fourteen targeted statistical regressions to isolate the net effects of each of the approaches and examine the results.
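
For readers unfamiliar with the EKC specification, a minimal sketch with synthetic data: the footprint is regressed on GDP per capita and its square, and an inverted-U relationship shows up as a positive linear and a negative quadratic coefficient. The numbers are placeholders, not the study's data.

```python
# Core EKC specification on synthetic data: ECOFP ~ GDP + GDP^2.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
gdp = rng.uniform(5, 60, 200)                      # GDP per capita (thousands)
ecofp = 0.9 * gdp - 0.012 * gdp**2 + rng.normal(0, 1.5, 200)

X = sm.add_constant(np.column_stack([gdp, gdp**2]))
fit = sm.OLS(ecofp, X).fit()
print(fit.params)   # expect + on gdp and - on gdp^2 for an inverted U
```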

Keywords: climate change, globalization, environmental economics, sustainable development, international climate treaty

Procedia PDF Downloads 49
3012 Ultra-Fast pH-Gradient Ion Exchange Chromatography for the Separation of Monoclonal Antibody Charge Variants

Authors: Robert van Ling, Alexander Schwahn, Shanhua Lin, Ken Cook, Frank Steiner, Rowan Moore, Mauro de Pra

Abstract:

Purpose: Demonstration of fast, high-resolution charge variant analysis for monoclonal antibody (mAb) therapeutics within 5 minutes. Methods: Three commercially available mAbs were used for all experiments. The charge variants of therapeutic mAbs (Bevacizumab, Cetuximab, Infliximab, and Trastuzumab) were analyzed on a strong cation exchange column with a linear pH gradient separation method. The linear gradient from pH 5.6 to pH 10.2 is generated over time by running a linear pump gradient from 100% Thermo Scientific™ CX-1 pH Gradient Buffer A (pH 5.6) to 100% CX-1 pH Gradient Buffer B (pH 10.2), using the Thermo Scientific™ Vanquish™ UHPLC system. Results: The pH gradient method is generally applicable to monoclonal antibody charge variant analysis. In conjunction with state-of-the-art column and UHPLC technology, ultra-fast high-resolution separations are consistently achieved in under 5 minutes for all mAbs analyzed. Conclusion: The linear pH gradient method is a platform method for mAb charge variant analysis that can be easily optimized to improve separations and shorten cycle times. Ultra-fast charge variant separation is facilitated by UHPLC, which complements, and in some instances outperforms, CE approaches in terms of both resolution and throughput.
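
A small helper, for illustration only, of what the programmed gradient implies at any point in the run, assuming the pump moves linearly from 100% A (pH 5.6) to 100% B (pH 10.2) over a 5-minute run and that the delivered pH tracks the programmed composition linearly (an idealization, not a claim about the actual column behavior).

```python
# Percent buffer B and nominal pH at time t for an idealized linear gradient.

def gradient_point(t_min, run_min=5.0, ph_a=5.6, ph_b=10.2):
    frac_b = min(max(t_min / run_min, 0.0), 1.0)
    return 100 * frac_b, ph_a + frac_b * (ph_b - ph_a)

for t in (0.0, 2.5, 5.0):
    pct_b, ph = gradient_point(t)
    print(f"t={t} min: %B={pct_b:.0f}, nominal pH={ph:.1f}")
```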

Keywords: charge variants, ion exchange chromatography, monoclonal antibody, UHPLC

Procedia PDF Downloads 422
3011 Verification Protocols for the Lightning Protection of a Large Scale Scientific Instrument in Harsh Environments: A Case Study

Authors: Clara Oliver, Oibar Martinez, Jose Miguel Miranda

Abstract:

This paper is devoted to the study of the most suitable protocols for verifying lightning protection and ground resistance quality in a large-scale scientific facility located in a harsh environment. We illustrate this work by reviewing a case study: the largest telescopes of the Northern Hemisphere Cherenkov Telescope Array, CTA-N. This array hosts sensitive, high-speed optoelectronic instrumentation and sits on clear, obstacle-free terrain at around 2400 m above sea level. The site offers a top-quality sky but also features challenging conditions for a lightning protection system: the terrain is volcanic and has resistivities well above 1 kOhm·m. In addition, the environment often exhibits humidities well below 5%. On the other hand, the high complexity of a Cherenkov telescope structure does not allow a straightforward application of lightning protection standards. CTA-N has been conceived as an array of fourteen Cherenkov telescopes of two different sizes, to be constructed on La Palma Island, Spain. Cherenkov telescopes can provide valuable information on different astrophysical sources from the gamma rays reaching the Earth's atmosphere. The largest telescopes of CTA are called LSTs, and the construction of the first one was finished in October 2018. The LST has a shape that resembles a large parabolic antenna, with a 23-meter reflective surface supported by a tubular structure made of carbon fiber and steel tubes. The reflective surface covers 400 square meters and is made of an array of segmented mirrors that can be controlled individually by a subsystem of actuators. This surface collects and focuses the Cherenkov photons into the camera, where 1855 photo-sensors convert the light into electrical signals that can be processed by dedicated electronics. We describe here how the risk assessment of direct strike impacts was made and how the down conductors and ground system were both tested. The verification protocols which should be applied for the commissioning and operation phases are then explained. We focus our attention on the assessment of ground resistance quality.
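
As a back-of-envelope illustration of why resistivities above 1 kOhm·m are challenging for grounding, the sketch below applies Dwight's textbook formula for a single vertical ground rod; this is a generic estimate with invented rod dimensions, not the verification procedure used at CTA-N.

```python
# Dwight's formula for one vertical rod: R = rho/(2*pi*L) * (ln(4L/a) - 1).
# Generic textbook estimate; dimensions are invented for illustration.
import math

def rod_resistance(rho_ohm_m, length_m, radius_m):
    return rho_ohm_m / (2 * math.pi * length_m) * (math.log(4 * length_m / radius_m) - 1)

print(f"{rod_resistance(1000, 3.0, 0.008):.0f} ohm")  # hundreds of ohms in 1 kOhm·m soil
```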

Keywords: grounding, large scale scientific instrument, lightning risk assessment, lightning standards and safety

Procedia PDF Downloads 109
3010 Experimental Setup of Corona Discharge on Dye Degradation for Science Education

Authors: Shivam Dubey, Vinit Srivastava, Abhay Singh Thakur, Rahul Vaish

Abstract:

The presence of organic dyes in water is a critical issue that poses a significant threat to the environment and human health. We have investigated the use of corona discharge as a potential method for degrading organic dyes in water. Methylene Blue dye was exposed to corona discharge, and its photo-absorbance was measured over time to determine the extent of degradation. The results showed a decreased absorbance for the dye and the loss of the characteristic colour of Methylene Blue. The effects of various parameters, including current, voltage, gas phase, salinity, and electrode spacing, on the reaction rates were investigated. The highest reaction rates were observed at the highest current and voltage (up to 10 kV), the lowest salinity, the smallest electrode spacing, and in an environment containing enhanced levels of oxygen. These findings have possible applications for science education curricula. By investigating the use of corona discharge for degrading organic dyes, we can provide students with a practical application of scientific principles that they can apply to real-world problems. This research can demonstrate the importance of understanding the chemical and physical properties of organic dyes and the effects of corona discharge on their degradation, and it provides a holistic understanding of the applications of scientific research. Moreover, our study also emphasizes the importance of considering the various parameters that can affect reaction rates. By investigating the effects of current, voltage, gas phase, salinity, and electrode spacing, we can provide students with an opportunity to learn about the importance of experimental design and about how to avoid the constraints that can limit meaningful results. In conclusion, this study has the potential to provide valuable insights into the use of corona discharge for degrading organic dyes in water and has significant implications for science education. By highlighting the practical applications of scientific principles and experimental design, and the importance of considering various parameters, this research can help students develop critical thinking skills and prepare them for future careers in science and engineering.
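
A hedged example of how a degradation rate constant could be estimated from the absorbance-versus-time measurements mentioned above, assuming first-order kinetics A(t) = A0·exp(-kt); the readings are invented for illustration.

```python
# Estimate a first-order rate constant from (invented) absorbance readings
# via a log-linear fit: -ln(A) vs t has slope k under A(t) = A0*exp(-k*t).
import numpy as np

t = np.array([0, 5, 10, 15, 20])              # minutes of corona exposure
A = np.array([1.00, 0.74, 0.55, 0.41, 0.30])  # Methylene Blue absorbance

slope, _ = np.polyfit(t, -np.log(A), 1)
print(f"estimated rate constant k = {slope:.3f} per minute")
```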

Keywords: dye degradation, corona discharge, science education, hands-on learning, chemical education

Procedia PDF Downloads 55
3009 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification, but also reduces the feature space dimensions of the pattern samples. In this method, each gray scale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix, and the variance of these row vectors along the PCs, are estimated. This ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. In order to match the test image with the training set, a cosine-similarity-based Bayesian classification is used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
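
A rough Python analogue of the pipeline using library equivalents: PCA for the spatial features, linear discriminant analysis standing in for the Fisher discriminant projection, and cosine similarity for matching; random data replace the Cohn-Kanade/JAFFE images, and the eigen-filter construction is omitted.

```python
# Library-based stand-in for the PCA -> FDA -> cosine-matching pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 256))            # 120 face images as flat vectors
y = rng.integers(0, 6, size=120)           # 6 expression classes

Z = PCA(n_components=20).fit_transform(X)  # spatial/dimensionality reduction
lda = LinearDiscriminantAnalysis(n_components=5).fit(Z, y)
W = lda.transform(Z)                       # discriminant subspace

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(W[0], W[1]))                  # similarity between two faces
```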

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, F-measure

Procedia PDF Downloads 408
3008 Methodological Deficiencies in Knowledge Representation Conceptual Theories of Artificial Intelligence

Authors: Nasser Salah Eldin Mohammed Salih Shebka

Abstract:

Current problematic issues in AI fields are mainly due to those of knowledge representation conceptual theories, which in turn reflect on the entire scope of the cognitive sciences. Knowledge representation methods and tools are derived from theoretical concepts regarding the human scientific perception of the conception, nature, and process of knowledge acquisition, knowledge engineering and knowledge generation. And although these theoretical conceptions were themselves derived from the study of the human knowledge representation process and related theories, some essential factors were overlooked or underestimated, causing critical methodological deficiencies in the conceptual theories of human knowledge and knowledge representation. The evaluation criteria of human cumulative knowledge, from the perspectives of the nature and theoretical aspects of knowledge representation conceptions, are affected greatly by the very materialistic nature of the cognitive sciences. This nature causes what we define as methodological deficiencies in the theoretical aspects of knowledge representation concepts in AI. These methodological deficiencies are not confined to applications of knowledge representation theories throughout AI fields, but extend to the scientific nature of the cognitive sciences. The methodological deficiencies we investigated in our work are: (1) the segregation between cognitive abilities in knowledge-driven models; (2) the insufficiency of the two-valued logic used to represent knowledge, particularly at the machine-language level, in relation to the problematic issues of semantics and theories of meaning; and (3) the deficient consideration of the parameters of existence and time in the structure of knowledge. The latter requires a more detailed account of the manner in which the meanings of existence and time are to be considered in the structure of knowledge. This does not imply that this is easy to apply in the structures of knowledge representation systems; rather, outlining a deficiency caused by the absence of such essential parameters can be considered an attempt to redefine knowledge representation conceptual approaches or, if that proves impossible, to construct a perspective on the possibility of simulating human cognition on machines. Furthermore, a redirection of the aforementioned expressions is required in order to formulate the exact meaning under discussion. This redirection of meaning shifts the role of the existence and time factors to the framework environment of the knowledge structure, and therefore to knowledge representation conceptual theories. The findings of our work indicate the necessity of differentiating between two comparative concepts when addressing the relation between the existence and time parameters and the structure of human knowledge. The topics presented throughout the paper can also be viewed as evaluation criteria for determining AI's capability to achieve its ultimate objectives. Ultimately, we argue some of the implications of our findings, which suggest that although scientific progress may not have reached its peak, and human scientific evolution may not have reached a point where it is possible to discover evolutionary facts about the human brain and detailed descriptions of how it represents knowledge, unless these methodological deficiencies are properly addressed, the future of AI's qualitative progress remains questionable.

Keywords: cognitive sciences, knowledge representation, ontological reasoning, temporal logic

Procedia PDF Downloads 92
3007 Emotional Security in Relation to Students' Emotional Efficiency

Authors: Ibtisam Mahmoud Mohammed Sultan

Abstract:

The present research aimed to identify the level of both emotional security and emotional competence among Tikrit University students, to test the statistically significant differences in both variables by gender (male-female) and specialty (scientific-humanistic), and to examine the relationship between emotional security and emotional competence among Tikrit University students. The researcher built an emotional security measure (54 items) and an emotional competence measure (46 items), and extracted the full psychometric characteristics of both scales. The research sample consisted of 600 students selected randomly; the scales were applied to the basic research sample and the data processed using a variety of statistical methods, including the t-test and the Pearson correlation coefficient. The researcher found the following results: 1. Tikrit University students possess a high level of emotional security. 2. Males enjoy emotional security more than females. 3. There is no difference between students of scientific and humanistic specialties in the emotional security variable. 4. Tikrit University students enjoy a high level of emotional competence. 5. Females outperform males in the level of emotional competence. 6. Students of humanistic specialties excel in emotional competence over those of scientific specialties. 7. There is a positive correlation between the two variables. Based on the research results, the researcher developed a set of conclusions, proposals, and recommendations.

Keywords: relation, emotional security, students, efficiency

Procedia PDF Downloads 98
3006 Evidence of a Negativity Bias in the Keywords of Scientific Papers

Authors: Kseniia Zviagintseva, Brett Buttliere

Abstract:

Science is fundamentally a problem-solving enterprise, and scientists pay more attention to negative things that cause them dissonance and the negative affective states of uncertainty or contradiction. While this is agreed upon by philosophers of science, there are few empirical demonstrations. Here we examine the keywords from papers published by PLOS in 2014 and show with several sentiment analyzers that negative keywords are studied more than positive keywords. Our dataset is the 927,406 keywords of 32,870 scientific articles in all fields published in 2014 by the journal PLOS ONE (collected from Altmetric.com). By counting how often the 47,415 unique keywords are used, we can examine whether negative topics are studied more than positive ones. In order to find the sentiment of the keywords, we utilized two sentiment analysis tools, Hu and Liu (2004) and SentiStrength (2014). The results below are for Hu and Liu, as these are the less convincing results. The average keyword was utilized 19.56 times, with half of the keywords being utilized only once and the maximum number of uses being 18,589 times. The keywords identified as negative were utilized 37.39 times on average, the positive keywords 14.72 times, and the neutral keywords 19.29 times on average. This difference is only marginally significant, with an F value of 2.82 and p = .05, but one must keep in mind that more than half of the keywords are utilized only once, artificially increasing the variance and driving the effect size down. To examine more closely, we looked at the top 25 most utilized keywords that carry a sentiment. Among the top 25 there are only two positive words, ‘care’ and ‘dynamics’, in positions 5 and 13 respectively, with all the rest identified as negative. ‘Diseases’ is the most studied keyword, with 8,790 uses, and ‘cancer’ and ‘infectious’ are the second and fourth most utilized sentiment-laden keywords. The sentiment analysis is not perfect, though: the words ‘diseases’ and ‘disease’ are split, taking the 1st and 3rd positions. Combined, they remain the most common sentiment-laden keyword, utilized 13,236 times. Beyond splitting words, the sentiment analyzer logs ‘regression’ and ‘rat’ as negative, and these should probably be considered false positives. Despite these potential problems, the effect is apparent, and the results suggest that negative concepts are studied more, providing support for the notion that science is most generally a problem-solving enterprise. The results also provide evidence that negativity and contradiction are related to greater productivity and positive outcomes.
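
A toy reproduction of the core analysis: count how often each keyword occurs, attach a sentiment label from a lexicon, and compare mean usage counts per sentiment class; the mini lexicon and keyword list are placeholders for the PLOS ONE data and the Hu and Liu lexicon.

```python
# Toy keyword-sentiment analysis: usage counts per keyword, then mean
# usage per sentiment class. Data and lexicon are invented placeholders.
from collections import Counter

keywords = ["diseases", "cancer", "care", "dynamics", "diseases",
            "infectious", "cancer", "diseases", "care", "regression"]
lexicon = {"diseases": "neg", "cancer": "neg", "infectious": "neg",
           "care": "pos", "dynamics": "pos"}           # unknown words = neutral

counts = Counter(keywords)
by_sent = {"pos": [], "neg": [], "neu": []}
for word, n in counts.items():
    by_sent[lexicon.get(word, "neu")].append(n)

for s, vals in by_sent.items():
    print(s, sum(vals) / len(vals) if vals else 0.0)   # mean uses per keyword
```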

Keywords: bibliometrics, keywords analysis, negativity bias, positive and negative words, scientific papers, scientometrics

Procedia PDF Downloads 165
3005 Open Science Philosophy, Research and Innovation

Authors: C. Ardil

Abstract:

Open Science translates the understanding and application of various theories and practices in open science philosophy, systems, paradigms and epistemology. Open Science originates with the premise that universal scientific knowledge is a product of collective scholarly and social collaboration involving all stakeholders, and that knowledge belongs to global society. Scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science has the potential to increase the quality, impact and benefits of science and to accelerate the advancement of knowledge by making it more reliable, more efficient and accurate, better understandable by society and responsive to societal challenges; it has the potential to enable growth and innovation through the reuse of scientific results by all stakeholders at all levels of society, and ultimately to contribute to the growth and competitiveness of global society. Open Science is a global movement to improve the accessibility and reusability of research practices and outputs. In its broadest definition, it encompasses open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. The implementation of open science provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole. Open Science is the practice of science in such a way that others can collaborate and contribute, where research data, lab notes and other research processes are freely available, under terms that enable reuse, redistribution and reproduction of the research and its underlying data and methods. Open Science represents a novel systematic approach to the scientific process, shifting from the standard practice of publishing research results in scientific publications towards sharing and using all available knowledge at an earlier stage in the research process, based on cooperative work and the diffusion of scholarly knowledge without barriers and restrictions. Open Science refers to efforts to make the primary outputs of publicly funded research (publications and research data) publicly accessible in digital format with no limitations. Open Science is about extending the principles of openness to the whole research cycle, fostering sharing and collaboration as early as possible, thus entailing a systemic change to the way science and research are done. Open Science is the ongoing transition in how open research is carried out, disseminated, deployed, and transformed to make scholarly research more open, global, collaborative, creative and closer to society. Open Science involves various movements aiming to remove the barriers to sharing any kind of output, resource, method or tool at any stage of the research process. Open Science embraces open access to publications, research data, source software, collaboration, peer review, notebooks, educational resources, monographs, citizen science, and research crowdfunding. The recognition and adoption of open science practices, including open science policies that increase open access to scientific literature and encourage data and code sharing, is increasing. Revolutionary open science policies are motivated by ethical, moral or utilitarian arguments, such as the right to access the digital research literature, open source research, scientific data accumulation, research indicators, transparency in academic practice, and reproducibility. Open science philosophy is adopted primarily to demonstrate the benefits of open science practices: researchers use open science applications to their own advantage in order to receive more offers, increase citations, and attract media attention, potential collaborators, career opportunities, donations and funding. Within the open science philosophy, open data findings are evidence that open science practices provide significant benefits to researchers in scientific research creation, collaboration, communication, and evaluation compared to more traditional closed science practices. Open science also raises concerns, such as the rigor of peer review, practical matters such as financing and career development, and the sacrifice of author rights. Therefore, researchers are recommended to implement open science research within the framework of existing academic evaluation and incentives. As a result, open science research issues are addressed in the areas of publishing, financing, collaboration, resource management and sharing, career development, and the discussion of open science questions and conclusions.

Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data

Procedia PDF Downloads 110
3004 The Role and Effects of Communication on Occupational Safety: A Review

Authors: Pieter A. Cornelissen, Joris J. Van Hoof

Abstract:

The interest in improving occupational safety started almost simultaneously with the beginning of the Industrial Revolution. Yet it was not until the late 1970s that the role of communication was considered in scientific research regarding occupational safety. In recent years the importance of communication as a means to improve occupational safety has increased, not only because communication might have a direct effect on safety performance and safety outcomes, but also because it can be viewed as a major component of other important safety-related elements (e.g., training, safety meetings, leadership). And while safety communication is an increasingly important topic in research, its operationalization is often vague and differs among studies. This is problematic not only when comparing results, but also when applying these results in practice on the work floor. By means of an in-depth analysis, building on an existing dataset, this review aims to overcome these problems. The initial database search yielded 25,527 articles, which was reduced to a research corpus of 176 articles. Focusing on the 37 articles of this corpus that addressed communication (related to safety outcomes and safety performance), the current study provides a comprehensive overview of the role and effects of safety communication and outlines the conditions under which communication contributes to a safer work environment. The study shows that in the literature a distinction is commonly made between safety communication (i.e., the exchange or dissemination of safety-related information) and feedback (i.e., a reactive form of communication). And although there is a consensus among researchers that both communication and feedback positively affect safety performance, there is a debate about the directness of this relationship. Whereas some researchers assume a direct relationship between safety communication and safety performance, others state that this relationship is mediated by safety climate. One of the key findings is that, despite the widely held view of safety communication as a formal and top-down safety management tool, researchers stress the importance of open communication that encourages and allows employees to express their worries, experiences and views, and to share information. This raises questions with regard to other directions (e.g., bottom-up, horizontal) and forms of communication (e.g., informal). The current review proposes a framework to overcome the often vague and differing operationalizations of safety communication. The proposed framework can be used to characterize safety communication in terms of stakeholders, direction, and characteristics of communication (e.g., medium usage).

Keywords: communication, feedback, occupational safety, review

Procedia PDF Downloads 278
3003 A Framework for Secure Information Flow Analysis in Web Applications

Authors: Ralph Adaimy, Wassim El-Hajj, Ghassen Ben Brahim, Hazem Hajj, Haidar Safa

Abstract:

Huge amounts of data and personal information are sent to and retrieved from web applications on a daily basis. Every application has its own confidentiality and integrity policies. Violating these policies can have a broad negative impact on the involved company's financial status, while enforcing them is very hard even for developers with a good security background. In this paper, we propose a framework that enforces security-by-construction in web applications. Minimal developer effort is required, in the sense that the developer only needs to annotate database attributes with a security class. The web application code is then converted into an intermediary representation, called the Extended Program Dependence Graph (EPDG). Using the EPDG, the provided annotations are propagated to the application code and run against generic security enforcement rules that were carefully designed to detect insecure information flows as early as they occur. As a result, any violation of the data's confidentiality or integrity policies is reported. As a proof of concept, two PHP web applications, Hotel Reservation and Auction, were used for testing and validation. The proposed system was able to catch all the existing insecure information flows at their source. Moreover, to highlight the simplicity of the suggested approach compared with existing approaches, two professional web developers assessed the annotation tasks needed in the presented case studies and provided very positive feedback on the simplicity of the annotation task.
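
A simplified sketch of the label-propagation idea behind the EPDG: database attributes carry security annotations, labels flow along dependence edges, and a flow of secret data into a public sink is reported; node names and the two-level lattice are hypothetical simplifications of the paper's mechanism.

```python
# Simplified label propagation over a dependence graph: labels flow from
# annotated attributes along dependence edges; a secret-to-public flow is
# flagged. Node names and the two-level lattice are invented for this sketch.

edges = {"query_result": ["password"],      # password -> query_result
         "html_output": ["query_result"]}   # query_result -> html_output (sink)
labels = {"password": "secret", "username": "public"}

def label_of(node):
    if node in labels:
        return labels[node]
    deps = edges.get(node, [])
    return "secret" if any(label_of(d) == "secret" for d in deps) else "public"

if label_of("html_output") == "secret":
    print("violation: secret data reaches a public sink")
```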

Keywords: web applications security, secure information flow, program dependence graph, database annotation

Procedia PDF Downloads 447
3002 TARF: Web Toolkit for Annotating RNA-Related Genomic Features

Authors: Jialin Ma, Jia Meng

Abstract:

Genomic features, i.e., genome-based coordinates, are commonly used to represent biological features such as genes, RNA transcripts, and transcription factor binding sites. For the analysis of RNA-related genomic features, such as RNA modification sites, a common task is to correlate these features with transcript components (5'UTR, CDS, 3'UTR) to explore their distribution in transcriptomic coordinates, e.g., to examine whether a specific type of biological feature is enriched near transcription start sites. Existing approaches to these tasks involve manipulating a gene database, converting genome-based coordinates to transcript-based coordinates, and visualization methods capable of showing RNA transcript components and the distribution of the features. These steps are complicated and time-consuming, especially for researchers who are not familiar with the relevant tools. To overcome this obstacle, we developed TARF, a dedicated web app whose name stands for web toolkit for annotating RNA-related genomic features. TARF intends to provide a web-based way to easily annotate and visualize RNA-related genomic features. Once a user has uploaded the features in BED format and specified a built-in transcript database, or uploaded a customized gene database in GTF format, the tool fulfills three main functions. First, it annotates gene and RNA transcript components: for every feature provided by the user, the overlap with RNA transcript components is identified, and the information is combined in one table available for copy and download; summary statistics on ambiguous assignments are also reported. Second, the tool provides a convenient visualization of the features at the single gene/transcript level: for a selected gene, the tool shows the features together with the gene model in a genome-based view, and also maps the features to transcript-based coordinates to show their distribution along a single spliced RNA transcript. Third, a global transcriptomic view of the genomic features is generated using the Guitar R/Bioconductor package: the distribution of features on RNA transcripts is normalized with respect to RNA transcript landmarks, and the enrichment of the features on different RNA transcript components is demonstrated. We tested the newly developed TARF toolkit with three different types of genomic features related to chromatin H3K4me3, RNA N6-methyladenosine (m6A), and RNA 5-methylcytosine (m5C), obtained from ChIP-Seq, MeRIP-Seq, and RNA BS-Seq data, respectively. TARF successfully revealed their respective distribution characteristics, i.e., H3K4me3, m6A, and m5C are enriched near transcription start sites, stop codons, and 5'UTRs, respectively. Overall, TARF is a useful web toolkit for the annotation and visualization of RNA-related genomic features, and should help simplify the analysis of various RNA-related genomic features, especially those related to RNA modifications.
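A minimal sketch of the genome-to-transcript coordinate conversion at the core of such annotation; the exon intervals, CDS boundaries, and positions below are hypothetical, and TARF's actual implementation may differ:

```python
# Map a genomic position onto spliced-transcript coordinates, then
# assign it to a transcript component (5'UTR / CDS / 3'UTR).

def genome_to_transcript(pos, exons):
    """exons: list of (start, end) genomic intervals, 0-based half-open,
    ordered 5'->3' on the + strand. Returns the offset within the
    spliced transcript, or None if pos is intronic or outside."""
    offset = 0
    for start, end in exons:
        if start <= pos < end:
            return offset + (pos - start)
        offset += end - start
    return None

def annotate_component(tpos, cds_start, cds_end):
    """Assign a transcript coordinate to a transcript component."""
    if tpos < cds_start:
        return "5'UTR"
    if tpos < cds_end:
        return "CDS"
    return "3'UTR"

# Two-exon transcript: exon1 100-200, exon2 300-450; CDS spans 50-250
# in transcript coordinates (all values hypothetical).
exons = [(100, 200), (300, 450)]
tpos = genome_to_transcript(320, exons)   # -> 120 (20 nt into exon 2)
print(tpos, annotate_component(tpos, cds_start=50, cds_end=250))  # 120 CDS
```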

Keywords: RNA-related genomic features, annotation, visualization, web server

Procedia PDF Downloads 190
3001 Iris Cancer Detection System Using Image Processing and Neural Classifier

Authors: Abdulkader Helwan

Abstract:

Iris cancer, also called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the currently available techniques are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer: image processing techniques improve diagnosis by enhancing the quality of the images so that physicians can diagnose properly, while neural networks help in making the decision of whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles, and the cancer. Principal component analysis is used to reduce the image size and to extract features. These features are then used as inputs to a neural network, which is capable of deciding whether the eye is cancerous or not based on the experience acquired over many training iterations on different normal and abnormal eye images during the training phase. Normal images were obtained from a public database available on the internet, “Mile Research”, while the abnormal ones were obtained from another database, “eyecancer”. The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
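A minimal sketch of the described pipeline (grayscale conversion, filtering, Prewitt edge detection, PCA feature extraction, neural classifier), assuming SciPy and scikit-learn; the random placeholder arrays stand in for the eye-image databases, and the layer sizes and component counts are illustrative choices, not the paper's settings:

```python
import numpy as np
from scipy.ndimage import convolve, median_filter
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

PREWITT_X = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float)
PREWITT_Y = PREWITT_X.T

def prewitt_edges(gray):
    """Prewitt gradient magnitude of a 2-D grayscale image."""
    gx = convolve(gray, PREWITT_X)
    gy = convolve(gray, PREWITT_Y)
    return np.hypot(gx, gy)

def preprocess(rgb):
    """Grayscale conversion, noise filtering, and edge detection."""
    gray = rgb.mean(axis=2)            # simple luminance approximation
    gray = median_filter(gray, size=3)
    return prewitt_edges(gray).ravel()

# Placeholder data: 20 random 64x64 "eye images", half labeled cancerous.
rng = np.random.default_rng(0)
X = np.stack([preprocess(rng.random((64, 64, 3))) for _ in range(20)])
y = np.array([0] * 10 + [1] * 10)

pca = PCA(n_components=10)             # reduce size / extract features
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
clf.fit(pca.fit_transform(X), y)       # training phase
print(clf.predict(pca.transform(X[:2])))  # cancerous-or-not decision
```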

Keywords: iris cancer, intraocular melanoma, cancerous, prewitt edge detection algorithm, sclera

Procedia PDF Downloads 480
3000 The Universal Theory: Role of Imaginary Pressure on Different Relative Motions

Authors: Sahib Dino Naseerani

Abstract:

This paper discusses the concept of imaginary pressure and its role in different relative motions. It explores how imaginary pressure, defined as the combined effect of external atmospheric pressure and real pressure, affects various substances and their physical properties. The study aims to understand the impact of imaginary pressure and its potential applications in different contexts, such as spaceflight. The main objective is to investigate the role of imaginary pressure in different relative motions; specifically, to examine how imaginary pressure affects the contraction and mass variation of a body in motion at the speed of light. The study seeks to provide insights into the behavior and consequences of imaginary pressure in various scenarios. The data were collected from three research papers. This research contributes to a better understanding of the theoretical implications of imaginary pressure, elucidating how imaginary pressure is held responsible for the contraction and mass variation of a body in motion, particularly at the speed of light. The findings shed light on the behavior of substances under the influence of imaginary pressure, providing insights for future scientific studies. The study addresses the question of how imaginary pressure influences various relative motions and their associated physical properties. By examining different substances in liquid and solid form, the research explores the consequences of imaginary pressure on their volume, length, and mass.
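For reference, the textbook special-relativity relations for length contraction and mass variation that the abstract alludes to are shown below; the paper's own imaginary-pressure formulation is a separate, non-standard claim and is not reproduced here:

```latex
% Standard special-relativity relations (not the paper's formulation):
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \qquad
L = \frac{L_{0}}{\gamma}, \qquad
m = \gamma\, m_{0}
```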

Keywords: imaginary pressure, contraction, variation, relative motion

Procedia PDF Downloads 77
2999 Reducing Flood Risk in a Megacity: Using Mobile Application and Value Capture for Flood Risk Prevention and Risk Reduction Financing

Authors: Dedjo Yao Simon, Takahiro Saito, Norikazu Inuzuka, Ikuo Sugiyama

Abstract:

The megacity of Abidjan is a coastal urban area where the number of reported floods and their associated impacts are increasing rapidly due to climate change, uncontrolled urbanization, rapid population growth, a lack of flood disaster mitigation, and low citizen awareness. The objective of this research is to reduce, in both the short and the long term, the human and socio-economic impact of flooding. Hydrological simulation is applied to freely available global spatial data (digital elevation model, satellite-based rainfall estimates, land use) to identify flood-prone areas and to map flood risk. Direct interviews with a sample of residents are used to validate the simulation results. A mobile application (Flood Locator) is then prototyped to disseminate the risk information to citizens. In addition, a value capture strategy is proposed to mobilize financial resources for a disaster risk reduction fund (DRRf) to reduce the impact of flooding. The town of Cocody in Abidjan is selected as the case study area for this research. The mapping of flood risk reveals that the population living in the study area is highly vulnerable. For a 5-year flood, more than 60% of the floodplain is affected by a water depth of at least 0.5 meters, and more than 1000 ha with at least 5000 buildings are directly exposed. The risk becomes higher for 50- and 100-year floods. The interviews also reveal that the majority of citizens are not aware of the risk and severity of flooding in their community. This shortage of information is addressed by Flood Locator and by an urban flood database we prototyped to accumulate flood data. Flood Locator allows users to view the floodplain and water depth on a digital map; the user can activate the GPS sensor of the mobile device to visualize his or her location on the map. Additional features allow the citizen user to capture flood events and damage information and to send them remotely to the database. Furthermore, disclosure of the risk information could result in a decrease (-14%) in the value of properties located inside the floodplain and an increase (+19%) in the value of properties in the suburb area. The tax increment resulting from the higher property values in the safer area should be captured to constitute the DRRf. The fund should be allocated to flood risk reduction for the benefit of people living in flood-prone areas. The flood prevention system discussed in this research will minimize, in the short and long term, the direct damages in the risky area through effective citizen awareness and the availability of the DRRf. It will also contribute to the growth of the urban area in the safer zone and reduce human settlement in the risky area in the long term. Data accumulated in the urban flood database through the warning app will contribute to regenerating Abidjan into a more resilient city by means of risk-avoiding land use in the master plan.
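A minimal sketch of the value-capture arithmetic, using the stated -14% and +19% value changes; the assessed property values and the tax rate are hypothetical placeholders, not figures from the study:

```python
# Compute the yearly tax change produced by a property-value change,
# and the increment that could be captured to fund the DRRf.
def tax_increment(base_value, rate, change):
    return base_value * change * rate

floodplain_value = 500_000_000   # assessed value inside floodplain (assumed)
suburb_value = 800_000_000       # assessed value in the safer area (assumed)
tax_rate = 0.01                  # 1% annual property tax (assumed)

loss = tax_increment(floodplain_value, tax_rate, -0.14)  # disclosure effect
gain = tax_increment(suburb_value, tax_rate, +0.19)      # safer-area uplift

print(f"floodplain tax change:       {loss:,.0f}")   # -700,000
print(f"captured increment for DRRf: {gain:,.0f}")   # 1,520,000
print(f"net fiscal effect:           {loss + gain:,.0f}")  # 820,000
```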

Keywords: abidjan, database, flood, geospatial techniques, risk communication, smartphone, value capture

Procedia PDF Downloads 259
2998 The Impact of Prior Cancer History on the Prognosis of Salivary Gland Cancer Patients: A Population-based Study from the Surveillance, Epidemiology, and End Results (SEER) Database

Authors: Junhong Li, Danni Cheng, Yaxin Luo, Xiaowei Yi, Ke Qiu, Wendu Pang, Minzi Mao, Yufang Rao, Yao Song, Jianjun Ren, Yu Zhao

Abstract:

Background: The number of patients with multiple cancers is increasing, and the impact of a prior cancer history on salivary gland cancer patients remains unclear. Methods: Clinical, demographic, and pathological information on salivary gland cancer patients was retrospectively collected from the Surveillance, Epidemiology, and End Results (SEER) database from 2004 to 2017, and the characteristics and prognosis of patients with a prior cancer were compared with those of patients without a prior cancer. Univariate and multivariate Cox proportional hazards regression models were used for the analysis of prognosis. A risk score model was established to examine the impact of treatment on patients with a prior cancer in different risk groups. Results: A total of 9098 salivary gland cancer patients were identified, 1635 of whom had a prior cancer history. Salivary gland cancer patients with a prior cancer had worse survival than those without a prior cancer (p<0.001). Patients with different types of first cancer had distinct prognoses (p<0.001), and a longer latency was associated with better survival (p=0.006) in the univariate model, although both became nonsignificant in the multivariate model. Salivary gland cancer patients with a prior cancer were divided into low-risk (n=321), intermediate-risk (n=223), and high-risk (n=62) groups, and the results showed that patients at high risk could benefit from surgery, radiation therapy, and chemotherapy, while those at intermediate risk could benefit from surgery. Conclusion: Prior cancer history had an adverse impact on the survival of salivary gland cancer patients, and individualized treatment should be seriously considered for them.
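A minimal sketch of this kind of survival analysis using the lifelines package; the toy data frame, column names, and tertile-based risk grouping are assumptions for illustration, not the actual SEER extraction or the study's risk model:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy cohort: follow-up months, death event, prior-cancer indicator, age.
df = pd.DataFrame({
    "survival_months": [12, 48, 60, 7, 90, 33, 24, 55],
    "death":           [1,  0,  0,  1, 0,  1,  1,  0],
    "prior_cancer":    [1,  0,  0,  1, 0,  1,  0,  0],
    "age":             [71, 54, 49, 78, 45, 69, 62, 50],
})

# Multivariate Cox proportional hazards regression.
cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="death")
cph.print_summary()   # hazard ratios for prior_cancer and age

# A simple risk score from the fitted linear predictor, binned into
# three groups to mirror the study's low/intermediate/high stratification.
df["risk_score"] = cph.predict_partial_hazard(df)
df["risk_group"] = pd.qcut(df["risk_score"], 3,
                           labels=["low", "intermediate", "high"])
print(df[["prior_cancer", "risk_group"]])
```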

Keywords: prior cancer history, prognosis, salivary gland cancer, SEER

Procedia PDF Downloads 124