Search results for: privacy loss
2121 Transportation Accidents Mortality Modeling in Thailand
Authors: W. Sriwattanapongse, S. Prasitwattanaseree, S. Wongtrangan
Abstract:
Transportation accident mortality is a major problem that leads to the loss of human lives and to economic losses. The objective was to identify statistical models for estimating mortality rates due to transportation accidents in Thailand, using data from 2000 to 2009. The data were taken from death certificates in the vital registration database. The numbers of deaths and mortality rates were computed, classified by gender, age, year and region. There were 114,790 transportation accident deaths. The highest average age-specific transport accident mortality rate was 3.11 per 100,000 per year, for males in the Southern region, and the lowest was 1.79 per 100,000 per year, for females in the North-East region. Linear, Poisson and negative binomial models were fitted, and the best model was chosen based on the analysis of deviance and AIC. The negative binomial model was clearly the most appropriate fit.
Keywords: transportation accidents, mortality, modeling, analysis of deviance
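The model-selection step described above can be sketched with an intercept-only comparison of Poisson and negative binomial fits by AIC. The death counts below are hypothetical, and method-of-moments estimation stands in for the maximum-likelihood regression fits the study would actually use:

```python
import math

def poisson_loglik(counts, lam):
    """Log-likelihood of i.i.d. Poisson(lam) counts."""
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1) for k in counts)

def nb_loglik(counts, r, p):
    """Log-likelihood of i.i.d. negative binomial NB(r, p) counts."""
    return sum(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(p) + k * math.log(1.0 - p) for k in counts)

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

# hypothetical, overdispersed death counts per region-year stratum
counts = [2, 0, 5, 1, 9, 0, 3, 12, 1, 0, 7, 2]
mean = sum(counts) / len(counts)
var = sum((k - mean) ** 2 for k in counts) / (len(counts) - 1)

aic_pois = aic(poisson_loglik(counts, mean), 1)   # Poisson MLE: lam = sample mean

r = mean ** 2 / (var - mean)                      # method-of-moments NB parameters
p = r / (r + mean)
aic_nb = aic(nb_loglik(counts, r, p), 2)

print(aic_pois > aic_nb)  # overdispersion favours the negative binomial
```

With the variance well above the mean, the lower AIC of the negative binomial mirrors the study's conclusion.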
Procedia PDF Downloads 244
2120 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-time
Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl
Abstract:
In recent years, SQL injection attacks have been identified as prevalent against web applications. They affect network security and user data, leading to a considerable loss of money and data every year. This paper presents the use of classification algorithms in machine learning to classify login-form inputs as "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of deciding whether an operation is an attack or a valid operation. A method, Web-App auto-generated twin data structure replication shielding against SQLi attacks (WebAppShield), has been developed; it verifies all users and prevents attackers (SQLi attacks) from entering or accessing the database, admitting only inputs that the machine learning module predicts as "Non-SQLi". A special login form has been developed with a dedicated data-validation instance; this verification process secures the web application from its early stages. The system has been tested and validated; up to 99% of SQLi attacks were prevented.
Keywords: SQL injection, attacks, web application, accuracy, database
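The "SQLi"/"Non-SQLi" labelling of login inputs can be illustrated with a minimal rule-based filter. This is not the paper's machine learning module; the signature patterns and threshold below are illustrative stand-ins for learned features:

```python
import re

# illustrative signature patterns; a trained classifier would instead learn
# its own features from labelled login data
SQLI_PATTERNS = [
    r"(?i)\bunion\b.*\bselect\b",
    r"(?i)\bor\b\s+['\"]?\d+['\"]?\s*=\s*['\"]?\d+",
    r"--",
    r"(?i);\s*drop\b",
    r"(?i)\bsleep\s*\(",
]

def classify_login_input(value):
    """Label a login-form input "SQLi" or "Non-SQLi" by pattern matching."""
    hits = sum(1 for pat in SQLI_PATTERNS if re.search(pat, value))
    return "SQLi" if hits >= 1 else "Non-SQLi"

print(classify_login_input("alice"))        # Non-SQLi
print(classify_login_input("' OR 1=1 --"))  # SQLi
```

In the WebAppShield design, only inputs labelled "Non-SQLi" would be forwarded to the database.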
Procedia PDF Downloads 150
2119 A Study on Application of Elastic Theory for Computing Flexural Stresses in Preflex Beam
Authors: Nasiri Ahmadullah, Shimozato Tetsuhiro, Masayuki Tai
Abstract:
This paper presents the step-by-step procedure for using Elastic Theory to calculate the internal stresses in composite bridge girders prestressed by the Preflexing Technology, called Prebeam in Japan and Preflex beam worldwide. Elastic Theory approaches preflex beams in the same way as it does conventional composite girders. Since a preflex beam undergoes different stages of construction, calculations are made using different sectional and material properties, and stresses are calculated at every stage using the properties of the specific section. Stress accumulation gives the available stress in a section of interest. Concrete presence in the section implies prestress loss due to creep and shrinkage; however, more work remains to be done in this field. In addition to the graphical presentation of this application, this paper further discusses a graphical comparison between the results of an experimental-only study carried out on a preflex beam and the results of a simulation based on the elastic theory approach, for an identical beam, using Finite Element Modeling (FEM) by the author.
Keywords: composite girder, Elastic Theory, preflex beam, prestressing
Procedia PDF Downloads 279
2118 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation
Authors: Natalia Kalinowska
Abstract:
The adoption of the General Data Protection Regulation (GDPR) concluded the European Commission's four-year work in this area in the European Union. Considering the far-reaching changes that GDPR will introduce, the European legislator envisaged a two-year transitional period: member states and companies have to prepare for the new regulation by 25 May 2018. The idea that represents a new attitude to data protection in the European Union is the risk-based approach. So far, as a result of implementing Directive 95/46/EC, many European countries (including Poland) have adopted very particular regulations specifying technical and organisational security measures; the Polish implementing rules, for example, even indicate how long a password should be. Under the new approach, from May 2018 controllers and processors will be obliged to apply security measures adequate to the level of risk associated with specific data processing. Risk in GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. GDPR does not indicate which security measures should be applied; the recitals give only examples, such as anonymisation or encryption. It is the controller's decision which security measures are considered sufficient, and the controller will be responsible if those measures prove insufficient or if the identification of the risk level is incorrect. The regulation indicates a few levels of risk. Recital 76 distinguishes risk and high risk, but some lawyers think that there is one more category, low risk/no risk: data processing that is unlikely to result in a risk to the rights and freedoms of natural persons.
GDPR also mentions types of processing for which a controller does not have to evaluate the level of risk because they have been classified as "high risk" processing, e.g., large-scale processing of special categories of data or processing using new technologies. The methodology includes analysis of legal regulations, e.g., GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in GDPR. The main conclusion is that an appropriate risk assessment is key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential responsibility for infringement of the regulation.
Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach
Procedia PDF Downloads 251
2117 Climate Indices: A Key Element for Climate Change Adaptation and Ecosystem Forecasting - A Case Study for Alberta, Canada
Authors: Stefan W. Kienzle
Abstract:
The increasing occurrence of extreme weather and climate events has significant impacts on society: continued and increasing loss of human and animal lives, loss of or damage to property (houses, cars), and the associated stress of coping with a changing climate. A climate index breaks a daily climate time series down into meaningful derivatives, such as the annual number of frost days. Climate indices allow the spatially consistent analysis of a wide range of climate-dependent variables, which enables the quantification and mapping of historical and future climate change across regions. As trends in phenomena such as the length of the growing season change differently in different hydro-climatological regions, mapping needs to be carried out at a high spatial resolution, such as the 10 km by 10 km Canadian Climate Grid, which has interpolated daily values from 1950 to 2017 for minimum and maximum temperature and precipitation. Climate indices form the basis for the analysis and comparison of means, extremes and trends, and for the quantification of changes and their respective confidence levels. A total of 39 temperature indices and 16 precipitation indices were computed for the period 1951 to 2017 for the Province of Alberta. Temperature indices include the annual number of days with temperatures above or below certain thresholds (0, ±10, ±20, +25, +30ºC), frost days and their timing, freeze-thaw days, growing degree days, and energy demands for air conditioning and heating. Precipitation indices include daily and accumulated 3- and 5-day extremes, days with precipitation, periods of days without precipitation, snow, and potential evapotranspiration. The rank-based non-parametric Mann-Kendall statistical test was used to determine the existence and significance levels of all associated trends, and the slope of the trends was determined using the non-parametric Sen's slope test.
A Google mapping interface was developed to create the website albertaclimaterecords.com, from which each of the 55 climate indices can be queried for any of the 6,833 grid cells that make up Alberta. In addition to the climate indices, climate normals were calculated and mapped for four historical 30-year periods and one future period (1951-1980, 1961-1990, 1971-2000, 1981-2017, 2041-2070). While winters have warmed since the 1950s by between 4-5°C in the South and 6-7°C in the North, summers show the weakest warming over the same period, ranging from about 0.5-1.5°C. New agricultural opportunities exist in central regions, where the number of heat units and growing degree days is increasing and the number of frost days is decreasing. While the number of days below -20ºC has roughly halved across Alberta, the growing season has expanded by between two and five weeks since the 1950s. Interestingly, the numbers of days with heat waves and with cold spells have both doubled to quadrupled during the same period. This research demonstrates the enormous potential of using climate indices at the best regional spatial resolution possible to enable society to understand the historical and future climate changes of their region.
Keywords: climate change, climate indices, habitat risk, regional, mapping, extremes
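The two trend tests named above are straightforward to sketch. A minimal pure-Python Mann-Kendall S statistic and Sen's slope, applied to a hypothetical annual frost-day series (assuming evenly spaced years), might look like this:

```python
import itertools
import statistics

def mann_kendall_s(series):
    """Mann-Kendall S statistic: sum of sign(x_j - x_i) over all pairs i < j."""
    return sum((xj > xi) - (xj < xi)
               for xi, xj in itertools.combinations(series, 2))

def sens_slope(series):
    """Sen's slope: median of pairwise slopes, assuming unit (annual) spacing."""
    slopes = [(series[j] - series[i]) / (j - i)
              for i, j in itertools.combinations(range(len(series)), 2)]
    return statistics.median(slopes)

# hypothetical annual frost-day counts with a declining trend
frost_days = [182, 179, 181, 175, 173, 174, 168, 166]
print(mann_kendall_s(frost_days))  # negative S indicates a downward trend
print(sens_slope(frost_days))      # frost days lost per year (negative)
```

A full application would also convert S into a significance level via its variance under the null hypothesis, which is omitted here for brevity.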
Procedia PDF Downloads 92
2116 Effect of SCN5A Gene Mutation in Endocardial Cell
Authors: Helan Satish, M. Ramasubba Reddy
Abstract:
The simulation of an endocardial cell with a gene mutation in the cardiac sodium channel NaV1.5, encoded by the SCN5A gene, is discussed. The characterization of Brugada Syndrome by the loss-of-function effect of the SCN5A mutation L812Q, located in the DII-S4 transmembrane region of the NaV1.5 channel protein, and its effect in an endocardial cell are studied. The Ten Tusscher model of the human ventricular action potential is modified to incorporate the changes contributed by the L812Q mutant in endocardial cells. Results show that the BrS-associated SCN5A mutation reduces the inward sodium current through modifications in the channel gating dynamics, such as delayed activation, enhanced inactivation, and slowed recovery from inactivation, in the endocardial cell. The reduced inward sodium current affects the depolarization phase (Phase 0), which leads to a reduction in the spike amplitude of the cardiac action potential.
Keywords: SCN5A gene mutation, sodium channel, Brugada syndrome, cardiac arrhythmia, action potential
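The loss-of-function effect can be sketched as a scaled maximal conductance in a Hodgkin-Huxley-type fast sodium current, I_Na = g_Na·m³·h·j·(V − E_Na). The gating values and the 50% mutant scaling below are illustrative placeholders, not the modified Ten Tusscher parameters of the study:

```python
def i_na(v, m, h, j, g_na, e_na=70.0):
    """Fast sodium current I_Na = g_Na * m^3 * h * j * (V - E_Na)."""
    return g_na * m ** 3 * h * j * (v - e_na)

G_NA_WT = 14.838       # wild-type maximal conductance (illustrative, nS/pF)
MUTANT_SCALE = 0.5     # illustrative loss-of-function reduction for the mutant

v, m, h, j = -40.0, 0.4, 0.6, 0.6   # snapshot of gating state during the upstroke
wt = i_na(v, m, h, j, G_NA_WT)
mut = i_na(v, m, h, j, G_NA_WT * MUTANT_SCALE)
print(abs(mut) < abs(wt))  # reduced inward current -> smaller phase-0 amplitude
```

The study additionally shifts the gating kinetics themselves (delayed activation, enhanced inactivation, slowed recovery); this sketch captures only the net current reduction.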
Procedia PDF Downloads 124
2115 Assessment of Chemical and Physical Properties of Surface Water Resources in Flood Affected Area
Authors: Siti Hajar Ya’acob, Nor Sayzwani Sukri, Farah Khaliz Kedri, Rozidaini Mohd Ghazi, Nik Raihan Nik Yusoff, Aweng A/L Eh Rak
Abstract:
The flood event that occurred in mid-December 2014 on the East Coast of Peninsular Malaysia drew attention from the public nationwide. Apart from the loss of and damage to properties and belongings, the massive flood introduced environmental disturbances to the surface water resources of the flood-affected area. A study was conducted to measure the physical and chemical composition of the Galas River and Pergau River in order to identify the flood's impact on environmental deterioration in the surrounding area. Samples were analyzed in situ using a YSI portable instrument and also in the laboratory, where acid digestion and heavy metal analysis were performed using Atomic Absorption Spectroscopy (AAS). Results showed that the ranges of temperature (°C), DO (mg/L), Ec (µS/cm), TDS (mg/L), turbidity (NTU), pH, and salinity were 25.05-26.65, 1.51-5.85, 0.032-0.054, 0.022-0.035, 23.2-76.4, 3.46-7.31, and 0.01-0.02, respectively. The results from this study could be used as a primary database to evaluate the water quality status of the respective rivers after the massive flood.
Keywords: flood, river, heavy metals, AAS
Procedia PDF Downloads 379
2114 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications
Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino
Abstract:
The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to progressively more in-depth study of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. Vulnerability models can be integrated with the wind hazard, which consists of associating a probability with each intensity level in a time interval (e.g., by means of return periods), to provide an assessment of future losses due to extreme wind. This has also given impetus to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given the wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires logical combination rules that define the building's damage state given the damage state of each component, and the availability of a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure's behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II.
ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on in-built or user-defined wind hazard data. This software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely: the roof covering, roof structure, envelope wall and envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations that are comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach is shown to lie in the fact that a database of building component fragility curves can be put to use for the development of new wind vulnerability models to cover building typologies not yet adequately covered by existing works and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses
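One simple logical combination rule of the kind described, applied to the four components of the example, can be sketched as a series system of independent lognormal fragilities. The medians and dispersions below are illustrative placeholders, not values from ERMESS:

```python
import math

def lognormal_fragility(median, beta):
    """Lognormal-CDF fragility curve, a common form in wind engineering."""
    def p_exceed(v):
        return 0.5 * (1.0 + math.erf(math.log(v / median) / (beta * math.sqrt(2.0))))
    return p_exceed

def building_failure_prob(component_fragilities, wind_speed):
    """P(at least one component exceeds its damage state), treating the
    components as an independent series system -- a simplifying assumption."""
    p_all_survive = 1.0
    for fragility in component_fragilities.values():
        p_all_survive *= 1.0 - fragility(wind_speed)
    return 1.0 - p_all_survive

# illustrative medians (m/s) and dispersions for the four example components
components = {
    "roof_covering":     lognormal_fragility(median=35.0, beta=0.30),
    "roof_structure":    lognormal_fragility(median=55.0, beta=0.25),
    "envelope_wall":     lognormal_fragility(median=60.0, beta=0.25),
    "envelope_openings": lognormal_fragility(median=40.0, beta=0.35),
}

print(round(building_failure_prob(components, 45.0), 3))
```

Real component interactions (e.g., openings failing after the covering is breached) would require conditional rules rather than the independence assumed here.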
Procedia PDF Downloads 180
2113 Unlocking Health Insights: Studying Data for Better Care
Authors: Valentina Marutyan
Abstract:
Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of, and approach to, providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field uses advanced analytical approaches to look for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. Predictive analysis using historical patient data is a major area of interest: it enables doctors to intervene early to prevent problems or improve outcomes, and it assists in early disease detection and customized treatment planning for every person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, data mining improves the efficiency of hospitals, for example by helping them determine the number of beds or doctors they require for the number of patients they expect. This project uses models such as logistic regression, random forests, and neural networks for predicting diseases and analyzing medical images. Patients were grouped by algorithms such as k-means, connections between treatments and patient responses were identified by association rule mining, and time-series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment.
Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and operate more efficiently; it comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
Keywords: data mining, healthcare, big data, large amounts of data
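The k-means grouping of patients mentioned above can be sketched in one dimension. This toy two-cluster version on hypothetical lengths of stay is only an illustration of the idea, not the project's implementation:

```python
import statistics

def two_means(values, iters=20):
    """Minimal 1-D k-means with k=2: alternate assignment and centroid update."""
    c1, c2 = min(values), max(values)        # seed centroids at the extremes
    for _ in range(iters):
        g1 = [v for v in values if abs(v - c1) <= abs(v - c2)]
        g2 = [v for v in values if abs(v - c1) > abs(v - c2)]
        c1, c2 = statistics.mean(g1), statistics.mean(g2)
    return sorted(g1), sorted(g2)

# hypothetical lengths of stay (days) for one ward's patients
stays = [1, 2, 2, 3, 2, 14, 12, 16, 3, 15]
short_stay, long_stay = two_means(stays)
print(short_stay)  # [1, 2, 2, 2, 3, 3]
print(long_stay)   # [12, 14, 15, 16]
```

Separating short-stay from long-stay patients in this way is one input to the bed-planning use case described above.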
Procedia PDF Downloads 75
2112 In-Situ Redevelopment in Urban India: Two Case Studies from Delhi and Mumbai
Authors: Ashok Kumar, Anjali Sharma
Abstract:
As cities grow and expand spatially, redevelopment in urban India is beginning to emerge as a new mode of urban expansion sweeping low-income informal settlements. This paper examines the extent and nature of this expanding urban frontier before examining the implications for the families living in these settlements. Displacement of these families may appear to be an obvious consequence; however, we have conducted ethnographic studies over the past several months in Kathputli Colony, a Delhi slum. In-depth analysis of this study reveals a variegated set of consequences for the residents of informal settlements, including loss of livelihoods, dismantling of family ties, and general anxiety arising out of uncertainty about resettlement. Apart from the Delhi case study, we also compare and contrast another redevelopment case from Mumbai, located at Bhendi Bazar. These examples from the two megacities of Mumbai and Delhi are analysed to understand and explore expanding urban frontiers and their consequences, in order to inform future public policy.
Keywords: informal settlements, policy, redevelopment, urban
Procedia PDF Downloads 330
2111 The Ethics of Documentary Filmmaking: Discussing the Ethical Considerations and Responsibilities of Documentary Filmmakers When Portraying Real-Life Events and Subjects
Authors: Batatunde Kolawole
Abstract:
Documentary filmmaking stands as a distinctive medium within the cinematic realm, commanding a unique responsibility: the portrayal of real-life events and subjects. This research delves into the profound ethical considerations and responsibilities that documentary filmmakers shoulder as they embark on the quest to unveil truth and weave compelling narratives. The study undertakes a comprehensive review of ethical frameworks and real-world case studies, illuminating the intricate web of challenges that documentarians confront. These challenges encompass an array of ethical intricacies, from securing informed consent to safeguarding privacy, maintaining unwavering objectivity, and sidestepping the snares of narrative manipulation when crafting stories from reality. It also dissects the contemporary ethical terrain, acknowledging the emergence of novel dilemmas in the digital age, such as deepfakes and digital alterations. Through a meticulous analysis of the ethical quandaries faced by distinguished documentary filmmakers and their strategies for ethical navigation, this study offers valuable insights into the evolving role of documentaries in moulding public discourse. It underscores the indispensable significance of transparency, integrity, and an indomitable commitment to encapsulating the intricacies of reality within ethical documentary filmmaking. In a world increasingly reliant on visual narratives, an understanding of the subtle ethical dimensions of documentary filmmaking holds relevance not only for those behind the camera but also for the diverse audiences who engage with and interpret the realities unveiled on screen. This research stands as a rigorous examination of the moral compass that steers this potent form of cinematic expression.
It emphasizes the capacity of ethical documentary filmmaking to enlighten, challenge, and inspire, all while unwaveringly upholding the core principles of truthfulness and respect for the human subjects under scrutiny. Through this holistic analysis, the study illuminates the enduring significance of upholding ethical integrity while uncovering the truths that shape our world. Ethical documentary filmmaking, as exemplified by "Rape" and countless other powerful narratives, serves as a testament to the enduring potential of cinema to inform, challenge, and drive meaningful societal discourse.
Keywords: filmmaking, documentary, human rights, film
Procedia PDF Downloads 65
2110 Culture Dimensions of Information Systems Security in Saudi Arabia National Health Services
Authors: Saleh Alumaran, Giampaolo Bella, Feng Chen
Abstract:
The study of organisations' information security cultures has attracted scholars, as well as the healthcare services industry, to research the topic and find appropriate tools and approaches to develop a positive culture. The vast majority of studies in the Saudi national health services concern the use of technology to protect and secure health service information; on the other hand, there is a lack of research on the role and impact of an organisation's cultural dimensions on information security. This research investigated and analysed the role and impact of cultural dimensions on information security in the Saudi Arabia health service. Hypotheses were tested and two surveys were carried out in order to collect data and information from three major hospitals in Saudi Arabia (SA). The first survey identified the main cultural-dimension problems in SA health services and developed an initial information security culture framework model. The second survey evaluated the developed framework model to test its usefulness, reliability and applicability. The model is based on human behaviour theory, where the individual's attitude is the key element of the individual's intention to behave, as well as of his or her actual behaviour. The research identified six cultural dimensions: Saudi national culture, Saudi health service leadership, employees' trust, technology, multicultural interactions and employees' job roles. The research also identified a set of cultural sub-dimensions, including working values and norms, tribe values and norms, attitudes towards women, power sharing, vision, social interaction, respect and understanding, the hospital intranet, the language(s) used by hospital employees, multinational culture, the communication system, employees' job satisfaction and job security.
The research found that (a) human behaviour towards medical information in SA is one of the main threats to information security and one of the main challenges to the SA health authority; (b) the current state of SA hospitals' information security cultures falls short in protecting medical information, owing to the current values and norms towards information security; and (c) Saudi national culture and employees' job roles are the dimensions playing the major roles in employees' attitudes, while technology is the least important dimension.
Keywords: cultural dimension, electronic health record, information security, privacy
Procedia PDF Downloads 351
2109 Web-Based Criminal Diary: Paperless Criminal Evidence for the Federal Republic of Nigeria
Authors: Yekini Nureni Asafe, Haastrup Victor Adeleye, Ikotun Abiodun Motunrayo, Ojo Olanrewaju
Abstract:
The Web-Based Criminal Diary is a web-based application in which the records of criminals convicted by a judge in a Nigerian court of law are made available to the entire public. Presently, criminal records in Nigeria are kept manually, which means that when a person needs to be investigated to determine whether he or she has a criminal record in the country, several manual processes must be followed. With manual record keeping, criminal records can easily be manipulated by the people in charge. The focus of this research work is to design a web-based application system for criminal records in Nigeria, with a view to eliminating the challenges surrounding the manual processing currently in use, such as loss of criminal records, inefficiency in criminal record keeping, data manipulation, and other attendant problems of paper-based record keeping. The product of this research work will also help to minimize the crime rate in the country, since the opportunities and benefits lost as a result of a criminal record create lifelong barriers for anyone attempting to overcome a criminal past.
Keywords: court of law, criminal, criminal diary, criminal evidence, Nigeria, web-based
Procedia PDF Downloads 317
2108 Effect of Polymer Concentration on the Rheological Properties of Polyelectrolyte Solutions
Authors: Khaled Benyounes, Abderrahmane Mellak
Abstract:
The rheology of aqueous solutions of a polyelectrolyte (polyanionic cellulose, PAC) of high molecular weight was investigated using a controlled-stress rheometer. Several rheological measurements were performed: viscosity measurements, creep compliance tests at a constant low shear stress, and oscillation experiments. Concentrations ranged from 0.01 to 2.5% PAC by weight. It was found that the aqueous solutions of PAC do not exhibit a yield stress; the flow curves of PAC over a wide range of shear rates (0 to 1000 s⁻¹) could be described by the Cross and Williamson models. The critical polymer concentrations c* and c** were estimated. The dynamic moduli of the polymer, i.e., the storage modulus (G') and loss modulus (G''), were determined in a frequency sweep from 0.01 to 10 Hz; at polymer concentrations above 1%, G' exceeds G''. Relationships between the dynamic moduli and polymer concentration were established. The creep-recovery experiments demonstrated that the water-PAC system shows increasingly important viscoelastic properties as the concentration of the polymer increases.
Keywords: polyanionic cellulose, viscosity, creep, oscillation, cross model
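The Cross model used for the flow curves can be evaluated directly. The parameters below are illustrative values for a shear-thinning PAC solution, not the fitted ones from the study:

```python
def cross_viscosity(shear_rate, eta0, eta_inf, lam, m):
    """Cross model: eta = eta_inf + (eta0 - eta_inf) / (1 + (lam * rate)^m)."""
    return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * shear_rate) ** m)

# illustrative parameters for a shear-thinning PAC solution
eta0, eta_inf, lam, m = 2.5, 0.005, 0.8, 0.7   # Pa.s, Pa.s, s, dimensionless

for rate in (0.1, 1.0, 10.0, 100.0, 1000.0):    # shear rates (1/s)
    print(rate, round(cross_viscosity(rate, eta0, eta_inf, lam, m), 4))
```

The viscosity plateaus at eta0 at low shear rates and falls toward eta_inf at high rates, which is the shear-thinning behaviour the fitted flow curves describe.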
Procedia PDF Downloads 324
2107 Exploring the Quest for Centralized Identity in Mohsin Hamid's "The Last White Man": Post-Apocalyptic Transformations and Societal Reconfigurations
Authors: Kashifa Khalid, Eesham Fatima
Abstract:
This study aims to analyze the loss of identity and its impact on one's life in 'The Last White Man' by Mohsin Hamid. Bertolt Brecht's theory of the Alienation Effect is applied to the text, as Hamid offers readers a unique perspective, alluding to significant themes like identity, race, and death. The aspects of defamiliarization align impeccably with the plot, as existence and the corresponding concept of identity seem to have dissolved into utter chaos. This extends from the unexplained transformation to the way the entire world unravels from its general norm into dystopian mayhem. The characters, starting with the protagonist Anders, have lost their center: one's own self transforms into the 'other', and the struggle is to become refamiliarized with one's own self. Alienation and isolation only rise as the construct of race and identity is taken apart brick by brick, ironically at its own pace, as many new realities are blown to bits. The inseparable relationship between identity and grief under the ever-looming cloud of death is studied in detail. The theoretical framework and thematic aspects harmonize with the writing style put forth by Hamid, tying all the loose ends together.
Keywords: alienation, chaos, identity, transformation
Procedia PDF Downloads 43
2106 A Supply Chain Traceability Improvement Using RFID
Authors: Yaser Miaji, Mohammad Sabbagh
Abstract:
Radio Frequency Identification (RFID) is a technology that shares a similar concept with the bar code. With RFID, electromagnetic or electrostatic coupling in the RF portion of the electromagnetic spectrum is used to transmit signals. Supply chain management aims to sustain the long-term performance of individual companies and of the overall supply chain by maximizing customer satisfaction at minimum cost. One of the major issues in supply chain management is product loss, or shrinkage. To overcome this problem, the system presented here uses RFID technology to track and identify where losses are occurring and to enable effective traceability. RFID brings a new dimension to supply chain management by providing a more efficient way of identifying and tracking items at the various stages throughout the supply chain. The system has been developed and tested to prove that RFID technology can be used to improve traceability in the supply chain at low cost. Owing to its simple interface program and database management system, built using Visual Basic and MS Excel or MS Access, the system is affordable and can be implemented even by small and medium-scale industries.
Keywords: supply chain, RFID, traceability, radio frequency identification
Procedia PDF Downloads 485
2105 Multilabel Classification with Neural Network Ensemble Method
Authors: Sezin Ekşioğlu
Abstract:
Multilabel classification is of huge importance for several applications and is also a challenging research topic. It is a kind of supervised learning with binary targets; the difference between multilabel and binary classification is that multilabel problems involve more than one class, so an instance can belong to one class or to many classes. There exists a wide range of applications for multilabel prediction, such as image labeling, text categorization, and gene functionality. Even though instances are assigned to many classes, they may not always be classified properly. There are many ensemble methods for classification; however, most researchers have been concerned with better multilabel methods in general, and few focus on both the efficiency of the classifiers and the pairwise relationships between labels at the same time in order to achieve better multilabel classification. In this paper, we work on modified ensemble methods that benefit from k-Nearest Neighbors and a neural network structure to address these issues in a beneficial way and to obtain better multilabel classification results. Publicly available datasets (yeast, emotion, scene and birds) are used to demonstrate the efficiency of the developed algorithm, and the technique is measured by accuracy, F1 score and Hamming loss metrics. Our algorithm improves on benchmark results for each dataset on different metrics.
Keywords: multilabel, classification, neural network, KNN
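The Hamming loss metric and a simple ensemble combination rule can be sketched as follows; the majority vote below is a generic stand-in, not the paper's KNN-plus-neural-network method:

```python
def hamming_loss(y_true, y_pred):
    """Fraction of label slots predicted incorrectly across all samples."""
    n_samples, n_labels = len(y_true), len(y_true[0])
    wrong = sum(t != p
                for true_row, pred_row in zip(y_true, y_pred)
                for t, p in zip(true_row, pred_row))
    return wrong / (n_samples * n_labels)

def majority_vote(predictions):
    """Combine per-classifier multilabel predictions by per-label majority vote."""
    n_models = len(predictions)
    n_samples, n_labels = len(predictions[0]), len(predictions[0][0])
    return [[int(sum(model[i][j] for model in predictions) * 2 > n_models)
             for j in range(n_labels)]
            for i in range(n_samples)]

# three hypothetical base classifiers, two samples, three labels each
preds = [
    [[1, 0, 1], [0, 1, 0]],
    [[1, 0, 0], [0, 1, 1]],
    [[1, 1, 1], [0, 0, 0]],
]
truth = [[1, 0, 1], [0, 1, 0]]
ensemble = majority_vote(preds)
print(ensemble)                       # [[1, 0, 1], [0, 1, 0]]
print(hamming_loss(truth, ensemble))  # 0.0
```

Lower Hamming loss is better; zero means every label of every sample was predicted correctly.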
Procedia PDF Downloads 1532104 Modeling Water Resources Carrying Capacity, Optimizing Water Treatment, Smart Water Management, and Conceptualizing a Watershed Management Approach
Authors: Pius Babuna
Abstract:
Sustainable water use is important for the existence of the human race. Water resources carrying capacity (WRCC) measures the sustainability of water use; however, the calculation and optimization of WRCC remain challenging. This study used a mathematical model (the Logistic Growth of Water Resources, LGWR) and a linear objective function to model water sustainability. We tested the validity of the models using data from Ghana. Total freshwater resources, water withdrawal, and population data were processed in MATLAB. The results show that the WRCC remains sustainable until the year 2132 ± 18, when half of the total annual water resources will be used. The optimized water treatment cost suggests that Ghana currently wastes GHȼ 1,115.78 ± 50 (~$182.21 ± 50) per water treatment plant per month, or ~0.67 million gallons of water, in avoidable losses. Adopting an optimized water treatment scheme and a watershed management approach will help sustain the WRCC.Keywords: water resources carrying capacity, smart water management, optimization, sustainable water use, water withdrawal
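The logistic-growth idea behind the LGWR model can be sketched in a few lines. This is an illustration only: the parameter values below are hypothetical and are not the paper's fitted coefficients for Ghana. The "sustainable until" year corresponds to withdrawal reaching half of the total annual resource, which for a logistic curve can be solved in closed form:

```python
import math

def logistic_withdrawal(t, K, A, r):
    """Logistic growth curve for annual water withdrawal W(t).
    K: total annual freshwater resources (the carrying capacity);
    A, r: fitted constants (illustrative values below, not Ghana's)."""
    return K / (1.0 + A * math.exp(-r * t))

def year_reaching_half_capacity(K, A, r, t0_year):
    """W(t) = K/2 when A*exp(-r*t) = 1, i.e. t = ln(A)/r after t0."""
    return t0_year + math.log(A) / r

# Hypothetical parameters for illustration; K in billion m^3/yr.
K, A, r, t0 = 56.2, 40.0, 0.035, 2020
# Prints a year in the same ballpark as the paper's 2132 +/- 18 horizon.
print(round(year_reaching_half_capacity(K, A, r, t0)))
```

The closed-form crossover is what makes a sustainability horizon easy to report with an uncertainty band once the parameters are fitted.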
Procedia PDF Downloads 852103 Transformation of Potato, Plantain, Banana to Flour in Order to Enhance Sustainable Development and Promote Local Consumption
Authors: Munu Fritz-Austin Ndam
Abstract:
Although the Cameroonian system of farming is considered first generation, the primary actors involved have not yet understood the meaning of adding value to the produce they grow. The challenge is for everyone who practices agriculture as an income-generating activity in Cameroon to understand the concept of value-added products and to know how to go about producing them. Recent studies have shown that farmers who depend on agriculture as their main income-generating activity make a great loss out of it because they lack the means to transport their produce to an appropriate market, the knowledge of how to transform it, or a way of conserving the produce for a longer duration. It is important to note that, after a thorough evaluation of the activity carried out, the final value-added product sold greatly benefits not only the producer but also the buyer and the population at large. In my proposed presentation, I will discuss how the transformation activity will have a positive impact on the lives of farmers and buyers and, most importantly, describe the methodology and procedure followed before the tubers (banana, plantain, potato) are transformed into finished or semi-finished products.Keywords: transformation, sustainability, development, consumption
Procedia PDF Downloads 1022102 Bioinformatics High Performance Computation and Big Data
Authors: Javed Mohammed
Abstract:
Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require High-Performance Computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With this increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data.
It illustrates how indispensable these technologies are in meeting the scientific and engineering challenges of the twenty-first century, and how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC that provides sufficient capability for evaluating or solving more limited but meaningful instances. The article also indicates solutions to optimization problems and the mutual benefits between Big Data and computational biology, and surveys the current state of the art and future generations of HPC computing with Big Data in biology.Keywords: high performance, big data, parallel computation, molecular data, computational biology
Procedia PDF Downloads 3622101 Pre-Treatment of Anodic Inoculum with Nitroethane to Improve Performance of a Microbial Fuel Cell
Authors: Rajesh P.P., Md. Tabish Noori, Makarand M. Ghangrekar
Abstract:
Methanogenic substrate loss is reported to be a major bottleneck that significantly reduces the power production capacity and coulombic efficiency (CE) of a microbial fuel cell (MFC). Nitroethane is found to be a potent inhibitor of hydrogenotrophic methanogens in the rumen fermentation process. The influence of a nitroethane-pretreated sewage sludge inoculum on suppressing methanogenic activity and enhancing electrogenesis in an MFC was evaluated. An MFC inoculated with nitroethane-pretreated anodic inoculum demonstrated a maximum operating voltage of 541 mV, with a coulombic efficiency and sustainable volumetric power density of 39.85% and 14.63 W/m3, respectively. Linear sweep voltammetry indicated a higher electron discharge on the anode surface due to enhanced electrogenic activity while methanogenic activity was suppressed. A 63% reduction in specific methanogenic activity was observed in the anaerobic sludge pretreated with nitroethane, emphasizing the significance of this pretreatment for suppressing methanogenesis and its utility for enhancing electricity generation in MFCs.Keywords: coulombic efficiency, methanogenesis inhibition, microbial fuel cell, nitroethane
Procedia PDF Downloads 3172100 Finding Out the Best Place for Resettling of Victims after the Earthquake: A Case Study for Tehran, Iran
Authors: Reyhaneh Saeedi, Nima Ghasemloo
Abstract:
Iran is an earthquake-prone zone, and earthquakes there are followed by loss of lives and financial damage. Sheltering earthquake victims is one of the basic requirements after an earthquake, although it is hard to select suitable places for temporary resettlement once one has happened. Before such disasters occur, the best places for resettling the victims must be designated; this is an important issue in disaster management and planning. A Geospatial Information System (GIS) has a determining role in disaster management and can identify the best places for temporary resettlement. In this paper, the best criteria, together with their weights and buffers, have been determined through research and a questionnaire for locating the best places. The AHP method is used as the decision model, and the best places for temporary resettlement are located based on the selected criteria. Buffer layers of the criteria are created and converted to raster layers; the raster layers are then multiplied by the desired weights and the results added together. Finally, suitable places for resettling victims, colored according to their suitability, are produced in QGIS software.Keywords: disaster management, temporary resettlement, earthquake, criteria
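The weight-and-overlay step described above can be sketched compactly. This is a toy illustration, not the paper's criteria or weights: the pairwise comparison matrix, criterion names, and 2x2 "raster" values below are all hypothetical. AHP weights are approximated here by row geometric means, a standard shortcut to the principal eigenvector:

```python
import numpy as np

# Pairwise comparison matrix for 3 hypothetical criteria
# (distance to roads, slope, distance to hazards) -- illustrative only.
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Approximate AHP weights via row geometric means, then normalize.
gm = np.prod(P, axis=1) ** (1.0 / P.shape[1])
weights = gm / gm.sum()

# Tiny 2x2 "raster" layers, already scored 0-1 (1 = most suitable).
layers = np.array([
    [[0.9, 0.2], [0.4, 0.8]],   # distance to roads
    [[0.7, 0.5], [0.3, 0.9]],   # slope
    [[0.6, 0.1], [0.5, 0.7]],   # distance to hazards
])

# Weighted overlay: multiply each layer by its weight and sum per cell.
suitability = np.tensordot(weights, layers, axes=1)
print(weights.round(3), suitability.round(3))
```

In a GIS workflow, the same weighted sum is applied cell by cell to full raster layers, and the result is classified into suitability ranks for display.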
Procedia PDF Downloads 4632099 Apply Commitment Method in Power System to Minimize the Fuel Cost
Authors: Mohamed Shaban, Adel Yahya
Abstract:
The goal of this study is to schedule power generation units to minimize fuel consumption cost, based on a model that solves unit commitment problems. This can be done by utilizing the forward dynamic programming method to determine the most economic scheduling of generating units. The model was applied to a power station consisting of four generating units. The obtained results show that applying the forward dynamic programming method offers a substantial reduction in fuel consumption cost: it was reduced from $116,326 to $102,181 over a 24-hour period, a saving of about 12.16%. The study emphasizes the importance of applying scheduling models to the operation of power generation units; the consequences are less fuel consumption, less loss of power, and less pollution.Keywords: unit commitment, forward dynamic programming, fuel cost, generation scheduling, operation cost, power system, generating units
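A forward dynamic programming pass over commitment states can be sketched as follows. This is a toy two-unit example, not the paper's four-unit station: all cost coefficients, limits, start-up costs, and demands below are hypothetical, and the within-state economic dispatch is done by a simple grid search rather than a production solver:

```python
from itertools import product

# Illustrative two-unit system (the paper's station has four units).
# Quadratic fuel cost a + b*P + c*P^2 ($/h); limits in MW, start-up in $.
UNITS = [
    {"a": 100.0, "b": 20.0, "c": 0.05, "pmin": 10.0, "pmax": 80.0, "start": 50.0},
    {"a": 150.0, "b": 25.0, "c": 0.10, "pmin": 10.0, "pmax": 60.0, "start": 80.0},
]
INF = float("inf")

def dispatch_cost(state, load):
    """Cheapest fuel cost of serving `load` with the committed units,
    found by a fine grid search (adequate for this two-unit sketch)."""
    on = [u for u, s in zip(UNITS, state) if s]
    if not on or not sum(u["pmin"] for u in on) <= load <= sum(u["pmax"] for u in on):
        return INF
    if len(on) == 1:
        u = on[0]
        return u["a"] + u["b"] * load + u["c"] * load ** 2
    u1, u2 = on
    lo = max(u1["pmin"], load - u2["pmax"])
    hi = min(u1["pmax"], load - u2["pmin"])
    best = INF
    for i in range(201):
        p1 = lo + (hi - lo) * i / 200
        p2 = load - p1
        cost = (u1["a"] + u1["b"] * p1 + u1["c"] * p1 ** 2
                + u2["a"] + u2["b"] * p2 + u2["c"] * p2 ** 2)
        best = min(best, cost)
    return best

def startup(prev, cur):
    """Start-up cost for units switched on between consecutive hours."""
    return sum(u["start"] for u, was, now in zip(UNITS, prev, cur) if now and not was)

def forward_dp(demand):
    """Forward DP: best cumulative cost per commitment state, hour by hour."""
    states = [s for s in product((0, 1), repeat=len(UNITS)) if any(s)]
    all_off = tuple(0 for _ in UNITS)
    best = {s: startup(all_off, s) + dispatch_cost(s, demand[0]) for s in states}
    for load in demand[1:]:
        best = {s: dispatch_cost(s, load)
                + min(best[p] + startup(p, s) for p in states)
                for s in states}
    return min(best.values())

print(round(forward_dp([50, 90, 120, 70]), 1))
```

The recursion keeps, for each commitment state and hour, the cheapest cost of any path reaching it; the schedule itself can be recovered by storing the arg-min predecessor at each step.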
Procedia PDF Downloads 6082098 Thoughts Regarding Interprofessional Work between Nurses and Speech-Language-Hearing Therapists in Cancer Rehabilitation: An Approach for Dysphagia
Authors: Akemi Nasu, Keiko Matsumoto
Abstract:
Rehabilitation for cancer requires setting individual goals for each patient and an approach that properly fits the stage of the cancer. In order to cope with daily changes in the patients' condition, establishing a good cooperative relationship between nurses and the physiotherapists, occupational therapists, and speech-language-hearing therapists (therapists) is essential. This study focuses on the present situation of the cooperation between nurses and therapists, especially speech-language-hearing therapists, and aims to elucidate what develops there. A semi-structured interview was conducted with a speech-language-hearing therapist having practical experience of working in collaboration with nurses. The contents of the interview were transcribed and converted to data, and the data were encoded and categorized with sequentially increasing degrees of abstraction to conduct a qualitative explorative factor analysis. When providing ethical explanations, particular care was taken to ensure that the participant would not be subjected to any disadvantages as a result of participating in the study. She was also informed that her privacy would be ensured, that she had the right to decline to participate, and that the results of the study would be announced publicly at an applicable nursing academic conference. This study was approved following application to the ethical committee of the university with which the researchers are affiliated. The survey participant is a female speech-language-hearing therapist in her forties.
As a result of the analysis, six categories were extracted: 'measures to address appetite and aspiration pneumonia prevention', 'limitation of the care a therapist alone could provide', 'the all-inclusive patient-supportive care provided by nurses', 'expanding the beneficial cooperation with nurses', 'providing education for nurses on the swallowing function utilizing videofluoroscopic examination of swallowing', and 'enhancement of communication, including conferences'. In order to improve team performance, and for the teamwork competency necessary for the provision of safer care, mutual support is essential. As for the cooperation between nurses and therapists, this survey indicates that maturing the cooperation between professionals, in order to improve nursing professionals' knowledge and enhance communication, will lead to an improvement in the quality of rehabilitation for cancer.Keywords: cancer rehabilitation, nurses, speech-language-hearing therapists, interprofessional work
Procedia PDF Downloads 1322097 Reduce of the Consumption of Industrial Kilns a Pottery Kiln as Example, Recovery of Lost Energy Using a System of Heat Exchangers and Modeling of Heat Transfer Through the Walls of the Kiln
Authors: Maha Bakkari, Fatiha Lemmeni, Rachid Tadili
Abstract:
This work deals with the problem of the energy consumption of pottery kilns, which is relatively high. We present some characteristics of the furnace studied, its operating principle, and experimental measurements of the evolution of the temperatures inside and outside its walls. We determined the sources of energy loss by studying the heat transfer of a pottery furnace, proposed a recovery system to reduce energy consumption, and then developed a numerical model of the transfers through the walls of the furnace in order to optimize the insulation (reduce heat losses) by testing multiple insulators. The recovery and reuse of the energy captured by the recovery system will yield a significant gain in the energy consumption of the kiln and in firing time. This research is one of the solutions that help reduce the greenhouse effect, a problem that worries the world.Keywords: recovery of lost energy, energy efficiency, modeling, heat transfer
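One standard way to compare insulators for a kiln wall, as in the study above, is steady-state one-dimensional conduction through a composite wall. This is a generic textbook sketch, not the paper's numerical model: the wall layers, conductivities, temperatures, and area below are hypothetical values chosen for illustration.

```python
def wall_heat_loss(t_in, t_out, area, layers):
    """Steady-state 1-D conduction through a composite wall.
    layers: list of (thickness_m, conductivity_W_per_mK) tuples.
    Returns heat loss in W: Q = A * (T_in - T_out) / sum(L_i / k_i)."""
    resistance = sum(L / k for L, k in layers)  # K*m^2/W per unit area
    return area * (t_in - t_out) / resistance

# Hypothetical kiln wall: 20 cm firebrick plus a 10 cm insulation layer.
firebrick = (0.20, 1.0)
for name, k in [("ceramic fiber", 0.08), ("mineral wool", 0.04)]:
    q = wall_heat_loss(900.0, 30.0, 6.0, [firebrick, (0.10, k)])
    print(name, round(q), "W")
```

Swapping candidate insulators into `layers` and comparing the resulting Q is exactly the kind of parameter study a wall-transfer model enables before committing to a retrofit.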
Procedia PDF Downloads 842096 Ultrahigh Thermal Stability of Dielectric Permittivity in 0.6Bi(Mg₁/₂Ti₁/₂)O₃-0.4Ba₀.₈Ca₀.₂(Ti₀.₈₇₅Nb₀.₁₂₅)O₃
Authors: Kaiyuan Chen, Senentxu Lanceros-Méndez, Laijun Liu, Qi Zhang
Abstract:
0.6Bi(Mg1/2Ti1/2)O3-0.4Ba0.8Ca0.2(Nb0.125Ti0.875)O3 (0.6BMT-0.4BCNT) ceramics with a pseudo-cubic structure and re-entrant dipole glass behavior have been investigated via X-ray diffraction and dielectric permittivity-temperature spectra. The ceramics show excellent dielectric-temperature stability, with small variations of dielectric permittivity (± 5%, 420 - 802 K) and dielectric loss tangent (tanδ < 2.5%, 441 - 647 K) over a wide temperature range. Three dielectric anomalies are observed from 290 K to 1050 K. The low-temperature, weakly coupled re-entrant relaxor behavior is described using the Vogel-Fulcher law and the new glass model. The mid- and high-temperature dielectric anomalies are characterized by isothermal impedance and electrical modulus measurements. The activation energy of both dielectric relaxation and conductivity follows the Arrhenius law in the temperature ranges 633 - 753 K and 833 - 973 K, respectively. The ultrahigh thermal stability of the dielectric permittivity is attributed to the weak coupling of polar clusters, the formation of a diffuse phase transition (DPT), and the local phase transition of the calcium-containing perovskite.Keywords: permittivity, relaxor, electronic ceramics, activation energy
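The Arrhenius-law extraction of activation energy mentioned above can be illustrated with a short regression sketch. The data below are synthetic, generated with a known Ea purely to show the method; they are not the paper's measurements.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_ea(temps_K, rates):
    """Least-squares slope of ln(rate) vs 1/T equals -Ea/kB for an
    Arrhenius process rate = r0 * exp(-Ea / (kB * T)); returns Ea in eV."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope * K_B

# Synthetic relaxation rates generated with Ea = 1.2 eV (illustration only),
# over a window like the paper's 633 - 753 K range.
Ea_true = 1.2
temps = [633, 673, 713, 753]
rates = [1e13 * math.exp(-Ea_true / (K_B * T)) for T in temps]
print(round(arrhenius_ea(temps, rates), 3))  # recovers ~1.2
```

In practice the rates come from fitting the relaxation peak frequency at each isotherm; the linearity of ln(rate) vs 1/T is itself the test that the Arrhenius law holds over the window.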
Procedia PDF Downloads 1002095 Influence of Pier Modification Techniques for Reducing Scour around Bridge Piers
Authors: Rashid Farooq, Abdul Razzaq Ghumman, Hashim Nisar Hashmi
Abstract:
Bridge piers all over the world often fail, and the whole structure may be endangered, due to scouring phenomena. Scouring has been linked to catastrophic failures that lead to the loss of human lives. Various techniques have been employed to mitigate the scouring process in order to assist bridge designs, and pier modification plays a vital role in controlling scour in the vicinity of the pier. This experimental study monitors the effectiveness of pier modification and the temporal development of scour depth around a bridge pier fitted with a collar, a cable, or openings under the same flow conditions. Provision of a collar around the octagonal pier reduced scour depth more than the other two configurations and was found to be the best option: the scour depth in front of the pier was 19.5% less than that at the octagonal pier without any modification. Similarly, the scour depth around the octagonal pier with a cable was less than that at the pier with openings. The scour depth around an octagonal pier was also compared with that of a plain circular pier and found to be 9.1% less.Keywords: scour, octagonal pier, collar, cable
Procedia PDF Downloads 2632094 Investigation of Arson Fire Incident in Textile Garment Building Using Fire Dynamic Simulation
Authors: Mohsin Ali Shaikh, Song Weiguo, Muhammad Kashan Surahio, Usman Shahid, Rehmat Karim
Abstract:
This study investigated a catastrophic arson fire that occurred at a textile garment building in Karachi, Pakistan, an event that led to the loss of 262 lives and caused 55 severe injuries. The primary objective is to analyze the aspects of the fire incident and understand the causes of arson fire disasters. Fire Dynamic Simulation (FDS) was employed to simulate fire propagation, visibility, harmful gas concentration, and fire temperature, and to obtain numerical results. The analysis determined the specific circumstances that produced the incident. The significance of the findings lies in their potential to prevent arson fires, improve fire safety measures, and support the development of safety plans in building design. The fire dynamic simulation findings can serve as a theoretical basis for the investigation of arson fires and for evacuation planning in textile garment buildings.Keywords: investigation, arson fire incident, textile garment, fire dynamic simulation (FDS)
Procedia PDF Downloads 882093 Pressure Losses on Realistic Geometry of Tracheobronchial Tree
Authors: Michaela Chovancova, Jakub Elcner
Abstract:
The real bronchial tree is a very complicated piping system, and analysis of flow and pressure losses in this system is difficult. Due to the complex geometry and the very small dimensions of the lower generations, examination by CFD is possible only in the central part of the bronchial tree; to specify the pressure losses of the lower generations, a mathematical equation is necessary. Deriving mathematical formulas for calculating the pressure losses in real lungs is, due to their complexity and diversity, a lengthy and inefficient process. For these calculations it is necessary to simplify the lungs slightly (assuming the same cross-section over the length of each generation) or to use one of the models of the lungs; the simplification can cause deviations from real values. This article compares the values of pressure losses obtained from a CFD simulation of air flow in the central part of the real bronchial tree with values calculated for slightly simplified real lungs using a mathematical relationship derived from the Bernoulli equation and the continuity equation. It then evaluates the desirability of using this formula to determine the pressure loss across the bronchial tree.Keywords: pressure gradient, airway resistance, real geometry of the bronchial tree, breathing
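A sketch of the kind of Bernoulli-plus-continuity estimate described above follows. This is not the paper's derived formula: the laminar Darcy-Weisbach friction term (f = 64/Re) is our added assumption, and the generation geometry and flow rate are illustrative Weibel-like toy values.

```python
import math

RHO = 1.2      # air density, kg/m^3
MU = 1.8e-5    # air dynamic viscosity, Pa*s

def generation_velocity(q_total, n_tubes, diameter):
    """Continuity: total flow split evenly across n parallel tubes."""
    area = math.pi * diameter ** 2 / 4
    return q_total / (n_tubes * area)

def pressure_loss(q_total, gen_a, gen_b):
    """Pressure change between two generations, each given as
    (n_tubes, diameter_m, length_m): Bernoulli dynamic-pressure change
    plus a laminar friction loss in the downstream generation."""
    n1, d1, L1 = gen_a
    n2, d2, L2 = gen_b
    v1 = generation_velocity(q_total, n1, d1)
    v2 = generation_velocity(q_total, n2, d2)
    dp_dyn = 0.5 * RHO * (v2 ** 2 - v1 ** 2)        # Bernoulli term
    re = RHO * v2 * d2 / MU                          # Reynolds number
    dp_fric = (64 / re) * (L2 / d2) * 0.5 * RHO * v2 ** 2
    return dp_dyn + dp_fric

# Toy values: generation 0 (trachea) to generation 1, quiet-breathing flow.
q = 0.5e-3  # m^3/s
print(pressure_loss(q, (1, 0.018, 0.12), (2, 0.012, 0.048)), "Pa")
```

Chaining such per-generation terms down the tree is one way to extend a pressure-loss estimate below the generations that CFD can resolve.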
Procedia PDF Downloads 3202092 A Comprehensive Survey of Artificial Intelligence and Machine Learning Approaches across Distinct Phases of Wildland Fire Management
Authors: Ursula Das, Manavjit Singh Dhindsa, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran
Abstract:
Wildland fires, also known as forest fires or wildfires, are exhibiting an alarming surge in frequency in recent times, adding to a perennial global concern. Forest fires often lead to devastating consequences, ranging from loss of healthy forest foliage and wildlife to substantial economic losses and the tragic loss of human lives. Despite the existence of substantial literature on the detection of active forest fires, numerous potential research avenues in forest fire management, such as preventative measures and the ancillary effects of forest fires, remain largely underexplored. This paper undertakes a systematic review of these underexplored areas in forest fire research, meticulously categorizing them into distinct phases, namely the pre-fire, during-fire, and post-fire stages. The pre-fire phase encompasses the assessment of fire risk, analysis of fuel properties, and other activities aimed at preventing or reducing the risk of forest fires. The during-fire phase includes activities aimed at reducing the impact of active forest fires, such as the detection and localization of active fires, optimization of wildfire suppression methods, and prediction of the behavior of active fires. The post-fire phase involves analyzing the impact of forest fires on various aspects, such as the extent of damage in forest areas, post-fire regeneration of forests, impact on wildlife, economic losses, and health impacts from byproducts produced during burning. A comprehensive understanding of the three stages is imperative for effective forest fire management and mitigation of the impact of forest fires on both ecological systems and human well-being. Artificial intelligence and machine learning (AI/ML) methods have garnered much attention in the cyber-physical systems domain in recent times, leading to their adoption in decision-making in diverse applications, including disaster management.
This paper explores the current state of AI/ML applications for managing the activities in the aforementioned phases of forest fire. While conventional machine learning and deep learning methods have been extensively explored for the prevention, detection, and management of forest fires, a systematic classification of these methods into distinct AI research domains is conspicuously absent. This paper gives a comprehensive overview of the state of forest fire research across more recent and prominent AI/ML disciplines, including big data, classical machine learning, computer vision, explainable AI, generative AI, natural language processing, optimization algorithms, and time series forecasting. By providing a detailed overview of the potential areas of research and identifying the diverse ways AI/ML can be employed in forest fire research, this paper aims to serve as a roadmap for future investigations in this domain.Keywords: artificial intelligence, computer vision, deep learning, during-fire activities, forest fire management, machine learning, pre-fire activities, post-fire activities
Procedia PDF Downloads 70