Search results for: data reduction
26437 Robust Data Image Watermarking for Data Security
Authors: Harsh Vikram Singh, Ankur Rai, Anand Mohan
Abstract:
In this paper, we propose a secure and robust data hiding algorithm based on the DCT, the Arnold transform, and a chaotic sequence. The watermark image is scrambled by the Arnold cat map to increase its security, and a chaotic map is then used to spread the watermark signal over the middle-band DCT coefficients of the cover image. The chaotic map can serve as a pseudo-random generator for digital data hiding, increasing security and robustness. The robustness and imperceptibility of the proposed algorithm have been evaluated using the bit error rate (BER), normalized correlation (NC), and peak signal-to-noise ratio (PSNR) for different watermarks, cover images (such as the Lena, Girl, and Tank images), and gain factors. We use a binary logo image and a text image as watermarks. The experimental results demonstrate that the proposed algorithm achieves higher security and robustness against JPEG compression, as well as other attacks such as noise addition, low-pass filtering, and cropping, compared to other existing algorithms using DCT coefficients. Moreover, the proposed algorithm does not need the original cover image to recover the watermark.
Keywords: data hiding, watermarking, DCT, chaotic sequence, Arnold transform
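The Arnold cat map scrambling step described above can be sketched in a few lines; this is a minimal illustration of the common map (x, y) → (x + y, x + 2y) mod N on a square image, not the authors' implementation:

```python
import numpy as np

def arnold_cat_map(img: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Scramble a square image by iterating the Arnold cat map
    (x, y) -> ((x + y) mod N, (x + 2y) mod N)."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "the Arnold transform needs a square image"
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                # each pixel moves to its mapped position
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out
```

Because the map is a permutation of pixel positions, it is periodic: applying enough further iterations recovers the original watermark, with a period that depends on the image size N.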
Procedia PDF Downloads 517
26436 A Case Study of Alkali-Silica Reaction Induced Consistent Damage and Strength Degradation Evaluation in a Textile Mill Building Due to Slow-Reactive Aggregates
Authors: Ahsan R. Khokhar, Fizza Hassan
Abstract:
Alkali-Silica Reaction (ASR) has been recognized as a potential cause of concrete degradation worldwide since the 1940s. In Pakistan, mega hydropower structures such as dams and weirs, constructed from aggregates extracted from a local riverbed, have exhibited different levels of alkali-silica reactivity over an extended service period. The concrete expansion potential of such aggregates has been categorized as slow-reactive. Apart from hydropower structures, ASR has been identified in the concrete structural elements of a textile mill building that used aggregates extracted from a nearby riverbed. The original structure of the mill was erected in the 1980s, and a textile ‘sizing and wrapping’ hall was added in the 1990s. In the years that followed, intensive spalling was observed in the structural members of the hall, severe enough to threaten the overall stability of the building. Limitations such as incomplete building data posed hurdles during the detailed structural investigation. The paper lists observations made while assessing the extent of damage and its effect on the hall structure. Core testing and petrographic tests were carried out as per ASTM standards for strength degradation analysis, followed by identification of its root cause. Results confirmed a significant reduction in structural strength because of ASR, which necessitated the formulation of an immediate re-strengthening solution. The paper also discusses possible rehabilitative measures that are being adopted to stabilize the structure and arrest further concrete expansion.
Keywords: Alkali-Silica Reaction (ASR), concrete strength degradation, damage assessment, damage evaluation
Procedia PDF Downloads 132
26435 The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties
Authors: G. Martino, F. Silva, E. Marchal
Abstract:
The control over delivered iron ore blend characteristics is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process, so end-to-end integrated planning of mine operations can reduce the risk of price penalties. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made without coordination, the beneficiation plant struggles to deliver the best possible blend. Technological improvements in several fields have allowed bridging the gap between departments and boosting integrated decision-making. Clustering and classification algorithms applied to historical production data generate reasonable predictions for the quality and volume of iron ore produced from each pile of run-of-mine (ROM) ore processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization, combined with a planning optimization model spanning the mine and ore beneficiation processes, can reduce the risk of out-of-specification deliveries. The model capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing cost or increasing profit by optimizing process indicators across the production chain and integrating the different planning stages with sales decisions.
Keywords: clustering and classification algorithms, integrated planning, mathematical modeling, optimization, penalty minimization
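As a toy illustration of the blend-fitting idea (not the paper's optimization model), the two-pile case reduces to a single mass-balance equation; the grades and target below are hypothetical:

```python
def blend_fraction(grade_a: float, grade_b: float, target: float) -> float:
    """Mass fraction of pile A so that the two-pile blend hits `target` grade.

    Solves f * grade_a + (1 - f) * grade_b = target for f.
    """
    if grade_a == grade_b:
        raise ValueError("piles have identical grade; the target is fixed")
    f = (target - grade_b) / (grade_a - grade_b)
    if not 0.0 <= f <= 1.0:
        raise ValueError("target grade is not achievable with these two piles")
    return f
```

For example, blending a 65 % Fe pile with a 58 % Fe pile to reach a 62 % Fe specification requires a 4/7 mass fraction of the richer pile; the full multi-pile, multi-period problem is what the integrated optimization model solves.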
Procedia PDF Downloads 126
26434 An Empirical Investigation of Big Data Analytics: The Financial Performance of Users versus Vendors
Authors: Evisa Mitrou, Nicholas Tsitsianis, Supriya Shinde
Abstract:
In the age of digitisation and globalisation, businesses have shifted online and are investing in big data analytics (BDA) to respond to changing market conditions and sustain their performance. Our study shifts the focus from the adoption of BDA to the impact of BDA on financial performance. We explore the financial performance of both BDA-vendors (business-to-business) and BDA-clients (business-to-customer). We distinguish between five BDA-technologies (big-data-as-a-service (BDaaS), descriptive, diagnostic, predictive, and prescriptive analytics) and discuss them individually. Further, we use four perspectives (internal business process, learning and growth, customer, and finance) and discuss how each of the five BDA-technologies affects the performance measures of these four perspectives. We also present an analysis of employee engagement, average turnover, average net income, and average net assets for BDA-clients and BDA-vendors. Our study also explores the effect of the COVID-19 pandemic on business continuity for both BDA-vendors and BDA-clients.
Keywords: BDA-clients, BDA-vendors, big data analytics, financial performance
Procedia PDF Downloads 127
26433 Performance and Specific Emissions of an SI Engine Using Anhydrous Ethanol–Gasoline Blends in the City of Bogota
Authors: Alexander García Mariaca, Rodrigo Morillo Castaño, Juan Rolón Ríos
Abstract:
The government of Colombia has promoted the use of biofuels over the last 20 years through laws and resolutions that regulate their use, with the objective of improving atmospheric air quality and promoting the Colombian agricultural industry. However, despite the use of biofuel blends with fossil fuels, the air quality in large cities has not improved. This deterioration is mainly caused by mobile sources that work with spark ignition internal combustion engines (SI-ICE) operating on a blend of 90 % gasoline and 10 % ethanol by volume, called E10, which in the case of Bogota represents 84 % of the fleet. Another problem is that Colombia has big cities located above 2200 masl, and there are no accurate studies on the impact that the E10 blend could have on the emissions and performance of SI-ICE. This study aims to establish the optimal gasoline–ethanol blend at which an SI engine operates most efficiently in urban centres located at 2600 masl. The tests were carried out on a four-stroke, single-cylinder, naturally aspirated, carburetted SI engine using blends of gasoline and anhydrous ethanol in the ratios E10, E15, E20, E40, E60, E85 and E100. The tests were conducted in the city of Bogota, located at 2600 masl, with the engine operating at 3600 rpm and at 25, 50, 75 and 100 % load. The results show that performance variables such as engine brake torque, brake power and brake thermal efficiency decrease, while brake specific fuel consumption increases, as the percentage of ethanol in the blend rises. On the other hand, the specific emissions of CO2 and NOx increase, while the specific emissions of CO and HC decrease, compared to those produced by gasoline.
From the tests, it is concluded that the SI-ICE worked most efficiently with the E40 blend, with which an increase in brake power of 8.81 % and a reduction in brake specific fuel consumption of 2.5 % were obtained, coupled with reductions in the specific emissions of CO2, HC and CO of 9.72, 52.88 and 76.66 % respectively, compared to the results obtained with the E10 blend. This behaviour occurs because the E40 blend provides the appropriate amount of oxygen for the combustion process, which leads to better utilization of the available energy, thus generating a power output comparable to the E10 blend while producing lower CO and HC emissions than the other test blends. Nevertheless, the emission of NOx increases by 106.25 %.
Keywords: emissions, ethanol, gasoline, engine, performance
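The performance variables compared above follow from standard definitions; a minimal sketch of brake specific fuel consumption and brake thermal efficiency (the fuel flow, power and lower heating value used in the example are hypothetical, not the paper's data):

```python
def brake_specific_fuel_consumption(fuel_flow_kg_h: float, brake_power_kw: float) -> float:
    """BSFC in g/kWh: fuel mass flow per unit of brake power."""
    return fuel_flow_kg_h * 1000.0 / brake_power_kw

def brake_thermal_efficiency(brake_power_kw: float, fuel_flow_kg_h: float,
                             lhv_mj_kg: float) -> float:
    """Fraction of the fuel's chemical power converted to brake power."""
    fuel_power_kw = fuel_flow_kg_h / 3600.0 * lhv_mj_kg * 1000.0  # MJ/h -> kW
    return brake_power_kw / fuel_power_kw
```

Ethanol's lower heating value is lower than gasoline's, which is why BSFC rises with ethanol content even when efficiency is similar.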
Procedia PDF Downloads 326
26432 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data
Authors: Saeid Gharechelou, Ryutaro Tateishi
Abstract:
Earthquakes are inevitable catastrophic natural disasters. Damage to buildings and man-made structures, where most human activities occur, is the major cause of earthquake casualties. A comparison of optical and SAR data is presented for the Kathmandu valley, which was severely shaken by the 2015 Nepal earthquake. Although many existing studies have produced optical-data-based estimates, or suggested the combined use of optical and SAR data for improved accuracy, the availability of cloud-free optical images when they are urgently needed is not assured. This research therefore focuses on developing a SAR-based technique targeting rapid and accurate geospatial reporting. Considering the limited time available in a post-disaster situation, the method offers quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. InSAR coherence between the pre-seismic, co-seismic and post-seismic images was used to detect changes in the damaged area. In addition, ground truth data from the field were applied to the optical data through random forest classification for detection of the damaged area, and were also used to assess the accuracy of the supervised classification approach. A higher accuracy was obtained from the optical data than from the integrated optical-SAR data. Since the availability of cloud-free images when urgently needed for an earthquake event is not assured, further research on improving SAR-based damage detection is suggested. It is expected that quick reporting of the post-disaster damage situation, quantified by rapid earthquake assessment, will assist in channelling rescue and emergency operations and in informing the public about the scale of damage.
Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid damage monitoring, 2015 Nepal earthquake
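The InSAR coherence used for change detection is the magnitude of the normalized complex cross-correlation of two co-registered SLC images; a minimal patch-level sketch (an illustration of the measure itself, not the authors' processing chain):

```python
import numpy as np

def coherence(s1: np.ndarray, s2: np.ndarray) -> float:
    """Coherence of two co-registered complex SLC patches.

    Returns a value in [0, 1]: 1 means identical phase structure
    (up to a constant phase offset), values near 0 mean decorrelation.
    """
    num = np.abs(np.sum(s1 * np.conj(s2)))
    den = np.sqrt(np.sum(np.abs(s1) ** 2) * np.sum(np.abs(s2) ** 2))
    return float(num / den)
```

Damaged areas then appear as a drop in co-seismic coherence relative to the pre-seismic pair, since collapse scrambles the scatterers inside the resolution cell.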
Procedia PDF Downloads 174
26431 The Preparation of High Surface Area Ni/MgAl2O4 Catalysts for Syngas Methanation
Authors: Jingyu Zhou, Hongfang Ma, Haitao Zhang, Weiyong Ying
Abstract:
High surface area MgAl2O4-supported nickel catalysts, with PVA loadings varying from 0 % to 15 %, were prepared by precipitation and impregnation methods. The catalysts were characterized by low-temperature N2 adsorption/desorption, X-ray diffraction and H2 temperature-programmed reduction. Compared with a Ni/γ-Al2O3 catalyst, the Ni/MgAl2O4 catalysts exhibited higher activity and selectivity at high temperature. Among the catalysts, Ni/MgAl2O4-5P, with 5 wt% PVA, showed the best performance, achieving 95 % CO conversion and 96 % CH4 selectivity at 600 °C, 2.0 MPa, and a WHSV of 12,000 mL·g⁻¹·h⁻¹. It also maintained good stability in a 50 h life test.
Keywords: methanation, MgAl2O4 support, PVA, high surface area
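The reported conversion and selectivity figures follow from the standard definitions on molar flows; a sketch with hypothetical inlet and outlet values, not the paper's raw data:

```python
def co_conversion(co_in_mol: float, co_out_mol: float) -> float:
    """Fraction of the inlet CO that reacted."""
    return (co_in_mol - co_out_mol) / co_in_mol

def ch4_selectivity(ch4_out_mol: float, co_in_mol: float, co_out_mol: float) -> float:
    """Fraction of the converted CO that ended up as CH4."""
    return ch4_out_mol / (co_in_mol - co_out_mol)
```

With 100 mol CO in, 5 mol out, and 91.2 mol CH4 produced, this gives 95 % conversion and 96 % selectivity, matching the form of the figures quoted above.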
Procedia PDF Downloads 338
26430 Scheduling Nodes Activity and Data Communication for Target Tracking in Wireless Sensor Networks
Authors: AmirHossein Mohajerzadeh, Mohammad Alishahi, Saeed Aslishahi, Mohsen Zabihi
Abstract:
In this paper, we consider sensor nodes with the capability of measuring bearings (the relative angle to the target). We use geometric methods to select a set of observer nodes which are responsible for collecting data from the target. Considering the characteristics of target tracking applications, it is clear that significant numbers of sensor nodes are usually inactive. Therefore, in order to minimize the total network energy consumption, a set of sensor nodes, called sentinels, is periodically selected for monitoring, controlling the environment and transmitting data through the network; the other nodes remain inactive. Furthermore, the proposed algorithm provides joint scheduling and routing to transmit data between network nodes and the fusion center (FC), which not only provides an efficient way to estimate the target position but also enables efficient target tracking. Performance evaluation confirms the superiority of the proposed algorithm.
Keywords: coverage, routing, scheduling, target tracking, wireless sensor networks
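Bearings-only position estimation from several observer nodes reduces to intersecting the bearing lines in a least-squares sense; a minimal sketch (the observer geometry is hypothetical, and this stands in for, rather than reproduces, the paper's geometric selection method):

```python
import numpy as np

def locate_target(positions, bearings):
    """Least-squares intersection of bearing lines from observer nodes.

    positions: iterable of (x, y) observer coordinates.
    bearings:  angles in radians, measured from the +x axis.
    Returns the estimated target position as a length-2 array.
    """
    A, b = [], []
    for (px, py), theta in zip(positions, bearings):
        # the target lies on the line through (px, py) with direction
        # (cos theta, sin theta); n is the unit normal to that line
        n = np.array([-np.sin(theta), np.cos(theta)])
        A.append(n)
        b.append(n @ np.array([px, py]))
    t, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return t
```

With two or more observers whose bearing lines are not parallel, the system is well posed; extra observers simply over-determine it and average out measurement noise.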
Procedia PDF Downloads 380
26429 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation is difficult for laymen and is usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate prices. The present study analyzed actual building sales data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Building on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
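The weighted-values idea can be illustrated with ordinary least squares over building-condition features; a minimal numpy sketch (the feature names and values are hypothetical, and the study itself worked in RapidMiner Studio rather than code):

```python
import numpy as np

def fit_price_model(features: np.ndarray, prices: np.ndarray) -> np.ndarray:
    """Fit linear weights (with an intercept) by ordinary least squares.

    features: (n_samples, n_features), e.g. columns for area and age.
    Returns [intercept, w_1, ..., w_k]; each w_j is the learned
    weight (influence) of building condition j on price.
    """
    X = np.column_stack([np.ones(len(features)), features])
    w, *_ = np.linalg.lstsq(X, prices, rcond=None)
    return w

def predict(w: np.ndarray, x) -> float:
    """Predicted price for a new example x."""
    return float(w[0] + np.asarray(x) @ w[1:])
```

The magnitudes of the fitted weights play the role of the "weighted values" above: larger weights mark the conditions that move the price most.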
Procedia PDF Downloads 375
26428 Blockchain Technology Security Evaluation: Voting System Based on Blockchain
Authors: Omid Amini
Abstract:
Nowadays, technology plays the most important role in human life because people use it to share data and to communicate with each other, but the challenge is the security of that data. As more people turn to technology, more data is generated, and more hackers try to steal or infiltrate it. In addition, data is often under the control of a central authority, which raises the risk of information being lost or altered; this can create widespread anxiety for people in different communities. In this paper, we investigate blockchain technology, which can guarantee information security and eliminate the challenge of central-authority access to information. People currently suffer from the shortcomings of existing voting systems: the lack of transparency in voting is a big problem for society and government in most countries, but blockchain technology can be the best alternative to previous voting methods because it removes that most important challenge. According to the results, this research can be a good starting point for getting acquainted with this new technology, especially its security aspects, and with how to use a blockchain-based voting system. It is concluded that the use of blockchain technology can solve the major security problem and lead to a secure and transparent election.
Keywords: blockchain, technology, security, information, voting system, transparency
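The tamper-evidence property that makes blockchain attractive for voting can be shown with a minimal hash-chained ledger; this is a sketch of the core idea only, far from a real distributed, anonymous voting protocol:

```python
import hashlib
import json

class VoteChain:
    """A toy hash chain: each block stores a vote and the digest of the
    previous block, so altering any earlier vote breaks verification."""

    def __init__(self):
        self.blocks = []  # each block: {"vote": ..., "prev": hex digest}

    def _hash(self, block: dict) -> str:
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_vote(self, vote: str) -> None:
        prev = self._hash(self.blocks[-1]) if self.blocks else "0" * 64
        self.blocks.append({"vote": vote, "prev": prev})

    def verify(self) -> bool:
        """Recompute every link; any retroactive edit is detected."""
        for i in range(1, len(self.blocks)):
            if self.blocks[i]["prev"] != self._hash(self.blocks[i - 1]):
                return False
        return True
```

A production system would add distributed consensus, voter authentication and ballot secrecy on top of this chaining; the sketch shows only why central tampering becomes detectable.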
Procedia PDF Downloads 138
26427 The Therapeutic Effects of Acupuncture on Oral Dryness and Antibody Modification in Sjogren Syndrome: A Meta-Analysis
Authors: Tzu-Hao Li, Yen-Ying Kung, Chang-Youh Tsai
Abstract:
Oral dryness is a common chief complaint among patients with Sjőgren syndrome (SS), a disorder currently understood in terms of autoantibody production; however, to the authors' best knowledge, there has been no satisfactory pharmacological therapy to relieve the associated symptoms. Hence, the effectiveness of non-pharmacological interventions such as acupuncture should be assessed. We conducted a meta-analysis of randomized clinical trials (RCTs) that evaluated the effectiveness of acupuncture for xerostomia in SS. PubMed, Embase, the Cochrane Central Register of Controlled Trials (CENTRAL), the Chongqing Weipu Database (CQVIP), the China Academic Journals Full-text Database, AiritiLibrary, the Chinese Electronic Periodicals Service (CEPS) and the China National Knowledge Infrastructure (CNKI) Database were searched through May 12, 2018 to select studies. Data for the evaluation of subjective and objective xerostomia were extracted and assessed with random-effects meta-analysis. The search yielded a total of 541 references, and five RCTs were included, covering 340 patients with dry mouth resulting from SS, among whom 169 received acupuncture and 171 served as controls. Acupuncture was associated with a higher subjective response rate (odds ratio 3.036, 95% confidence interval [CI] 1.828 – 5.042, P < 0.001) and an increased salivary flow rate (weighted mean difference [WMD] 3.066, 95% CI 2.969 – 3.164, P < 0.001), an objective marker. In addition, two studies examined IgG levels, which were lower in the acupuncture group (WMD -166.857, 95% CI -233.138 - -100.576, P < 0.001). Therefore, in the present meta-analysis, acupuncture improves both subjective and objective markers of dry mouth, with autoantibody reduction, in patients with SS, and is considered an option for non-pharmacological treatment of SS.
Keywords: acupuncture, meta-analysis, Sjogren syndrome, xerostomia
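The pooled weighted mean differences above come from inverse-variance weighting of the per-study effects; a minimal fixed-effect sketch (the meta-analysis itself used a random-effects model, and the numbers in the test are illustrative, not the study data):

```python
def pooled_mean_difference(effects, std_errs):
    """Fixed-effect inverse-variance pooled estimate and its standard error.

    effects:  per-study mean differences.
    std_errs: per-study standard errors of those differences.
    """
    weights = [1.0 / se ** 2 for se in std_errs]       # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5                   # SE of the pooled estimate
    return pooled, se
```

A random-effects model additionally inflates each study's variance by a between-study heterogeneity term before weighting, which widens the confidence interval.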
Procedia PDF Downloads 126
26426 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications, such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as large computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the k-mer size increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4 respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists among accuracy, computing resources and the explainability of classification results. The analysis also provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
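The k-mer representation at the core of the approach can be sketched in a few lines; this illustrates the general technique of turning a sequence into a count vector, not the authors' pipeline:

```python
from collections import Counter

def kmer_counts(seq: str, k: int) -> Counter:
    """Counts of all overlapping k-mers in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kmer_vector(seq: str, k: int, vocabulary) -> list:
    """Fixed-length feature vector: the count of each vocabulary k-mer in seq."""
    counts = kmer_counts(seq, k)
    return [counts.get(kmer, 0) for kmer in vocabulary]
```

Larger k gives sparser, more sequence-specific features (there are 4^k possible DNA k-mers), which matches the trade-off between discrimination and computing cost discussed above.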
Procedia PDF Downloads 172
26425 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications, such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as large computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the k-mer size increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0 %, 80.5 %, 80.5 %, 63.6 %, and 0.4 respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists among accuracy, computing resources and the explainability of classification results. The analysis also provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
Procedia PDF Downloads 163
26424 Livability and Growth Performance of Noiler Chickens Fed with Different Biotic Additives
Authors: Idowu Kemi Ruth, Adeyemo Adedayo Akinade, Iyanda Adegboyega Ibukun, Idowu Olubukola Precious Akinade
Abstract:
Liveability and mortality rate are germane aspects of production performance that cannot be overlooked in poultry production, while disease is a major threat in the poultry industry that can cause major losses for the farmer and a reduction in the total income generated from the stock. Therefore, efforts must be made to enhance the health status of chickens and reduce mortality. This study investigated the effect of different biotic additives (prebiotic, probiotic and synbiotic) on the performance of female Noiler chickens from the growing phase (forty-nine days) to the point of first egg. A total of one hundred and twenty-eight female Noilers were used for the experiment. The experimental treatments consisted of prebiotic, probiotic, synbiotic and control groups, at an inclusion rate of one gram per kilogram of feed. The parameters measured were feed intake, feed conversion ratio, weight of the first egg, age at first egg and livability. The data collected were subjected to one-way analysis of variance. The results revealed better growth performance across the treatments than in the control group, which had the lowest final weight at nineteen weeks at point of lay. The prebiotic treatment had the earliest age at first lay, on day one hundred and thirty-seven, followed by the other treatments on day one hundred and fifty-four. However, egg size was not significantly influenced by the biotic additives. Hence, it can be concluded that the inclusion of different biotic additives influenced growth performance; likewise, the prebiotic had a significant effect on the age at first lay in Noiler chickens, and livability was one hundred percent throughout the experiment.
Keywords: prebiotic, probiotic, synbiotic, noiler
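The one-way analysis of variance applied to the collected data compares between-group and within-group variability; a minimal pure-Python sketch of the F statistic (the group values in the test are hypothetical, not the study's measurements):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of samples.

    groups: list of lists of measurements, one inner list per treatment.
    F = (between-group mean square) / (within-group mean square).
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

The F value is then compared against the F distribution with (k - 1, n - k) degrees of freedom to decide whether the treatment means differ significantly.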
Procedia PDF Downloads 98
26423 Design and Implementation of Flexible Metadata Editing System for Digital Contents
Authors: K. W. Nam, B. J. Kim, S. J. Lee
Abstract:
Along with the development of network infrastructures such as high-speed Internet and mobile environments, the explosion of multimedia data is expanding the range of multimedia services beyond voice and data services. Amid this trend, research is actively being done on the creation, management, and transmission of metadata on digital content, in order to provide different services to users. This paper proposes a system for the insertion, storage, and retrieval of metadata about digital content. A metadata server using Binary XML was implemented for efficient storage space and retrieval speed, and the transport data size required for metadata retrieval was reduced. With the proposed system, metadata can be attached to moving objects in a video, and unnecessary overlap can be minimized by improving the storage structure of the metadata. The proposed system can assemble metadata into one relevant topic, even if it is expressed in different media or in different forms. It is expected that the proposed system will handle complex network types of data.
Keywords: video, multimedia, metadata, editing tool, XML
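The insertion-and-retrieval flow can be sketched with plain XML (the paper's server uses Binary XML for compactness; the element and attribute names here are hypothetical, chosen only for illustration):

```python
import xml.etree.ElementTree as ET

def insert_metadata(video_id: str, entries: dict) -> bytes:
    """Serialize a dict of metadata fields for one video as XML."""
    root = ET.Element("metadata", attrib={"video": video_id})
    for key, value in entries.items():
        ET.SubElement(root, "field", attrib={"name": key}).text = value
    return ET.tostring(root)

def retrieve_field(xml_bytes: bytes, name: str):
    """Look up one metadata field by name; None if absent."""
    root = ET.fromstring(xml_bytes)
    for field in root.findall("field"):
        if field.get("name") == name:
            return field.text
    return None
```

A Binary XML encoding keeps the same tree model but replaces the textual tags with tokenized byte codes, which is what shrinks both the stored size and the transport size for retrieval.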
Procedia PDF Downloads 175
26422 Intelligent Fault Diagnosis for the Connection Elements of Modular Offshore Platforms
Authors: Jixiang Lei, Alexander Fuchs, Franz Pernkopf, Katrin Ellermann
Abstract:
Within the Space@Sea project, funded by the Horizon 2020 programme, an island consisting of multiple platforms was designed. The platforms are connected by ropes and fenders, and the connection is critical with respect to the safety of the whole system. Therefore, fault detection systems that could detect early warning signs of a possible failure in the connection elements are investigated. Previously, a model-based method using an extended Kalman filter was developed to detect the reduction of rope stiffness. This method detected several types of faults reliably, but other types were much more difficult to detect. Furthermore, the model-based method is sensitive to environmental noise: when the wave height is low, a long time is needed to detect a fault, and the accuracy is not always satisfactory. It is therefore necessary to develop a more accurate and robust technique that can detect all rope faults under a wide range of operational conditions. Inspired by this work on the Space@Sea design, we introduce a fault diagnosis method based on deep neural networks. Our method can not only detect rope degradation by using the acceleration data from each platform but also estimate the contributions of specific acceleration sensors using methods from explainable AI. In order to adapt to different operational conditions, the domain adaptation technique DANN is applied. The proposed model can accurately estimate rope degradation under a wide range of environmental conditions and helps users understand the relationship between the output and the contributions of each acceleration sensor.
Keywords: fault diagnosis, deep learning, domain adaptation, explainable AI
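A far simpler baseline than the deep network described above is a windowed RMS indicator on the acceleration signal; this hedged numpy sketch is not the paper's method, only an illustration of turning raw acceleration data into a degradation alarm:

```python
import numpy as np

def rms_features(acc: np.ndarray, window: int) -> np.ndarray:
    """Root-mean-square of the acceleration per non-overlapping window."""
    n = len(acc) // window
    return np.sqrt((acc[: n * window].reshape(n, window) ** 2).mean(axis=1))

def degradation_alarm(features: np.ndarray, baseline_mean: float,
                      baseline_std: float, k: float = 3.0) -> np.ndarray:
    """Flag windows whose RMS deviates more than k sigma from a healthy baseline."""
    return np.abs(features - baseline_mean) > k * baseline_std
```

Such a threshold rule fails exactly where the paper's approach aims to succeed: it cannot separate a changed sea state from a degraded rope, which is what the domain adaptation (DANN) step addresses.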
Procedia PDF Downloads 184
26421 System for Monitoring Marine Turtles Using Unstructured Supplementary Service Data
Authors: Luís Pina
Abstract:
The conservation of marine biodiversity keeps ecosystems in balance and ensures the sustainable use of resources. In this context, technological resources have been used to monitor marine species, allowing biologists to obtain data in real time. Different mobile applications have been developed for data collection for monitoring purposes, but these systems are designed to be used only on third-generation (3G) phones or smartphones with Internet access; in rural parts of developing countries, however, Internet services and smartphones are scarce. Thus, the objective of this work is to develop a system to monitor marine turtles using Unstructured Supplementary Service Data (USSD), which users can access through basic mobile phones. The system aims to improve the data collection mechanism and enhance the effectiveness of current systems for monitoring sea turtles using any type of mobile device, without Internet access. The system will be able to report information related to the biological activities of marine turtles. It will also be used as a platform to help marine conservation entities receive reports of illegal sales of sea turtles. The system can further be utilized as an educational tool for communities, providing knowledge and allowing the inclusion of communities in the process of monitoring marine turtles. Therefore, this work may contribute information to decision-making and to the implementation of contingency plans for marine conservation programs.
Keywords: GSM, marine biology, marine turtles, unstructured supplementary service data (USSD)
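A USSD interaction is a stateless request/response exchange in which the gateway forwards the '*'-joined history of the user's inputs on each request; a minimal menu sketch (using the CON/END reply convention of common USSD gateways; the menu text and flow are hypothetical, not the described system):

```python
def ussd_handler(text: str) -> str:
    """Respond to one USSD request; `text` is the '*'-joined input history.

    Replies starting with 'CON' keep the session open for more input;
    replies starting with 'END' close the session.
    """
    steps = text.split("*") if text else []
    if not steps:
        return "CON Marine Turtle Monitor\n1. Report sighting\n2. Report illegal sale"
    if steps[0] == "1":
        if len(steps) == 1:
            return "CON Enter beach name:"
        return f"END Sighting at {steps[1]} recorded. Thank you."
    if steps[0] == "2":
        return "END Report received. Authorities will be notified."
    return "END Invalid option."
```

Because every reply fits in a short text menu and no Internet connection is needed, the same flow works on any GSM feature phone.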
Procedia PDF Downloads 208
26420 A Variant of a Double Structure-Preserving QR Algorithm for Symmetric and Hamiltonian Matrices
Authors: Ahmed Salam, Haithem Benkahla
Abstract:
Recently, an efficient backward-stable algorithm for computing the eigenvalues and eigenvectors of a symmetric and Hamiltonian matrix has been proposed. The method preserves the symmetric and Hamiltonian structures of the original matrix during the whole process. In this paper, we revisit the method. We derive a way of implementing the reduction of the matrix to the appropriate condensed form, and then construct a novel version of the implicit QR algorithm for computing the eigenvalues and eigenvectors.
Keywords: block implicit QR algorithm, preservation of a double structure, QR algorithm, symmetric and Hamiltonian structures
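The Hamiltonian structure being preserved can be stated concretely: a 2n x 2n matrix M is Hamiltonian when JM is symmetric, with J the canonical skew matrix; a small numpy check of that definition (illustrative only, not the paper's algorithm):

```python
import numpy as np

def hamiltonian_j(n: int) -> np.ndarray:
    """Canonical skew matrix J = [[0, I], [-I, 0]] of size 2n x 2n."""
    J = np.zeros((2 * n, 2 * n))
    J[:n, n:] = np.eye(n)
    J[n:, :n] = -np.eye(n)
    return J

def is_hamiltonian(M: np.ndarray) -> bool:
    """M is Hamiltonian iff J @ M is symmetric."""
    n = M.shape[0] // 2
    JM = hamiltonian_j(n) @ M
    return bool(np.allclose(JM, JM.T))
```

Equivalently, M has the block form [[A, G], [Q, -A^T]] with G and Q symmetric; a structure-preserving algorithm keeps every intermediate matrix in this class, so the eigenvalue pairing of Hamiltonian matrices is never lost to rounding.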
Procedia PDF Downloads 412
26419 Lessons Learned through a Bicultural Approach to Tsunami Education in Aotearoa New Zealand
Authors: Lucy H. Kaiser, Kate Boersen
Abstract:
Kura Kaupapa Māori (kura) and bilingual schools are primary schools in Aotearoa/New Zealand which operate fully or partially under Māori custom and have curricula developed to include Te Reo Māori and Tikanga Māori (Māori language and cultural practices). These schools were established to support Māori children and their families by reinforcing cultural identity, enabling Māori language and culture to flourish in the field of education. Māori kaupapa (values), Mātauranga Māori (Māori knowledge) and Te Reo are crucial considerations for the development of educational resources for kura, bilingual and mainstream schools. The inclusion of hazard risk in education has become an important issue in New Zealand due to the vulnerability of communities to a plethora of different hazards. Māori have an extensive knowledge of their local area and the history of hazards, which is often not appropriately recognised within mainstream hazard education resources. Researchers from the Joint Centre for Disaster Research, Massey University and East Coast LAB (Life at the Boundary) in Napier were funded to collaboratively develop a toolkit of tsunami risk reduction activities with schools located in Hawke’s Bay’s tsunami evacuation zones. A Māori-led bicultural approach to developing and running the education activities was taken, focusing on creating culturally and locally relevant materials for students and schools as well as giving students a proactive role in making their communities better prepared for a tsunami event. The community-based participatory research is Māori-centred, framed by qualitative and Kaupapa Māori research methodologies, and utilizes a range of data collection methods including interviews, focus groups and surveys. Māori participants, stakeholders and the researchers collaborated throughout the duration of the project to ensure the programme would align with the wider school curricula and kaupapa values.
The education programme applied a tuakana/teina Māori teaching and learning approach, in which high-school-aged students (tuakana) developed tsunami preparedness activities to run with primary school students (teina). At the end of the programme, high school students were asked to reflect on their participation, what they had learned, and what they had enjoyed during the activities. This paper draws on lessons learned throughout this research project. As an exemplar, retaining a bicultural and bilingual perspective resulted in a more inclusive project, as there was variability across the students' levels of confidence in using Te Reo and Māori knowledge and cultural frameworks. Providing a range of different learning and experiential activities, including waiata (Māori songs), pūrākau (traditional stories), and games, was important to ensure students had the opportunity to participate and contribute using approaches appropriate to their individual learning needs. The inclusion of teachers in facilitation also proved beneficial in assisting classroom behaviour management. Lessons were framed by the tikanga and kawa (protocols) of the school to maintain cultural safety for the researchers and the students. Finally, the tuakana/teina component of the education activities became the crux of the programme, demonstrating a path for rangatahi to support their whānau and communities by facilitating disaster preparedness, risk reduction, and resilience.
Keywords: school safety, indigenous, disaster preparedness, children, education, tsunami
Procedia PDF Downloads 126

26418 “Octopub”: Geographical Sentiment Analysis Using Named Entity Recognition from Social Networks for Geo-Targeted Billboard Advertising
Authors: Oussama Hafferssas, Hiba Benyahia, Amina Madani, Nassima Zeriri
Abstract:
Although data nowadays takes multiple forms, from text to images and from audio to video, text is still the most widely used at the public level. At an academic and research level, and unlike the other forms, text can be considered the easiest form to process. Therefore, a branch of data mining research has always operated in its shadow, called "text mining". Its concept is just like data mining's: finding valuable patterns in large collections and tremendous volumes of data, in this case, text. Named entity recognition (NER) is one of text mining's disciplines; it aims to extract and classify references such as proper names, locations, expressions of time and dates, organizations, and more in a given text. Our approach, "Octopub", does not aim to find new ways to improve the named entity recognition process itself; rather, it is about finding a new, and yet smart, way to use NER so that we can extract the sentiments of millions of people, using social networks as a limitless information source and marketing for product promotion as the main domain of application.
Keywords: text mining, named entity recognition (NER), sentiment analysis, social media networks (SN, SMN), business intelligence (BI), marketing
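The combination described, recognizing location entities and attaching a sentiment score to them, can be sketched in a few lines. The gazetteer, sentiment lexicon, and sample posts below are toy assumptions for illustration; a real system would use a trained NER model and a proper sentiment classifier.

```python
import re
from collections import defaultdict

# Toy resources (assumptions, not the paper's actual data):
LOCATIONS = {"Algiers", "Oran", "Paris"}
POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "awful", "terrible"}

def geo_sentiment(posts):
    """Aggregate a naive lexicon-based sentiment score per recognized location."""
    scores = defaultdict(int)
    for post in posts:
        tokens = re.findall(r"[A-Za-z]+", post)
        places = [t for t in tokens if t in LOCATIONS]  # gazetteer "NER"
        score = sum(t.lower() in POSITIVE for t in tokens) \
              - sum(t.lower() in NEGATIVE for t in tokens)
        for place in places:
            scores[place] += score
    return dict(scores)

posts = ["I love the new phone, greetings from Algiers",
         "Awful traffic in Algiers today",
         "Great concert in Oran"]
result = geo_sentiment(posts)
```

The per-location totals are exactly the kind of signal that could drive geo-targeted billboard placement: advertise where aggregate sentiment toward a product is favorable.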
Procedia PDF Downloads 591

26417 The Trend of Injuries in Building Fire in Tehran from 2002 to 2012
Authors: Mohammadreza Ashouri, Majid Bayatian
Abstract:
Analysis of fire data is a prerequisite for any plan to improve the level of safety in cities. Such an analysis is able to reveal signs of change in a given period and can be used as a measure of safety. Information on about 66,341 fires (from 2002 to 2012) released by the Tehran Safety Services and Fire-Fighting Organization, together with data on the population and the number of households provided by Tehran Municipality and the Statistical Yearbook of Iran, was extracted. Using these data, the changes in fires, the rate of injuries, and the mortality rate were determined and analyzed. The rates of injuries and mortality due to fires were 59.58 and 86.12 per one million population of Tehran, respectively. During the study period, the number of fires and fire stations increased by 104.38% and 102.63%, respectively. Most fires (9.21%) happened in the 4th District of Tehran. The results showed that the recording of fire data has not been systematically planned for fire prevention, since one of the ways to reduce injuries caused by fires is to develop a systematic plan for necessary actions in emergency situations. To establish a reliable source for fire prevention, the stages, definitions of working processes, and cause-and-effect chains should be considered. Therefore, a comprehensive statistical system should be developed for reported and recorded fire data.
Keywords: fire statistics, fire analysis, accident prevention, Tehran
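The per-million rates the study reports are a simple normalization of case counts by population. A back-of-the-envelope sketch follows; the population figure and injury count below are assumptions for illustration, since the abstract reports only the resulting rates, not the raw counts.

```python
def rate_per_million(cases, population):
    """Incidence per one million inhabitants."""
    return cases / population * 1_000_000

population = 8_700_000   # assumed population of Tehran (illustrative)
injuries = 518           # hypothetical number of fire injuries
rate = rate_per_million(injuries, population)
```

Normalizing by population in this way is what makes rates comparable across districts of different sizes, or across the study years as the city grows.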
Procedia PDF Downloads 188

26416 Design and Implementation a Virtualization Platform for Providing Smart Tourism Services
Authors: Nam Don Kim, Jungho Moon, Tae Yun Chung
Abstract:
This paper proposes an Internet of Things (IoT) based virtualization platform for providing smart tourism services. The virtualization platform provides a consistent access interface to various types of data by naming IoT devices and legacy information systems as pathnames in a virtual file system. In other words, the IoT virtualization platform functions as middleware that uses metadata to describe the underlying collected data. The proposed platform makes it easy to provide customized tourism information by using tourist locations collected by IoT devices and additionally makes it possible to create new interactive smart tourism services focused on those locations. The proposed platform is very efficient in that the provided tourism services are isolated from changes in the raw data, and the services can be modified or expanded without changing the underlying data structure.
Keywords: internet of things (IoT), IoT platform, service platform, virtual file system (VFS)
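The core idea, exposing heterogeneous data sources behind a single pathname namespace, can be sketched with a dictionary mapping paths to reader callables. The class name, paths, and sample payloads below are assumptions for illustration, not the paper's implementation.

```python
class VirtualFileSystem:
    """Minimal sketch: name IoT devices and legacy systems as pathnames."""

    def __init__(self):
        self._nodes = {}

    def mount(self, path, reader):
        """Register a data source (IoT device or legacy system) at a path."""
        self._nodes[path] = reader

    def read(self, path):
        """Fetch current data through the uniform pathname interface."""
        if path not in self._nodes:
            raise FileNotFoundError(path)
        return self._nodes[path]()

vfs = VirtualFileSystem()
vfs.mount("/devices/gps/bus-12", lambda: {"lat": 37.45, "lon": 129.17})
vfs.mount("/legacy/attractions", lambda: ["museum", "harbor"])

location = vfs.read("/devices/gps/bus-12")
```

Because services only ever call `read()` on a path, a device can be swapped for a different model (a new reader mounted at the same path) without any change to the services above it, which is the isolation property the abstract emphasizes.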
Procedia PDF Downloads 508

26415 Zn-, Mg- and Ni-Al-NO₃ Layered Double Hydroxides Intercalated by Nitrate Anions for Treatment of Textile Wastewater
Authors: Fatima Zahra Mahjoubi, Abderrahim Khalidi, Mohamed Abdennouri, Omar Cherkaoui, Noureddine Barka
Abstract:
Industrial effluents are one of the major causes of environmental pollution, especially effluents discharged from various dyestuff, plastic, and papermaking industries. These effluents can give rise to certain hazards and environmental problems owing to their highly colored suspended organic solids. Dye effluents are not only aesthetic pollutants; coloration of water by dyes may also affect photochemical activities in aquatic systems by reducing light penetration. It has also been reported that several commonly used dyes are carcinogenic and mutagenic for aquatic organisms. Therefore, removing dyes from effluents is of significant importance. Many adsorbent materials have been prepared for the removal of dyes from wastewater, including anionic clays, or layered double hydroxides. The zinc/aluminium (Zn-Al-NO₃), magnesium/aluminium (Mg-Al-NO₃) and nickel/aluminium (Ni-Al-NO₃) layered double hydroxides (LDHs) were successfully synthesized via the coprecipitation method. Samples were characterized by XRD, FTIR, TGA/DTA, TEM and pHPZC analysis. XRD patterns showed a basal spacing increase in the order Zn-Al-NO₃ (8.85 Å) > Mg-Al-NO₃ (7.95 Å) > Ni-Al-NO₃ (7.82 Å). The FTIR spectra confirmed the presence of nitrate anions in the LDH interlayer. The TEM images indicated that Zn-Al-NO₃ presents circular-shaped particles with an average particle size of approximately 30 to 40 nm. Small plates assigned to sheets with hexagonal form were observed in the case of Mg-Al-NO₃. Ni-Al-NO₃ displays nanostructured spheres 5 to 10 nm in diameter. The LDHs were used as adsorbents for the removal of methyl orange (MO), as a model dye, and for the treatment of an effluent generated by a textile factory. Adsorption experiments for MO were carried out as a function of solution pH, contact time and initial dye concentration. Maximum adsorption occurred at acidic solution pH. Kinetic data were tested using pseudo-first-order and pseudo-second-order kinetic models.
The best fit was obtained with the pseudo-second-order kinetic model. Equilibrium data were correlated with the Langmuir and Freundlich isotherm models. The best conditions for color and COD removal from the textile effluent sample were obtained at lower pH values. Total color removal was obtained with the Mg-Al-NO₃ and Ni-Al-NO₃ LDHs. Reduction of COD to the limits authorized by Moroccan standards was obtained with a 0.5 g/L LDH dose.
Keywords: chemical oxygen demand, color removal, layered double hydroxides, textile wastewater treatment
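The pseudo-second-order model that fit best is usually estimated through its linearized form t/qₜ = 1/(k₂·qₑ²) + t/qₑ, where regressing t/qₜ against t gives qₑ from the slope and k₂ from the intercept. The sketch below fits synthetic data (not the paper's measurements) generated from assumed parameters and recovers them:

```python
def fit_pseudo_second_order(t, qt):
    """Least-squares fit of t/qt against t; returns (qe, k2)."""
    y = [ti / qi for ti, qi in zip(t, qt)]
    n = len(t)
    mean_t = sum(t) / n
    mean_y = sum(y) / n
    slope = sum((ti - mean_t) * (yi - mean_y) for ti, yi in zip(t, y)) \
          / sum((ti - mean_t) ** 2 for ti in t)
    intercept = mean_y - slope * mean_t
    qe = 1.0 / slope                    # equilibrium uptake, mg/g
    k2 = 1.0 / (intercept * qe ** 2)    # rate constant, g/(mg*min)
    return qe, k2

# Synthetic kinetics from assumed qe = 50 mg/g, k2 = 0.01 g/(mg*min),
# using the integrated model qt = k2*qe^2*t / (1 + k2*qe*t):
qe_true, k2_true = 50.0, 0.01
t = [5, 10, 20, 40, 60, 120]
qt = [k2_true * qe_true ** 2 * ti / (1 + k2_true * qe_true * ti) for ti in t]

qe, k2 = fit_pseudo_second_order(t, qt)
```

With real adsorption data, the quality of this linear fit (e.g. its R²) compared with the pseudo-first-order fit is what justifies statements like "the best fit was obtained with the pseudo-second-order kinetic model".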
Procedia PDF Downloads 358

26414 A Review on 3D Smart City Platforms Using Remotely Sensed Data to Aid Simulation and Urban Analysis
Authors: Slim Namouchi, Bruno Vallet, Imed Riadh Farah
Abstract:
3D urban models provide powerful tools for decision making, urban planning, and smart city services. The accuracy of these 3D based systems is directly related to the quality of the models. Since manual large-scale modeling of cities or countries is a highly time-intensive and very expensive process, fully automatic 3D building generation is needed. However, the result of the 3D modeling process depends on the input data, the properties of the captured objects, and the required characteristics of the reconstructed 3D model. Nowadays, producing a 3D real-world model is no longer a problem. Remotely sensed data have experienced a remarkable increase in recent years, especially data acquired using unmanned aerial vehicles (UAVs). As scanning techniques develop, the amount of captured data grows and its resolution becomes more precise. This paper presents a literature review that aims to identify different methods of automatic 3D building extraction, either from LiDAR alone or from the combination of LiDAR and satellite or aerial images. Then, we present open source technologies and data models (e.g., CityGML, PostGIS, Cesiumjs) used to integrate these models into geospatial base layers for smart city services.
Keywords: CityGML, LiDAR, remote sensing, SIG, Smart City, 3D urban modeling
Procedia PDF Downloads 137

26413 Structural Damage Detection via Incomplete Model Data Using Output Data Only
Authors: Ahmed Noor Al-qayyim, Barlas Özden Çağlayan
Abstract:
Structural failure is caused mainly by damage that often occurs in structures. Many researchers focus on obtaining very efficient tools to detect damage in structures at an early stage. In the past decades, a subject that has received considerable attention in the literature is damage detection as determined by variations in the dynamic characteristics or response of structures. This study presents a new damage identification technique. The technique detects the damage location for an incomplete structural system using output data only. The method locates the damage from free vibration test data by using the “Two Points Condensation (TPC)” technique. This method creates a set of matrices by reducing the structural system to two-degree-of-freedom systems. The current stiffness matrices are obtained from optimization of the equation of motion using the measured test data, and are compared with the original (undamaged) stiffness matrices. High percentage changes in the matrices’ coefficients indicate the location of the damage. The TPC technique is applied to experimental data from a simply supported steel beam model structure after inducing a thickness change in one element. Two cases are considered; the method detects the damage and determines its location accurately in both. In addition, the results illustrate that these changes in the stiffness matrix can be a useful tool for continuous monitoring of structural safety using ambient vibration data. Furthermore, its efficiency proves that this technique can also be used for large structures.
Keywords: damage detection, optimization, signal processing, structural health monitoring, two points condensation
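The final comparison step, flagging damage where current stiffness coefficients deviate strongly from the undamaged baseline, can be sketched as follows. The 2×2 matrices and the 10% threshold are illustrative assumptions, not the paper's beam data; the identification of the "current" matrix from measured vibrations is the harder part the paper addresses and is not reproduced here.

```python
import numpy as np

def damage_map(K_undamaged, K_current, threshold=0.10):
    """Relative change per coefficient; True marks a suspected damage location."""
    change = np.abs(K_current - K_undamaged) / np.abs(K_undamaged)
    return change > threshold

# Hypothetical condensed two-DOF stiffness matrices:
K0 = np.array([[ 2.0, -1.0],
               [-1.0,  2.0]])
K1 = np.array([[ 2.0, -1.0],
               [-1.0,  1.5]])   # 25 % stiffness drop at DOF 2: simulated damage

flags = damage_map(K0, K1)
```

Sweeping the two-point condensation along the structure and applying this comparison to each condensed pair is what localizes the damage to a specific element.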
Procedia PDF Downloads 367

26412 Spontaneous Message Detection of Annoying Situation in Community Networks Using Mining Algorithm
Authors: P. Senthil Kumari
Abstract:
The main concerns in data mining investigations are the social controls of data mining for handling ambiguity, noise, or incompleteness in text data. We describe an innovative approach for the detection of spontaneous text data in community networks, achieved by a classification mechanism. A tangible domain claim, with the modest privacy backgrounds provided by a community network for avoiding annoying content, is presented on consumer message partitioning. To this end, the mining methodology provides the capability to directly switch the messages and similarly recover the quality of their ordering. Here, we designed learning-centered mining approaches with a pre-processing technique to accomplish this effort. Our work deals with rule-based personalization for automatic text categorization, which is appropriate in many dissimilar frameworks and offers a tolerance value that permits the background of comments to be assessed according to a variety of conditions associated with the policy or rule arrangements processed by the learning algorithm. Remarkably, we find that the choice of classifier predicts the class labels for the control of inadequate documents on the community network to great effect.
Keywords: text mining, data classification, community network, learning algorithm
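Rule-based categorization of messages, as a filter for annoying content, can be sketched very compactly; the rule lists and labels below are invented for illustration only, and a learning algorithm would replace the hand-written patterns with learned ones.

```python
# Hypothetical rule table: label -> trigger phrases.
RULES = {
    "spam":    ("buy now", "free offer", "click here"),
    "abusive": ("idiot", "stupid"),
}

def classify(message, default="clean"):
    """Return the first rule label whose pattern occurs in the message."""
    text = message.lower()
    for label, patterns in RULES.items():
        if any(p in text for p in patterns):
            return label
    return default

labels = [classify(m) for m in (
    "Click HERE for a free offer!",
    "You idiot, delete this post",
    "See you at the meetup tonight",
)]
```

A tolerance value of the kind the abstract mentions would correspond to requiring a minimum number (or weight) of matched patterns before a label is assigned, rather than triggering on a single match.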
Procedia PDF Downloads 511

26411 Assessment of Morphodynamic Changes at Kaluganga River Outlet, Sri Lanka Due to Poorly Planned Flood Controlling Measures
Authors: G. P. Gunasinghe, Lilani Ruhunage, N. P. Ratnayake, G. V. I. Samaradivakara, H. M. R. Premasiri, A. S. Ratnayake, Nimila Dushantha, W. A. P. Weerakoon, K. B. A. Silva
Abstract:
Sri Lanka is affected by different natural disasters such as tsunamis, landslides, lightning, and riverine floods. Among them, riverine floods act as a major disaster in the country. Different strategies are applied to control the impacts of flood hazards, and the expansion of river mouths is considered one of the main activities for flood mitigation and disaster reduction. However, through this expansion process, natural sand barriers including sand spits, barrier islands, and tidal plains are destroyed or subjected to change. This, in turn, can change the hydrodynamics and sediment dynamics of the area, leading to further damage to natural coastal features. The removal of a considerable portion of the naturally formed sand barrier at the Kaluganga River outlet (Calido Beach), Sri Lanka, to control the flooding event in the Kaluthara urban area in May 2017 has become a serious issue: it caused the complete collapse of the river mouth barrier spit system, leading to rapid coastal erosion in the Kaluganga River outlet area and saltwater intrusion into the Kaluganga River. The present investigation is focused on assessing the effects of the removal of this considerable portion of the naturally formed sand barrier at the Kaluganga river mouth. For this study, beach profiles, bathymetric surveys, and Google Earth historical satellite images from before and after the flood event were collected and analyzed. Furthermore, a beach boundary survey was carried out in October 2018 to support the satellite image data. The Google Earth satellite images and beach boundary survey data show a chronological breakdown of the sand barrier at the river outlet. Comparisons of pre- and post-disaster bathymetric maps and beach profile analysis also revealed a noticeable deepening of the sea bed in the nearshore zone. Such deepening in the nearshore zone can cause sea waves to break very near the coastline.
This might also generate new diffraction patterns, resulting in differential coastal accretion and erosion scenarios. Unless immediate mitigatory measures are taken, the impacts may cause severe problems for the sensitive Kaluganga river mouth system.
Keywords: bathymetry, beach profiles, coastal features, river outlet, sand barrier, Sri Lanka
Procedia PDF Downloads 141

26410 The Effects of L-Arginine Supplementation on Clinical Symptoms, Quality of Life, and Anal Internal Sphincter Pressure in Patients with Chronic Anal Fissure
Authors: Masoumeh Khailghi Sikaroudi, Mohsen Masoodi, Fazad Shidfar, Meghdad Sedaghat
Abstract:
Background: The hypertonicity of the internal anal sphincter resting pressure is one of the main causes of chronic anal fissures. The aim of this study is to assess the effect of oral administration of L-arginine on anal fissure symptom improvement through relaxation of the internal anal sphincter. Method: Seventy-six chronic anal fissure patients (age: 18-65 years) took part in this randomized, double-blind, placebo-controlled trial from February 2019 to October 2020 at Rasoul-e-Akram Hospital, Tehran, Iran. Participants were allocated to the treatment (L-arginine) or placebo group. They took a 1000 mg capsule three times a day for one month and were followed up at the end of the first and third months after receiving the intervention. Clinical symptoms, anal sphincter resting pressure, and quality of life (QoL) were assessed at baseline and at the end of the study. Result: The analysis of data showed significant improvement in bleeding, fissure size, and pain within each group; however, this effect was more pronounced in the arginine group compared to the control group at the end of the study (P-values<0.001). Furthermore, a significant increase in QoL was seen only in patients who were treated with arginine (P-value=0.006). Also, the comparison of anal pressures with baseline and between groups at the end of the study showed a significant reduction in sphincter pressure in treated patients (P-value<0.001, =0.049; respectively). Conclusion: Oral administration of 3000 mg L-arginine can heal chronic anal fissures by reducing internal anal sphincter pressure with fewer side effects. However, a long-term study with longer follow-up is recommended.
Keywords: L-arginine, anal fissure, sphincter pressure, clinical symptoms, quality of life
Procedia PDF Downloads 75

26409 Reduced Model Investigations Supported by Fuzzy Cognitive Map to Foster Circular Economy
Authors: A. Buruzs, M. F. Hatwágner, L. T. Kóczy
Abstract:
The aim of the present paper is to develop an integrated method that may provide assistance to decision makers during system planning, design, operation and evaluation. In order to support the realization of a Circular Economy (CE), it is essential to evaluate local needs and conditions, which helps to select the most appropriate system components and resource needs. Each of these activities requires careful planning, and the model of CE offers a comprehensive interdisciplinary framework. The aim of this research was to develop and introduce a practical methodology for the evaluation of local and regional opportunities to promote CE.
Keywords: circular economy, factors, fuzzy cognitive map, model reduction, sustainability
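A fuzzy cognitive map, the modeling tool named in the keywords, iterates concept activations through a weighted influence matrix and a squashing function until a fixed point (a scenario outcome) is reached. The sketch below uses invented concepts and weights purely for illustration; it is not the paper's reduced model.

```python
import math

def sigmoid(x, lam=1.0):
    """Standard FCM squashing function mapping activations into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-lam * x))

def fcm_step(state, W):
    """One synchronous FCM iteration: A_i(t+1) = f(sum_j W[j][i] * A_j(t))."""
    n = len(state)
    return [sigmoid(sum(W[j][i] * state[j] for j in range(n)))
            for i in range(n)]

# Three hypothetical concepts, e.g. recycling rate, landfill use, awareness;
# W[j][i] is the influence of concept j on concept i.
W = [[0.0, -0.6,  0.0],
     [0.0,  0.0,  0.0],
     [0.7, -0.3,  0.0]]
state = [0.5, 0.5, 0.8]
for _ in range(20):          # iterate toward a fixed point
    state = fcm_step(state, W)
```

Model reduction, the other keyword, would correspond to merging or dropping rows and columns of W for concepts whose influence on the steady state is negligible, so decision makers work with a smaller, more interpretable map.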
Procedia PDF Downloads 248

26408 Expanding the Evaluation Criteria for a Wind Turbine Performance
Authors: Ivan Balachin, Geanette Polanco, Jiang Xingliang, Hu Qin
Abstract:
The problem of global warming has raised interest in renewable energy sources. Reducing the cost of wind energy is a challenge. Before building a wind park, conditions such as average wind speed, wind direction, the duration of each wind condition, and the probability of icing must be considered in the design phase. The operating values used in the settings of control systems will also depend on the mentioned variables. Here, a procedure is proposed to be included in the evaluation of the performance of a wind turbine, based on the amplitude of wind changes, the number of changes, and their duration. A generic study case based on actual data is presented. Data analysis techniques were applied to model the power required for the yaw system based on the amplitude and number of wind changes. A theoretical model relating time, the amplitude of wind changes, and the angular speed of nacelle rotation was identified.
Keywords: field data processing, regression determination, wind turbine performance, wind turbine placing, yaw system losses
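The "regression determination" named in the keywords amounts to fitting yaw behavior against wind-change amplitude. The sketch below fits a simple ordinary-least-squares line through synthetic numbers (not the study's field data) relating the amplitude of a wind direction change to the time the nacelle spends rotating:

```python
def linear_fit(x, y):
    """Ordinary least squares for a line y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
          / sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Illustrative samples: wind direction change (deg) vs. yaw rotation time (s).
amplitude_deg = [5, 10, 15, 20, 30]
yaw_time_s    = [1.8, 3.4, 5.1, 6.7, 10.0]

slope, intercept = linear_fit(amplitude_deg, yaw_time_s)
predicted = slope * 25 + intercept   # expected yaw time for a 25 deg change
```

Summing such predicted yaw times over the number and duration of wind changes at a candidate site yields the yaw-system energy loss term the paper proposes to add to the turbine evaluation criteria.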
Procedia PDF Downloads 393