Search results for: spectroscopy data analysis
40995 Aging Evaluation of Ammonium Perchlorate/Hydroxyl Terminated Polybutadiene-Based Solid Rocket Engine by Reactive Molecular Dynamics Simulation and Thermal Analysis
Authors: R. F. B. Gonçalves, E. N. Iwama, J. A. F. F. Rocco, K. Iha
Abstract:
Propellants based on Hydroxyl Terminated Polybutadiene/Ammonium Perchlorate (HTPB/AP) are the most commonly used in the rocket engines operated by the Brazilian Armed Forces. This work examined the possibility of extending their useful life (currently 10 years) by performing kinetic-chemical analyses of the energetic material via Differential Scanning Calorimetry (DSC) and by computer simulation of the aging process using the software Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS). Thermal analysis via DSC was performed in triplicate at three heating rates (5, 10, and 15 ºC/min) on a rocket motor with 11 years of shelf life, and the activation energy was obtained from the Arrhenius equation using the Ozawa and Kissinger kinetic methods, allowing comparison with data from the manufacturing period (standard motor). In addition, the kinetic parameters of the internal pressure of the combustion chamber in eight rocket engines with 11 years of shelf life were also acquired, for comparison with the engine start-up data.
Keywords: shelf-life, thermal analysis, Ozawa method, Kissinger method, LAMMPS software, thrust
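The Kissinger method mentioned above can be sketched in a few lines: it fits ln(β/Tp²) against 1/Tp, where β is the heating rate and Tp the DSC peak temperature, and reads the activation energy off the slope. The peak temperatures below are hypothetical illustrative values, not the study's measurements.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def kissinger_activation_energy(heating_rates, peak_temps_K):
    """Fit ln(beta / Tp^2) = const - Ea / (R * Tp) by least squares and
    return the activation energy Ea in J/mol."""
    beta = np.asarray(heating_rates, dtype=float)
    Tp = np.asarray(peak_temps_K, dtype=float)
    slope, _ = np.polyfit(1.0 / Tp, np.log(beta / Tp**2), 1)
    return -slope * R

# Illustrative (hypothetical) DSC peak temperatures for the three heating rates
rates = [5.0, 10.0, 15.0]        # degrees C per minute
peaks = [687.0, 700.0, 708.0]    # K
Ea = kissinger_activation_energy(rates, peaks)  # roughly 200 kJ/mol here
```

The Ozawa method proceeds analogously, regressing ln(β) on 1/Tp instead.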
Procedia PDF Downloads 127
40994 Purchasing Decision-Making in Supply Chain Management: A Bibliometric Analysis
Authors: Ahlem Dhahri, Waleed Omri, Audrey Becuwe, Abdelwahed Omri
Abstract:
In industrial processes, decision-making ranges across different scales, from process control to supply chain management. The purchasing decision-making process in the supply chain is presently gaining more attention as a critical contributor to the company's strategic success. Given the scarcity of thorough summaries in prior studies, this bibliometric analysis adopts a meticulous approach to achieve quantitative knowledge of the constantly evolving subject of purchasing decision-making in supply chain management. Through bibliometric analysis, we examine a sample of 358 peer-reviewed articles from the Scopus database. VOSviewer and Gephi software were employed to analyze, combine, and visualize the data. Data analytic techniques, including citation networks, PageRank analysis, co-citation, and publication trends, were used to identify influential works and outline the discipline's intellectual structure. The outcomes of this descriptive analysis highlight the most prominent articles, authors, journals, and countries based on their citations and publications. The findings illustrate an increase in the number of publications, exhibiting a slightly growing trend in this field. Co-citation analysis coupled with content analysis of the most cited articles identified five research themes: (1) integrating sustainability into the supplier selection process; (2) supplier selection under disruption risk assessment and mitigation strategies; (3) fuzzy MCDM approaches for supplier evaluation and selection; (4) purchasing decisions in vendor problems; and (5) decision-making techniques in supplier selection and order lot sizing problems. With the help of a graphic timeline, this exhaustive map of the field provides a visual representation of the evolution of publications, demonstrating a gradual shift from research interest in vendor selection problems to integrating sustainability into the supplier selection process.
These clusters offer insights into a wide variety of purchasing methods and conceptual frameworks that have emerged; however, they have not been validated empirically. The findings suggest that future research should pursue greater depth of practical and empirical analysis to enrich the theories. These outcomes provide a powerful road map for further study in this area.
Keywords: bibliometric analysis, citation analysis, co-citation, Gephi, network analysis, purchasing, SCM, VOSviewer
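The citation-network and PageRank steps described above can be sketched with networkx. The five-paper citation network below is hypothetical; the study's actual network covered 358 articles.

```python
import networkx as nx
from itertools import combinations

# Hypothetical directed citation network: an edge u -> v means paper u cites paper v.
citations = [("P1", "P3"), ("P2", "P3"), ("P2", "P4"), ("P4", "P3"), ("P5", "P4")]
G = nx.DiGraph(citations)

# PageRank highlights influential (heavily cited) papers in the network.
scores = nx.pagerank(G, alpha=0.85)
most_influential = max(scores, key=scores.get)

# Co-citation counts: two papers are co-cited when a third paper cites both.
cocitation = {}
for u in G:
    for a, b in combinations(sorted(G.successors(u)), 2):
        cocitation[(a, b)] = cocitation.get((a, b), 0) + 1
```

Clustering the resulting co-citation graph is what yields the research themes listed in the abstract.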
Procedia PDF Downloads 85
40993 Tagging a corpus of Media Interviews with Diplomats: Challenges and Solutions
Authors: Roberta Facchinetti, Sara Corrizzato, Silvia Cavalieri
Abstract:
Increasing interconnection between data digitalization and linguistic investigation has given rise to unprecedented potentialities and challenges for corpus linguists, who need to master IT tools for data analysis and text processing, as well as to develop techniques for efficient and reliable annotation in specific mark-up languages that encode documents in a format that is both human- and machine-readable. The present paper considers the challenges emerging from the compilation of a linguistic corpus, focusing on the English language in particular. To do so, the case study of the InterDiplo corpus is illustrated. The corpus, currently under development at the University of Verona (Italy), represents a novelty in terms both of the data included and of the tag set used for its annotation. The corpus covers media interviews and debates with diplomats and international operators conversing in English with journalists who do not share the same lingua-cultural background as their interviewees. To date, this appears to be the first tagged corpus of international institutional spoken discourse and will be an important database not only for linguists interested in corpus analysis but also for experts operating in international relations. Special attention is dedicated to the structural mark-up, part-of-speech annotation, and tagging of discursive traits, which are the innovative parts of the project, the result of a thorough study to find the solution best suited to the analytical needs of the data. Several aspects are addressed, with special attention to the tagging of the speakers' identity, the communicative events, and anthropophagic. Prominence is given to the annotation of question/answer exchanges to investigate the interlocutors' choices and how such choices impact communication.
Indeed, the automated identification of questions, in relation to the expected answers, is functional to understanding how interviewers elicit information as well as how interviewees provide their answers to fulfill their respective communicative aims. A detailed description of the aforementioned elements is given using the InterDiplo-Covid19 pilot corpus. Our preliminary analysis of the data highlights the viable solutions found in the construction of the corpus in terms of XML conversion, metadata definition, tagging system, and discursive-pragmatic annotation to be included via Oxygen.
Keywords: spoken corpus, diplomats' interviews, tagging system, discursive-pragmatic annotation, English linguistics
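A structural mark-up of a question/answer exchange of the kind described above might look like the following sketch, built with Python's ElementTree. The tag and attribute names here are hypothetical illustrations; the actual InterDiplo tag set is richer.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical mark-up sketch for one question/answer exchange;
# the real InterDiplo annotation scheme defines its own tags and attributes.
exchange = ET.Element("exchange", id="ex1")
q = ET.SubElement(exchange, "question", speaker="journalist", type="wh")
q.text = "What is the current state of the negotiations?"
a = ET.SubElement(exchange, "answer", speaker="diplomat", strategy="direct")
a.text = "We expect an agreement by the end of the year."

xml_string = ET.tostring(exchange, encoding="unicode")
```

Pairing each question with its answer in one element is what makes the automated question/answer analysis described above possible.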
Procedia PDF Downloads 185
40992 Application of Public Access Two-Dimensional Hydrodynamic and Distributed Hydrological Models for Flood Forecasting in Ungauged Basins
Authors: Ahmad Shayeq Azizi, Yuji Toda
Abstract:
In Afghanistan, floods are the most frequent and recurrent of all natural disasters. On the other hand, the lack of monitoring data is a severe problem, which increases the difficulty of flood forecasting and of designing appropriate flood countermeasures. This study simulates flood inundation in the Harirud River Basin by applying a distributed hydrological model, the Integrated Flood Analysis System (IFAS), and a 2D hydrodynamic model, the International River Interface Cooperative (iRIC), based on satellite rainfall combined with historical peak discharge and globally accessible data. The results of the simulation can predict the inundation area, depth, and velocity, and hardware countermeasures such as the impact of levee installation can be discussed using the present method. The methodology proposed in this study is suitable for areas where hydrological and geographical data, including river survey data, are poorly observed.
Keywords: distributed hydrological model, flood inundation, hydrodynamic model, ungauged basins
Procedia PDF Downloads 166
40991 Examining the Attitudes of Pre-School Teachers towards Values Education in Terms of Gender, School Type, Professional Seniority and Location
Authors: Hatice Karakoyun, Mustafa Akdag
Abstract:
This study was conducted to examine the attitudes of pre-school teachers towards values education. It was designed as a general survey (scanning) model. The study's working group comprised 108 pre-school teachers who worked in Diyarbakır, Turkey. The Values Education Attitude Scale (VEAS), developed by Yaşaroğlu (2014), was used. To analyze the sociodemographic structure of the data, percentage and frequency values were examined. The Kolmogorov-Smirnov test and histograms of the normal curve were examined to determine whether the data were normally distributed and hence which statistical analyses should be applied to the scale; the distribution was found not to be normal. Thus, the Mann-Whitney U test, a nonparametric statistical technique, was used to test differences in the scores obtained from the scale across the independent variables. According to the analyses, pre-school teachers' attitudes toward values education are positive. The item with the highest average indicates that pre-school teachers think values education is very important for students' and children's futures. The variables included in the study (gender, seniority, age group, education, school type, school location) appear to have no effect on the attitude scores of the pre-school teachers who participated.
Keywords: attitude scale, pedagogy, pre-school teacher, values education
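The Mann-Whitney U comparison used for non-normal data like this can be sketched with scipy; the two groups of attitude-scale totals below are simulated stand-ins, not the study's real VEAS scores.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Simulated stand-ins for VEAS attitude totals of two independent groups
# (e.g. split by gender); the real study used 108 teachers' scores.
group_a = rng.normal(110, 12, 50).round()
group_b = rng.normal(112, 12, 58).round()

u_stat, p_value = mannwhitneyu(group_a, group_b, alternative="two-sided")
significant = p_value < 0.05  # reject equality of distributions at the 5% level
```

The test compares rank sums rather than means, which is why it tolerates the non-normal score distribution reported above.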
Procedia PDF Downloads 248
40990 Mostar Type Indices and QSPR Analysis of Octane Isomers
Authors: B. Roopa Sri, Y Lakshmi Naidu
Abstract:
Chemical Graph Theory (CGT) is the branch of mathematical chemistry in which molecules are modeled to study their physicochemical properties using molecular descriptors. Amongst these descriptors, topological indices play a vital role in predicting properties by defining the graph topology of the molecule. Recently, a bond-additive topological index known as the Mostar index has been proposed. In this paper, we compute the Mostar-type indices of octane isomers and use the data obtained to perform QSPR analysis. Furthermore, we show the correlation between the Mostar-type indices and the properties.
Keywords: chemical graph theory, Mostar-type indices, octane isomers, QSPR analysis, topological index
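The Mostar index itself is simple to compute from a molecular graph: for each edge uv, count the vertices strictly closer to u than to v and vice versa, and sum the absolute differences. A minimal pure-Python sketch, using the hydrogen-suppressed carbon skeleton of n-octane as the example graph:

```python
from collections import deque

def bfs_dist(adj, src):
    """Breadth-first distances from src in an unweighted graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                q.append(y)
    return dist

def mostar_index(adj):
    """Mo(G) = sum over edges uv of |n_u - n_v|, where n_u counts the
    vertices strictly closer to u than to v."""
    dist = {v: bfs_dist(adj, v) for v in adj}
    total, seen = 0, set()
    for u in adj:
        for v in adj[u]:
            if (v, u) in seen:
                continue
            seen.add((u, v))
            n_u = sum(1 for w in adj if dist[u][w] < dist[v][w])
            n_v = sum(1 for w in adj if dist[v][w] < dist[u][w])
            total += abs(n_u - n_v)
    return total

# n-octane as a path graph on 8 carbon atoms (hydrogen-suppressed skeleton)
octane = {i: [j for j in (i - 1, i + 1) if 1 <= j <= 8] for i in range(1, 9)}
```

For the n-octane path this gives Mo = 24; branched isomers give different values, which is what the QSPR regressions exploit.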
Procedia PDF Downloads 130
40989 Destination Decision Model for Cruising Taxis Based on Embedding Model
Authors: Kazuki Kamada, Haruka Yamashita
Abstract:
In Japan, taxis are a popular mode of transportation, and the taxi industry is a major business. In recent years, however, the industry has faced the difficult problem of a declining number of taxi drivers. In the taxi business, three main methods of catching passengers are applied. The first is "cruising", in which the driver picks up passengers while driving on the road. The second is "waiting", in which the driver waits for passengers near places with high demand for taxis, such as the entrances of hospitals and train stations. The third is "dispatching", in which a taxi is allocated based on contact from the taxi company. Of these, cruising taxi drivers need experience and intuition to find passengers, and it is difficult to decide "the destination for cruising". A strong recommendation system for cruising taxis would support new drivers in finding passengers, and it could be a solution to the decreasing number of drivers in the taxi industry. In this research, we propose a method of recommending a destination for cruising taxi drivers. As a machine learning technique, embedding models, which embed high-dimensional data into a low-dimensional space, are widely used in data analysis to represent the relationships of meaning between data clearly. Taxi drivers have their favorite courses based on their experience, and the courses differ for each driver. We assume that the courses of cruising taxis have meanings, such as courses for finding businessman passengers (going around the business areas of the city or to main stations) and courses for finding traveler passengers (going around sightseeing places or big hotels), and we extract the meaning of their destinations. We analyze the cruising history data of taxis based on an embedding model and propose a recommendation system for finding passengers.
Finally, we demonstrate the recommendation of destinations for cruising taxi drivers based on real-world data analysis using the proposed method.
Keywords: taxi industry, decision making, recommendation system, embedding model
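A minimal distributional sketch of the idea, not the paper's trained embedding model: each zone is represented by its co-occurrence profile across (wholly hypothetical) cruising histories, so zones visited on the same type of course end up closer than zones from different course types.

```python
import numpy as np

# Hypothetical cruising histories: sequences of zones visited in one shift.
histories = [
    ["station", "office", "conference"],             # business-oriented course
    ["office", "station", "conference", "office"],
    ["temple", "park", "museum"],                    # tourist-oriented course
    ["park", "museum", "temple", "park"],
]
zones = sorted({z for h in histories for z in h})
idx = {z: i for i, z in enumerate(zones)}

# Each zone is represented by its co-occurrence profile over the histories.
C = np.zeros((len(zones), len(zones)))
for h in histories:
    for a in h:
        for b in h:
            if a != b:
                C[idx[a], idx[b]] += 1

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

# Zones belonging to the same course type end up closer in this space.
same_course = cosine(C[idx["station"]], C[idx["office"]])
cross_course = cosine(C[idx["station"]], C[idx["temple"]])
```

A learned embedding (e.g. a word2vec-style model over zone sequences) refines this same intuition into a dense low-dimensional space.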
Procedia PDF Downloads 138
40988 Examining Social Connectivity through Email Network Analysis: Study of Librarians' Emailing Groups in Pakistan
Authors: Muhammad Arif Khan, Haroon Idrees, Imran Aziz, Sidra Mushtaq
Abstract:
Social platforms such as online discussion and mailing groups are well aligned with academic as well as professional learning spaces. Professional communities are increasingly moving to online forums for sharing and capturing intellectual capital. This study investigated the dynamics of social connectivity in the Yahoo mailing groups of Pakistani Library and Information Science (LIS) professionals using graph theory techniques. Design/Methodology: Social network analysis is a domain of increasing concern to scientists for identifying whether people grow together through online social interaction or merely reflect connectivity. We conducted a longitudinal study using network graph theory to analyze a large data set of email communication. The data were collected from three Yahoo mailing groups using network analysis software over a period of six months, i.e., January to June 2016. Findings of the network analysis were reviewed through a focus group discussion with LIS experts and selected respondents of the study. Data were analyzed in Microsoft Excel, and network diagrams were visualized using NodeXL and the ORA-NetScenes package. Findings: The findings demonstrate that professionals and students exhibit intellectual growth the more they become tied within a network by interacting and participating in communication through online forums. The study reports on the dynamics of the large network by visualizing the email correspondence among group members in a network consisting of vertices (members) and edges (randomized correspondence). Pairwise relationships between group members were modeled to show the characteristics, reasons, and strength of ties. Connectivity of nodes illustrated the frequency of communication among group members; node coupling, network diffusion, and node clustering were examined in depth.
Network analysis was found to be a useful technique for investigating the dynamics of a large network.
Keywords: emailing networks, network graph theory, online social platforms, yahoo mailing groups
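The node-level quantities discussed above (degree, tightness of each node's contacts, overall connectedness) can be computed with networkx on a toy correspondence graph; the members and edges below are hypothetical.

```python
import networkx as nx

# Hypothetical correspondence among mailing-group members (edge = replies exchanged)
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("C", "E")]
G = nx.Graph(edges)

degree = dict(G.degree())        # with how many members each node corresponded
clustering = nx.clustering(G)    # how tightly each node's contacts are interlinked
density = nx.density(G)          # overall connectedness of the group
```

Here member C is the hub (degree 4), while A's two contacts are themselves connected (clustering 1.0), the kind of tie-strength pattern the study visualizes at scale.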
Procedia PDF Downloads 240
40987 Analysis of Path Nonparametric Truncated Spline Maximum Cubic Order in Farmers Loyalty Modeling
Authors: Adji Achmad Rinaldo Fernandes
Abstract:
Path analysis tests the relationships between variables through cause and effect. Before conducting further tests in path analysis, the assumption of linearity must be met. If the relationship is not linear and the shape of the curve is unknown, a nonparametric approach can be used, one of which is the truncated spline. The purpose of this study is to estimate the function and obtain the best model for the nonparametric truncated spline path of linear, quadratic, and cubic orders with 1 and 2 knots, and to determine the significance of the best function estimator in modeling farmer loyalty through the jackknife resampling method. This study uses secondary data from questionnaires administered to 100 farmer respondents in Sumbawa Regency who use SP-36 subsidized fertilizer products. Based on the results of the analysis, the best truncated spline nonparametric path model is quadratic with 2 knots, with a coefficient of determination of 85.50%; the significance of the best estimator shows that all exogenous variables have a significant effect on the endogenous variables.
Keywords: nonparametric path analysis, farmer loyalty, jackknife resampling, truncated spline
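A truncated spline of degree p with knots k₁, k₂ extends the polynomial basis with terms (x − kᵢ)₊ᵖ. The quadratic, 2-knot case can be fitted by ordinary least squares; the response, predictor, and knot locations below are illustrative simulations, not the study's variables.

```python
import numpy as np

def truncated_spline_basis(x, degree, knots):
    """Design matrix with columns 1, x, ..., x^degree and one truncated
    term (x - k)_+^degree per knot k."""
    x = np.asarray(x, dtype=float)
    cols = [x**d for d in range(degree + 1)]
    cols += [np.clip(x - k, 0.0, None)**degree for k in knots]
    return np.column_stack(cols)

# Illustrative fit: a loyalty-like response y on a satisfaction-like predictor x,
# quadratic truncated spline with 2 knots, estimated by ordinary least squares.
rng = np.random.default_rng(1)
x = rng.uniform(1, 5, 100)
y = 0.5 * x + np.where(x > 3, 0.8 * (x - 3)**2, 0.0) + rng.normal(0, 0.1, 100)

X = truncated_spline_basis(x, degree=2, knots=[2.5, 3.5])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = 1 - np.sum((y - X @ beta)**2) / np.sum((y - y.mean())**2)
```

Comparing such R² values across orders and knot counts is how the study selects the quadratic, 2-knot model it reports.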
Procedia PDF Downloads 47
40986 Qualitative Case Studies in Reading Specialist Education
Authors: Carol Leroy
Abstract:
This presentation focuses on the analysis of qualitative case studies in the graduate education of reading specialists. It describes the development and application of an integrated conceptual framework for reading specialist education, drawing on Robert Stake's work on case study research, Kenneth Zeichner's work on professional learning, and various tools for reading assessment (e.g., the Qualitative Reading Inventory). Social constructivist theory provides intersecting links between the various influences on the processes used to assess and teach reading within the case study framework. Illustrative examples show the application of the framework in reading specialist education in a teaching clinic at a large urban university. Central to the education of reading specialists in this clinic is the collection, analysis, and interpretation of data for the design and implementation of reading and writing programs for struggling readers and writers. The case study process involves the integrated interpretation of data, which is central to qualitative case study inquiry. An emerging theme in this approach to graduate education is the ambiguity and uncertainty that govern work with the adults and children who attend the clinic for assistance. Tensions and contradictions are explored insofar as they reveal overlapping but intersecting frameworks for case study analysis in the area of literacy education. An additional theme is the interplay of multiple layers of data, with a resulting depth that goes beyond the practical need of the client toward the deeper pedagogical growth of the reading specialist. The presentation makes a case for the value of qualitative case studies in reading specialist education.
Further, the use of social constructivism as a unifying paradigm lends robustness to the conceptual framework as a tool for understanding the pedagogy involved.
Keywords: assessment, case study, professional education, reading
Procedia PDF Downloads 458
40985 Empirical Study of Running Correlations in Exam Marks: Same Statistical Pattern as Chance
Authors: Weisi Guo
Abstract:
It is well established that there may be running correlations in sequential exam marks due to students sitting in the order of course registration patterns. As such, random, non-sequential sampling of exam marks is a standard recommended practice. Here, the paper examines a large amount of exam data stretching several years across different modules to see to what degree this is true. Using the real mark distribution as a generative process, it was found that the real data had no more sequential correlation than randomly simulated data. That is to say, the running correlations one often observes are statistically identical to chance. Digging deeper, it was found that some high running correlations involve students who indeed share a common course history and make similar mistakes. However, at the statistical scale of a module question, the combined effect is statistically similar to the random shuffling of papers. As such, there may not be a need to take random samples of marks, but it remains good practice to mark papers in a random sequence to reduce repetitive marking bias and errors.
Keywords: data analysis, empirical study, exams, marking
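The comparison against chance can be sketched directly as a permutation test: compare the observed lag-1 running correlation of a mark sequence with its distribution under random shuffling. The mark distribution below is simulated, standing in for the real generative process used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def lag1_corr(marks):
    """Running (lag-1) correlation of a mark sequence."""
    return float(np.corrcoef(marks[:-1], marks[1:])[0, 1])

# Simulated marks standing in for the real mark distribution
marks = rng.normal(62, 14, 300).clip(0, 100)
observed = lag1_corr(marks)

# Null distribution of the running correlation under random shuffling
null = np.array([lag1_corr(rng.permutation(marks)) for _ in range(2000)])
p_value = float(np.mean(np.abs(null) >= abs(observed)))
```

A non-small p-value here means the observed running correlation is indistinguishable from chance, which is the paper's central finding for real exam data.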
Procedia PDF Downloads 181
40984 Modeling Activity Pattern Using XGBoost for Mining Smart Card Data
Authors: Eui-Jin Kim, Hasik Lee, Su-Jin Park, Dong-Kyu Kim
Abstract:
Smart-card data are expected to provide information on activity patterns as an alternative to conventional person trip surveys. The focus of this study is to propose a method that trains on person trip surveys to supplement smart-card data, which do not contain the purpose of each trip. We selected only features available from smart-card data, such as spatiotemporal information on the trip and geographic information system (GIS) data near the stations, to train on the survey data. XGBoost, a state-of-the-art tree-based ensemble classifier, was used to train on data from multiple sources. This classifier uses a more regularized model formalization to control over-fitting and shows very fast execution with good performance. The validation results showed that the proposed method efficiently estimated the trip purpose. GIS data for the station and the duration of stay at the destination were significant features in modeling trip purpose.
Keywords: activity pattern, data fusion, smart-card, XGBoost
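A sketch of the training step, with sklearn's GradientBoostingClassifier standing in for the xgboost library the study actually used, and wholly synthetic smart-card-like features and labels:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 600

# Hypothetical smart-card-like features: boarding hour, duration of stay at the
# destination (minutes), and a GIS-derived flag for commercial land use nearby.
hour = rng.integers(5, 23, n)
stay = rng.integers(10, 600, n)
commercial = rng.integers(0, 2, n)

# Hypothetical labels (1 = work trip, 0 = other) from a noisy rule: early
# boarding plus a long stay suggests a work trip; 10% of labels are flipped.
work = ((hour < 10) & (stay > 240)).astype(int)
purpose = np.where(rng.random(n) < 0.1, 1 - work, work)

X = np.column_stack([hour, stay, commercial])
X_tr, X_te, y_tr, y_te = train_test_split(X, purpose, random_state=0)

# Gradient boosting stands in for XGBoost's regularized boosted trees here.
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In the real pipeline the labels come from the person trip surveys, and the trained model is then applied to unlabeled smart-card trips.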
Procedia PDF Downloads 247
40983 Developing Rice Disease Analysis System on Mobile via iOS Operating System
Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit
Abstract:
This research aims to create mobile tools to analyze rice diseases quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used as the software development methodology, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease or the color that appears on the rice leaves, and the recognition results are shown on the iOS mobile screen. After completing the software development, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was at a good level, 57%. The plant experts commented that various disease symptoms should be added to the database for more precise analysis results. For further research, it is suggested that an image processing system be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with great accuracy.
Keywords: rice disease, data analysis system, mobile application, iOS operating system
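The decision-tree analysis might be sketched as nested symptom rules like the following; the symptom-to-disease mapping here is a simplified, hypothetical illustration (shown in Python rather than the app's Objective-C), not the experts' validated rule set.

```python
# A simplified, hypothetical decision-tree sketch for symptom-based diagnosis;
# the deployed system's rules and disease list came from plant experts.
def diagnose(lesion_color, lesion_shape, location):
    if lesion_color == "gray-white":
        if lesion_shape == "spindle":
            return "rice blast"
        return "narrow brown spot"
    if lesion_color == "brown":
        if location == "leaf edge":
            return "bacterial leaf blight"
        return "brown spot"
    return "unknown - refer to an expert"

result = diagnose("gray-white", "spindle", "leaf center")
```

Each branch corresponds to one user selection in the app, so the tree's depth bounds the number of taps needed to reach a result.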
Procedia PDF Downloads 287
40982 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources
Authors: Jolly Puri, Shiv Prasad Yadav
Abstract:
Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form. In real life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using the α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.
Keywords: multi-component DEA, fuzzy multi-component DEA, fuzzy resources, decision making units (DMUs)
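The α-cut step turns each fuzzy datum into a crisp interval. For a triangular fuzzy number (a, b, c), the cut at level α is [a + α(b − a), c − α(c − b)], which collapses to the peak b at α = 1; solving the DEA models at the interval's endpoints yields the pair of crisp models. A one-function sketch (the numbers are hypothetical):

```python
def alpha_cut_triangular(a, b, c, alpha):
    """Interval [lower, upper] of the triangular fuzzy number (a, b, c) at
    level alpha in [0, 1]; at alpha = 1 it collapses to the peak b."""
    return (a + alpha * (b - a), c - alpha * (c - b))

# Hypothetical fuzzy input of a DMU, e.g. imprecisely measured labour hours
lo, hi = alpha_cut_triangular(80, 100, 130, alpha=0.5)
```

Sweeping α from 0 to 1 then traces out the fuzzy efficiency of the DMU as a fuzzy number, as proposed in point (iii).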
Procedia PDF Downloads 407
40981 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database
Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami
Abstract:
The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means based algorithm is developed and evaluated experimentally. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the City Block distance and a new stop criterion to guarantee convergence. Experiments conducted on a real data set show its high performance compared with the original k-means version.
Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis
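A sketch of a k-means variant with the City Block metric: assignment uses L1 distance, and each center is updated to the component-wise median, which minimizes within-cluster L1 cost. The stop criterion and data below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def kmeans_manhattan(X, k, iters=50, seed=0):
    """k-means variant with the City Block metric: assignment uses L1 distance
    and centers are component-wise medians (the L1 cost minimizers)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)  # L1 distances
        labels = d.argmin(axis=1)
        new = np.array([np.median(X[labels == j], axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):  # stop criterion: centers have stabilized
            break
        centers = new
    return labels, centers

# Two well-separated synthetic clusters
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.5, (40, 2)), rng.normal(5, 0.5, (40, 2))])
labels, centers = kmeans_manhattan(X, k=2)
```

The L1 distance avoids the squaring of the Euclidean metric, which is one source of the computational savings the paper targets.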
Procedia PDF Downloads 386
40980 Sol-Gel Derived Yttria-Stabilized Zirconia Nanoparticles for Dental Applications: Synthesis and Characterization
Authors: Anastasia Beketova, Emmanouil-George C. Tzanakakis, Ioannis G. Tzoutzas, Eleana Kontonasaki
Abstract:
In restorative dentistry, yttria-stabilized zirconia (YSZ) nanoparticles can be applied as fillers to improve the mechanical properties of various resin-based materials. Using sol-gel synthesis as a simple and cost-effective method, nano-sized YSZ particles of high purity can be produced. The aim of this study was to synthesize YSZ nanoparticles by the Pechini sol-gel method at different temperatures and to investigate their composition, structure, and morphology. YSZ nanopowders were synthesized by the sol-gel method using zirconium oxychloride octahydrate (ZrOCl₂.8H₂O) and yttrium nitrate hexahydrate (Y(NO₃)₃.6H₂O) as precursors, with the addition of acid chelating agents to control the hydrolysis and gelation reactions. The obtained powders underwent TG-DTA analysis and were sintered at three different temperatures, 800, 1000, and 1200°C, for 2 hours. Their composition and morphology were investigated by Fourier Transform Infrared Spectroscopy (FTIR), X-Ray Diffraction Analysis (XRD), Scanning Electron Microscopy with an associated Energy Dispersive X-ray analyzer (SEM-EDX), Transmission Electron Microscopy (TEM), and Dynamic Light Scattering (DLS). FTIR and XRD analysis showed the presence of the pure tetragonal phase in the composition of the nanopowders. With increasing calcination temperature, the crystallite size of the materials increased, reaching 47.2 nm for the YSZ1200 specimens. SEM analysis at high magnifications and DLS analysis showed submicron-sized particles with good dispersion and low agglomeration, which increased in size as the sintering temperature was elevated. The TEM images of the YSZ1000 specimen show that the zirconia nanoparticles are uniform in size and shape and attain an average particle size of about 50 nm. The electron diffraction patterns clearly revealed ring patterns of the polycrystalline tetragonal zirconia phase. Pure YSZ nanopowders have thus been successfully synthesized by the sol-gel method at different temperatures.
Their size is small and uniform, allowing their incorporation into dental luting resin cements to improve the mechanical properties and possibly enhance the bond strength of demanding dental ceramics, such as zirconia, to the tooth structure. This research is co-financed by Greece and the European Union (European Social Fund, ESF) through the Operational Programme 'Human Resources Development, Education and Lifelong Learning 2014-2020' in the context of the project 'Development of zirconia adhesion cements with stabilized zirconia nanoparticles: physicochemical properties and bond strength under aging conditions' (MIS 5047876).
Keywords: dental cements, nanoparticles, sol-gel, yttria-stabilized zirconia, YSZ
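Crystallite sizes like the 47.2 nm quoted above are typically obtained from XRD peak broadening via the Scherrer equation, D = Kλ/(β cos θ). The peak parameters below are hypothetical but yield a size in the reported range:

```python
import numpy as np

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, K=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)), with beta the
    peak FWHM converted to radians and theta half the diffraction angle."""
    beta = np.radians(fwhm_deg)
    theta = np.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * np.cos(theta))

# Hypothetical peak parameters for the tetragonal zirconia (101) reflection
# measured with Cu K-alpha radiation (0.15406 nm).
D = scherrer_size(wavelength_nm=0.15406, fwhm_deg=0.18, two_theta_deg=30.2)
```

Narrower peaks (smaller β) at higher sintering temperatures translate directly into the larger crystallite sizes reported.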
Procedia PDF Downloads 147
40979 Olive Leaf Extract as Natural Corrosion Inhibitor for Pure Copper in 0.5 M NaCl Solution: A Study by Voltammetry around OCP
Authors: Chahla Rahal, Philippe Refait
Abstract:
Oleuropein-rich extract from olive leaf, and acid hydrolysates rich in hydroxytyrosol and elenolic acid, were prepared under different experimental conditions. These phenolic compounds may be used as corrosion inhibitors. The inhibitive action of these extracts and their major constituents on the corrosion of copper in 0.5 M NaCl solution was evaluated by potentiodynamic polarization, electrochemical impedance spectroscopy (EIS), and weight loss measurements. The extraction product was analyzed by high performance liquid chromatography (HPLC), which showed that olive leaf extract is rich in phenolic compounds, mainly oleuropein (OLE), hydroxytyrosol (HT), and elenolic acid (EA). After acid hydrolysis and extraction at high temperature, an increase in hydroxytyrosol concentration was detected, coupled with relatively low oleuropein content and a high concentration of elenolic acid. The potentiodynamic measurements showed that this extract acts as a mixed-type corrosion inhibitor, and good inhibition efficiency is observed with increasing HT and EA concentration. These results suggest that the inhibitive effect of olive leaf extract might be due to the adsorption of the various phenolic compounds onto the copper surface.
Keywords: olive leaf extract, oleuropein, hydroxytyrosol, elenolic acid, copper, corrosion, HPLC/DAD, polarization, EIS
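Inhibition efficiency from the polarization and EIS data is conventionally computed as IE% = (i₀ − i)/i₀ × 100 from corrosion current densities, or IE% = (Rct − Rct,0)/Rct × 100 from charge-transfer resistances. The values below are hypothetical, not the study's measurements:

```python
def inhibition_efficiency_polarization(i_blank, i_inhibited):
    """IE% from corrosion current densities without/with inhibitor."""
    return 100.0 * (i_blank - i_inhibited) / i_blank

def inhibition_efficiency_eis(rct_blank, rct_inhibited):
    """IE% from charge-transfer resistances measured by EIS."""
    return 100.0 * (rct_inhibited - rct_blank) / rct_inhibited

# Hypothetical values for copper in 0.5 M NaCl with and without the extract
ie_pol = inhibition_efficiency_polarization(12.4, 2.1)    # i_corr in uA/cm^2
ie_eis = inhibition_efficiency_eis(850.0, 4200.0)         # Rct in ohm*cm^2
```

Agreement between the two estimates across techniques is what supports the adsorption mechanism proposed above.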
Procedia PDF Downloads 258
40978 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements
Authors: Agnieszka A. Malinowska, R. Hejmanowski
Abstract:
A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis. These include mainly water pumping, current mining operations, the collapse of post-mining underground voids, and mining-induced earthquakes. Such activities lead to large- and small-scale ground displacements and ground ruptures. Ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructures. The complexity of the ground deformation phenomenon in relation to the vulnerability of structures and infrastructures leads to considerable constraints in assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods based on fuzzy logic and expert methods for building and infrastructure damage risk assessment could be integrated into OS GIS. Those methods were verified by back analysis, proving their accuracy. Moreover, they could be supported by ground displacement observations: based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. These approaches will be reinforced by analysis of ground movement based on free satellite data, which will support the verification of ground movement prediction models and enable the mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in an urban area affected by underground mining activity.
Vulnerability maps supported by satellite ground movement observation would mitigate the hazards of land displacement in urban areas close to mines.
Keywords: fuzzy logic, open source geographic information science (OS GIS), risk assessment in urbanized areas, satellite interferometry (InSAR)
Procedia PDF Downloads 159
40977 Developing Indicators in System Mapping Process Through Science-Based Visual Tools
Authors: Cristian Matti, Valerie Fowles, Eva Enyedi, Piotr Pogorzelski
Abstract:
The system mapping process can be defined as a knowledge service in which a team of facilitators, experts and practitioners guides a conversation, enables the exchange of information and supports an iterative curation process. System mapping processes rely on science-based tools to introduce and simplify, through metaphors, the components and concepts of socio-technical systems, while facilitating an interactive dialogue that enables the design of co-created maps. System maps then serve as “artifacts” that provide information and focus the conversation on specific areas around the defined challenge and the related decision-making process. Knowledge management supports the curation of the data gathered during the system mapping sessions through documentation and subsequent knowledge co-production, applying common data science practices to identify new patterns, hidden insights, recurrent loops and unexpected elements. This study presents empirical evidence on the application of these techniques, exploring the mechanisms by which visual tools provide guiding principles to portray system components, key variables and types of data through the lens of climate change. In addition, data science facilitates the structuring of elements that allow layers of information to be analyzed through affinity and clustering analysis and, therefore, simple indicators to be developed to support the decision-making process. This paper addresses methodological and empirical elements of the horizontal learning process that integrates system mapping through visual tools, interpretation, cognitive transformation and analysis. The process is designed to introduce practitioners to simple, iterative and inclusive processes that create actionable knowledge and enable a shared understanding of the system in which they are embedded.
Keywords: indicators, knowledge management, system mapping, visual tools
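The affinity and clustering analysis used to structure workshop data can be sketched as single-link grouping of keyword-tagged items by Jaccard overlap; the threshold, helper names, and sample data are illustrative assumptions:

```python
def jaccard(a, b):
    """Jaccard similarity of two keyword sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def affinity_clusters(items, threshold=0.3):
    """Single-link clustering via union-find: merge items whose
    keyword overlap meets the threshold, then collect the groups."""
    parent = list(range(len(items)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if jaccard(items[i], items[j]) >= threshold:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(len(items)):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())
```

Each resulting cluster of workshop items could then be summarized into one simple indicator for the decision-making conversation.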
Procedia PDF Downloads 195
40976 Application of Drones in Agriculture
Authors: Reza Taherlouei Safa, Mohammad Aboonajmi
Abstract:
Agriculture plays an essential role in providing food for the world's population, and it also supplies countries with non-food products, transportation, and environmental balance. Precision agriculture, which employs advanced tools to monitor variability and manage inputs, can help realize these benefits. The increasing demand for food security puts pressure on decision-makers to ensure sufficient food production worldwide. To support sustainable agriculture, unmanned aerial vehicles (UAVs) can be used to manage farms and increase yields. This paper reviews the various applications of UAVs in agriculture and aims to provide an understanding of their usage. Based on a comprehensive review of existing research, it was found that different sensors support different kinds of analysis for agricultural applications; the purpose of the project must therefore be determined before deploying UAV technology in order to obtain good data quality and analysis. In conclusion, identifying a suitable sensor and UAV is crucial to gathering accurate data and producing precise analysis when using UAVs in agriculture.
Keywords: drone, precision agriculture, farmer income, UAV
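As an example of how sensor choice determines the analysis, a multispectral UAV sensor enables vegetation indices such as NDVI, which an RGB-only camera cannot provide. A minimal numpy sketch (the reflectance values and vigor threshold are illustrative, not from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

def vigor_mask(nir, red, threshold=0.4):
    """Boolean map of pixels whose NDVI suggests healthy vegetation."""
    return ndvi(nir, red) > threshold
```

Applied to georeferenced UAV imagery, such a mask highlights stressed zones of a field for targeted input management.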
Procedia PDF Downloads 81
40975 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
To address memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method for generating mutually exclusive tasks based on data augmentation is proposed. The method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key-data extraction method that extracts only part of the data to generate the mutex tasks. Experiments show that the method of generating mutually exclusive tasks effectively mitigates memorization overfitting in the MAML algorithm.
Keywords: mutex task generation, data augmentation, meta-learning, text classification
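A minimal sketch of the two ideas, mutex-task generation and key-data extraction, might look as follows. Reading "one feature maps to multiple labels" as giving each task its own label permutation is our illustrative assumption, as are the function names:

```python
import random

def make_mutex_tasks(data, n_tasks, n_labels, seed=0):
    """Generate mutually exclusive tasks: each task gets its own random
    permutation of the label set, so the same input maps to different
    labels across tasks and label memorization cannot help the learner."""
    rng = random.Random(seed)
    tasks = []
    for _ in range(n_tasks):
        perm = list(range(n_labels))
        rng.shuffle(perm)
        tasks.append([(x, perm[y]) for x, y in data])
    return tasks

def extract_key_data(data, k):
    """Key-data extraction: keep only k examples per class so the number
    of generated mutex tasks stays tractable."""
    by_label = {}
    for x, y in data:
        by_label.setdefault(y, []).append((x, y))
    subset = []
    for y in sorted(by_label):
        subset.extend(by_label[y][:k])
    return subset
```

During meta-training, the inner loop would then sample from these permuted tasks instead of the raw dataset.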
Procedia PDF Downloads 143
40974 Challenges for IoT Adoption in India: A Study Based on Foresight Analysis for 2025
Authors: Shruti Chopra, Vikas Rao Vadi
Abstract:
In the era of the digital world, the Internet of Things (IoT) has been receiving significant attention. Its ubiquitous connectivity between humans and machines, machine to machine (M2M) and machine to human, gives it the potential to transform society and establish an ecosystem that opens new dimensions for the economy of the country. This study has therefore attempted to identify, through a literature survey, the challenges that appear prevalent in IoT adoption in India. Data were then collected from expert opinions to conduct a foresight analysis, and analyzed with the scenario planning tools Micmac, Mactor, Multipol, and Smic-Prob. Methodologically, the study identified the relationships between variables through variable analysis using Micmac and actor analysis using Mactor, generated the field of possibilities in terms of hypotheses, and constructed various scenarios through Multipol. Finally, the findings include the final scenarios selected using Smic-Prob by assigning a probability (including conditional probability) to each scenario. This study may help practitioners and policymakers remove the obstacles to successfully implementing IoT in India.
Keywords: Internet of Things (IoT), foresight analysis, scenario planning, challenges, policymaking
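Micmac-style structural analysis ranks variables by direct and indirect influence, where indirect effects come from raising the cross-impact matrix to successive powers. A minimal numpy sketch of that core idea (the matrix and power are illustrative, not the study's data):

```python
import numpy as np

def micmac_rank(M, power=4):
    """Rank variables by total influence after `power` steps of propagation:
    entry (i, j) of M^k counts weighted influence paths of length k from
    variable i to variable j."""
    M = np.asarray(M, dtype=float)
    indirect = np.linalg.matrix_power(M, power)
    influence = indirect.sum(axis=1)   # how strongly each variable drives others
    dependence = indirect.sum(axis=0)  # how strongly each variable is driven
    order = np.argsort(-influence)     # most influential first
    return order, influence, dependence
```

Plotting influence against dependence reproduces the familiar Micmac quadrant chart separating driver, relay, and outcome variables.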
Procedia PDF Downloads 147
40973 Structural Evolution of Na6Mn(SO4)4 from High-Pressure Synchrotron Powder X-ray Diffraction
Authors: Monalisa Pradhan, Ajana Dutta, Irshad Kariyattuparamb Abbas, Boby Joseph, T. N. Guru Row, Diptikanta Swain, Gopal K. Pradhan
Abstract:
Compounds with the vanthoffite crystal structure, with general formula Na₆M(SO₄)₄ (M = Mg, Mn, Ni, Co, Fe, Cu and Zn), display a variety of intriguing physical properties intimately related to their structural arrangements. The compound Na₆Mn(SO₄)₄ shows antiferromagnetic ordering at low temperature, where the in-plane Mn-O•••O-Mn interactions facilitate antiferromagnetic ordering via a super-exchange interaction between the Mn atoms through the oxygen atoms. Inter-atomic bond distances and angles can readily be tuned by applying external pressure and probed using high-resolution X-ray diffraction. Moreover, because the magnetic interaction among the Mn atoms is of super-exchange type via the Mn-O•••O-Mn path, the variation of the Mn-O•••O-Mn dihedral angle and the Mn-O bond distances under high pressure inevitably affects the magnetic properties. High-pressure studies on magnetically ordered materials would therefore shed light on the interplay between structural properties and magnetic ordering, and would confirm the role of buckling of the Mn-O polyhedra in the origin of the antiferromagnetism. In this context, we carried out pressure-dependent X-ray diffraction measurements in a diamond anvil cell (DAC) up to a maximum pressure of 17 GPa to study the phase transition and to determine the equation of state from the volume compression data. Upon increasing pressure, we did not observe any new diffraction peaks or sudden discontinuities in the pressure dependence of the d values up to the maximum achieved pressure of ~17 GPa. However, beyond 12 GPa the a and b lattice parameters become identical while there is a discontinuity in the β value around the same pressure, indicating a subtle transition to a pseudo-monoclinic phase. Fitting the third-order Birch-Murnaghan equation of state (EOS) to the volume compression data over the entire range, we found the bulk modulus (B0) to be 44 GPa.
Considering the subtle transition at 12 GPa, we also fitted a separate second-order Birch-Murnaghan EOS to the volume data beyond 12 GPa, which gives a bulk modulus of ~34 GPa for this phase.
Keywords: mineral, structural phase transition, high pressure XRD, spectroscopy
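The Birch-Murnaghan fit described above can be sketched directly. Fixing V0 and B0' = 4 (at which the third-order form reduces to the second-order one) makes P linear in B0, so a one-parameter least-squares fit suffices; the volumes below are synthetic illustrations, with the paper's fitted B0 ≈ 44 GPa plugged in:

```python
import numpy as np

def bm3_pressure(V, V0, B0, B0p=4.0):
    """Third-order Birch-Murnaghan equation of state P(V):
    P = (3*B0/2) * (eta^7 - eta^5) * (1 + 0.75*(B0' - 4)*(eta^2 - 1)),
    with eta = (V0/V)^(1/3)."""
    eta = (V0 / np.asarray(V, dtype=float)) ** (1.0 / 3.0)
    return 1.5 * B0 * (eta**7 - eta**5) * (1.0 + 0.75 * (B0p - 4.0) * (eta**2 - 1.0))

def fit_bulk_modulus(V, P, V0, B0p=4.0):
    """P is linear in B0 once V0 and B0' are fixed, so the least-squares
    bulk modulus is a simple projection onto the unit-B0 basis curve."""
    basis = bm3_pressure(V, V0, 1.0, B0p)
    return float(np.dot(basis, P) / np.dot(basis, basis))
```

A full refinement would also free V0 and B0', typically with a nonlinear least-squares routine, but the linear special case shows the structure of the fit.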
Procedia PDF Downloads 87
40972 Data Gathering and Analysis for Arabic Historical Documents
Authors: Ali Dulla
Abstract:
This paper introduces a new dataset, and the methodology used to generate it, based on a wide range of historical Arabic documents containing clean data and simple, homogeneous page layouts. The experiments are implemented on printed and handwritten documents obtained from important libraries such as the Qatar Digital Library, the British Library and the Library of Congress. We have gathered and annotated 150 archival document images from different locations and time periods, spanning the 17th to the 19th century. The dataset comprises differing page layouts and degradations that challenge text line segmentation methods. Ground truth is produced using the Aletheia tool by PRImA and stored in an XML representation in the PAGE (Page Analysis and Ground truth Elements) format. The dataset will be made easily available to researchers worldwide for research into the obstacles posed by historical Arabic documents, such as their geometric correction.
Keywords: dataset production, ground truth production, historical documents, arbitrary warping, geometric correction
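PAGE-format ground truth can also be generated programmatically alongside tools like Aletheia. A minimal ElementTree sketch follows; the namespace version string and the element subset (one TextRegion, TextLine with polygon Coords and TextEquiv) are assumptions, so check them against the PAGE schema release you target:

```python
import xml.etree.ElementTree as ET

# Assumed schema release; PAGE namespaces are versioned by date.
PAGE_NS = "http://schema.primaresearch.org/PAGE/gts/pagecontent/2019-07-15"

def make_page_gt(image_filename, width, height, lines):
    """Build a minimal PAGE ground-truth tree: one TextRegion holding one
    TextLine (polygon Coords plus Unicode transcription) per entry."""
    ET.register_namespace("", PAGE_NS)
    root = ET.Element(f"{{{PAGE_NS}}}PcGts")
    page = ET.SubElement(root, f"{{{PAGE_NS}}}Page", {
        "imageFilename": image_filename,
        "imageWidth": str(width),
        "imageHeight": str(height)})
    region = ET.SubElement(page, f"{{{PAGE_NS}}}TextRegion", {"id": "r1"})
    for i, (points, text) in enumerate(lines):
        line = ET.SubElement(region, f"{{{PAGE_NS}}}TextLine", {"id": f"l{i}"})
        ET.SubElement(line, f"{{{PAGE_NS}}}Coords",
                      {"points": " ".join(f"{x},{y}" for x, y in points)})
        te = ET.SubElement(line, f"{{{PAGE_NS}}}TextEquiv")
        ET.SubElement(te, f"{{{PAGE_NS}}}Unicode").text = text
    return ET.ElementTree(root)
```

Writing the tree with `tree.write(path, encoding="utf-8", xml_declaration=True)` yields a file that PAGE-aware evaluation tools can consume.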
Procedia PDF Downloads 168
40971 Ultrasonic Agglomeration of Protein Matrices and Its Effect on Thermophysical, Macro- and Microstructural Properties
Authors: Daniela Rivera-Tobar, Mario Perez-Won, Roberto Lemus-Mondaca, Gipsy Tabilo-Munizaga
Abstract:
Dietary trends worldwide increasingly favor foods with anti-inflammatory properties that are rich in antioxidants, proteins, and unsaturated fatty acids, which lead to better metabolic, intestinal, mental, and cardiac health. In this sense, food matrices with high protein content based on macro- and microalgae are an excellent alternative to meet these new consumer needs. An emerging and environmentally friendly technology for producing protein matrices is ultrasonic agglomeration. It forms permanent bonds between particles, improving the agglomeration of the matrix compared to conventionally agglomerated (compressed) products. Among the advantages of this process are reduced nutrient loss and the avoidance of binding agents. The objective of this research was to optimize the ultrasonic agglomeration process in matrices composed of spirulina (Arthrospira platensis) powder and cochayuyo (Durvillaea antarctica) flour. The response variable was Young's modulus, and the independent variables were the process conditions: ultrasonic amplitude (70, 80 and 90%), agglomeration time (20, 25 and 30 seconds) and number of cycles (3, 4 and 5). The process was evaluated using a central composite design and analyzed with response surface methodology. In addition, the effects of agglomeration on thermophysical and microstructural properties were evaluated. Ultrasonic compression at 80 and 90% amplitude caused conformational changes according to Fourier-transform infrared spectroscopy (FTIR) analysis; with respect to the observed microstructure images (SEM) and differential scanning calorimetry (DSC) analysis, the best conditions were 90% amplitude for 25 and 30 seconds with 3 and 4 cycles of ultrasound.
In conclusion, the agglomerated matrices present good macro- and microstructural properties, which would allow the design of food systems with better nutritional and functional properties.
Keywords: ultrasonic agglomeration, physical properties of food, protein matrices, macro and microalgae
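The response surface methodology behind a central composite design can be sketched as an ordinary least-squares fit of a full second-order polynomial in the coded factors (intercept, linear, squared, and interaction terms). The factors and data below are generic coded variables, not the study's actual design points:

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of a second-order response surface
    y = b0 + sum_i b_i x_i + sum_i b_ii x_i^2 + sum_{i<j} b_ij x_i x_j.
    Returns the coefficient vector and the fitted responses."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, k = X.shape
    cols = [np.ones(n)]                                   # intercept
    cols += [X[:, i] for i in range(k)]                   # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]              # squared terms
    cols += [X[:, i] * X[:, j]                            # interactions
             for i in range(k) for j in range(i + 1, k)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta, A @ beta
```

With amplitude, time, and cycles as the three coded factors and Young's modulus as the response, the fitted surface is what the optimization step would search for a maximum.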
Procedia PDF Downloads 61
40970 Parameter Identification Analysis in the Design of Rock Fill Dams
Authors: G. Shahzadi, A. Soulaimani
Abstract:
This research work aims to identify, by inverse analysis, the physical parameters of the constitutive soil model in the design of a rockfill dam. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis was utilized for numerical simulation. Polynomial and neural-network-based response surfaces were generated to analyze the relationship between soil parameters and displacements, and the performance of these surrogate models was compared by evaluating the root mean square error. A comparative study was then carried out on objective functions and optimization techniques. The objective functions, defined by the least squares method, estimate the norm between the predicted displacements and the measured values, and are categorized by considering the measured data with and without instrument uncertainty. Hydro-Québec provided the data sets of measured values for the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and readily solve non-convex and non-differentiable problems, was used to obtain the optimum. Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and Differential Evolution (DE) were compared on the minimization problem; although all three techniques take time to converge to an optimum, PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis can be used effectively for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers in assessing dam performance and dam safety.
Keywords: rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS
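PSO, the best-performing optimizer in this comparison, can be sketched minimally. The Plaxis displacement-misfit objective is replaced here by a stand-in quadratic, and the inertia and acceleration constants are common textbook values rather than the authors' settings:

```python
import random

def pso(objective, bounds, n_particles=20, iters=60, seed=1):
    """Minimal particle swarm: each particle is pulled toward its personal
    best and the swarm's global best, with velocities damped by inertia."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp to the search bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the dam application, `objective` would evaluate the least-squares misfit between measured and simulated displacements, typically via the response-surface surrogate rather than a full finite element run per particle.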
Procedia PDF Downloads 146
40969 SA-SPKC: Secure and Efficient Aggregation Scheme for Wireless Sensor Networks Using Stateful Public Key Cryptography
Authors: Merad Boudia Omar Rafik, Feham Mohammed
Abstract:
Data aggregation in wireless sensor networks (WSNs) greatly reduces energy consumption. The limited resources of sensor nodes make the choice of encryption algorithm very important for securing data aggregation. Asymmetric cryptography involves large ciphertexts and heavy computation but solves the key-distribution problem of symmetric cryptography, which in turn provides smaller ciphertexts and faster computation. Recent research has also shown that achieving end-to-end confidentiality and end-to-end integrity at the same time is a challenging task. In this paper, we propose SA-SPKC, a novel security protocol that addresses both security services for WSNs, and in which only the base station can verify the individual data and identify malicious nodes. Our scheme is based on stateful public key encryption (StPKE), which combines the best features of both kinds of encryption along with state in order to reduce the computation overhead. Our analysis confirms the security and efficiency of the proposed scheme.
Keywords: secure data aggregation, wireless sensor networks, elliptic curve cryptography, homomorphic encryption
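To illustrate the additive homomorphism that makes end-to-end confidential aggregation possible (intermediate nodes combine ciphertexts without decrypting), here is a toy Paillier sketch. Note this is a different construction from the paper's elliptic-curve StPKE scheme, and the primes are toy-sized for readability; real deployments use moduli of 2048 bits or more:

```python
import math
import random

def paillier_keygen(p=293, q=433):
    """Toy Paillier key generation (p, q are tiny primes for illustration)."""
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    g = n + 1                                          # standard simple choice
    # L(g^lam mod n^2) = lam with g = n + 1; mu is its inverse mod n
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n * mu) % n

def aggregate(pub, ciphertexts):
    """Additive homomorphism: the product of ciphertexts decrypts to the
    sum of the plaintexts, so a relay node can aggregate without keys."""
    n, _ = pub
    out = 1
    for c in ciphertexts:
        out = out * c % (n * n)
    return out
```

In a WSN setting, each sensor would encrypt its reading, relays would multiply ciphertexts on the way up, and only the base station, holding the private key, would recover the sum.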
Procedia PDF Downloads 297
40968 Study of Influencing Factors of Shrinking Cities Based on Factor Analysis: The Example of Halle Germany
Authors: Fang Yao, Minglei Chen
Abstract:
City shrinkage is one of the thorny problems that many European cities face nowadays, expressed mainly as a decrease in population. Eastern Germany is one of the pioneers of European city shrinkage, with a long shrinking history. This study selects one representative shrinking city, Halle (Saale) in eastern Germany, as its research object and investigates nearly 20 years (1993-2010) of municipal data collected after the reunification of Germany. The data cover five dimensions, namely demographic, economic, social, spatial and environmental, totalling 16 eligible variables. Factor analysis is applied to these variables to assess the most important factors affecting the shrinkage of Halle. The factor analysis shows that three main factors determine the shrinkage of Halle: a demographic and economic factor, a social stability factor, and a city vitality factor. The three factors acted in different periods of Halle's shrinkage: from 1993 to 1997 the demographic and economic factor played an important role; from 1997 to 2004 social stability was significant to city shrinkage; and since 2005 the city vitality factor has determined the shrinkage of Halle. In recent years, shrinkage in Halle has eased, with signs of population growth. The city should therefore attach more importance to the city vitality factor to prevent further shrinkage. Meanwhile, Halle should adopt a positive perspective, shifting away from growth-oriented development to tap the potential of shrinking cities. This method is expected to be applied in further research and to other shrinking cities.
Keywords: demography, factor analysis, Halle, shrinking cities
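The factor extraction itself can be sketched as a principal-axis style eigendecomposition of the correlation matrix of the standardized indicators. This is a simplification of a full factor analysis pipeline (no rotation, no communality iteration), and the sample data are illustrative:

```python
import numpy as np

def extract_factors(data, n_factors):
    """Eigendecompose the correlation matrix of the (observations x variables)
    data and return the loadings and explained-variance share of the
    top `n_factors` factors."""
    X = np.asarray(data, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each variable
    R = np.corrcoef(Z, rowvar=False)               # correlation matrix
    vals, vecs = np.linalg.eigh(R)                 # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_factors]     # take the largest
    loadings = vecs[:, order] * np.sqrt(vals[order])
    explained = vals[order] / vals.sum()
    return loadings, explained
```

With Halle's 16 variables over the 1993-2010 observations as input, the largest factors correspond to the demographic-economic, social stability, and city vitality dimensions reported above.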
Procedia PDF Downloads 414
40967 Automatic Tagging and Accuracy in Assamese Text Data
Authors: Chayanika Hazarika Bordoloi
Abstract:
This paper is an attempt to work on Assamese, a highly inflectional language and one of the national languages of India, for which very little computational research has been achieved. Building a language processing tool for a natural language is not straightforward, as the standard and the language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. A conditional random field (CRF) tool was used to automatically tag and train the text data; accuracy improved after linguistic features were fed into the training data. Because Assamese is highly inflectional, standardizing its morphology is challenging, and inflectional suffixes are used as a feature of the text data. To analyze the inflections of Assamese word forms, a list comprising all possible suffixes that the various categories can take was prepared. Assamese words can be classified into inflected classes (noun, pronoun, adjective and verb) and uninflected classes (adverb and particle). The corpus used for this morphological analysis, a mixed corpus with a large number of tokens, has given satisfactory accuracy, and the accuracy rate of the tagger gradually improved with the modified training data.
Keywords: CRF, morphology, tagging, tagset
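A suffix-feature extractor of the kind fed to a CRF tagger can be sketched as follows. The feature names, context window, and sample words are illustrative, not the authors' actual templates:

```python
def word_features(words, i, max_suffix=4):
    """Feature dict for the i-th word of a sentence: the word itself,
    its suffixes up to length `max_suffix`, and the neighbouring words,
    in the key=value style CRF toolkits consume."""
    w = words[i]
    feats = {"word": w, "bias": 1.0}
    for k in range(1, min(max_suffix, len(w)) + 1):
        feats[f"suffix{k}"] = w[-k:]               # inflectional suffix features
    feats["prev"] = words[i - 1] if i > 0 else "<S>"
    feats["next"] = words[i + 1] if i < len(words) - 1 else "</S>"
    return feats
```

Emitting one such dictionary per token, paired with its gold tag, produces training data in the shape expected by CRF tools, which is where the suffix list described above plugs into the statistical model.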
Procedia PDF Downloads 194
40966 Revolutionizing Traditional Farming Using Big Data/Cloud Computing: A Review on Vertical Farming
Authors: Milind Chaudhari, Suhail Balasinor
Abstract:
Due to massive deforestation and an ever-increasing population, the organic content of the soil is depleting at a much faster rate; as a result, there is a real risk that world food production will drop by 40% in the next two decades. Vertical farming can help sustain food production by leveraging big data and cloud computing to ensure plants are grown naturally, providing optimum nutrients and sunlight determined by analyzing millions of data points. This paper outlines the most important parameters in vertical farming and how a combination of big data and AI helps in calculating and analyzing these millions of data points. Finally, the paper outlines how different organizations control the indoor environment by leveraging big data to enhance food quantity and quality.
Keywords: big data, IoT, vertical farming, indoor farming
Procedia PDF Downloads 175