Search results for: earth observation data cube
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26459

23609 Effective Urban Design on Environmental Quality Improvement of Historical Textures: A Case Study on Khajeh Khezr Neighborhood in Kerman City

Authors: Saman Sobhani

Abstract:

Historical neighborhoods carry special values and, in addition to evoking a sense of collective memory, must meet certain criteria of environmental quality for citizens to live in. From the perspective of urban planners and designers, a neighborhood, as an urban space, has to satisfy citizens' various functional needs as well as their spiritual requirements. In this research, an integrated analysis of the Khajeh Khezr neighborhood in Kerman, a neighborhood of historical value, was carried out based on the components of environmental quality derived from the proposed theoretical model (functional-structural, physical-spatial, and substantive). After examining the neighborhood's strengths and weaknesses with the AIDA model, mechanisms for promoting environmental quality through neighborhood organization were proposed, and related urban design projects were defined accordingly. Analysis of the findings shows that the inhabitants of the Khajeh Khezr neighborhood are largely dissatisfied with the quality of its urban environment. The research employed the descriptive-analytical method and a review of texts in the form of library studies, together with a case study using observation and a questionnaire as field studies.

Keywords: environmental quality, Kerman, Khajeh Khezr, neighborhood

Procedia PDF Downloads 80
23608 Interpretation and Clustering Framework for Analyzing ECG Survey Data

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for the detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm combined with fuzzy c-means. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed study and for many other purposes.
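As an illustration of the clustering stage described above, the sketch below embeds feature-selected ECG records via a graph-Laplacian spectral step and then applies fuzzy c-means to the embedding. The RBF affinity, the number of clusters, and the random toy data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    # Plain fuzzy c-means: returns the membership matrix U (n x c) and the centroids.
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return U, centroids

def spectral_fuzzy_clustering(X, c, gamma=1.0):
    # Spectral embedding (graph Laplacian) followed by fuzzy c-means on the eigenvectors.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    A = np.exp(-gamma * sq)            # RBF affinity between records
    np.fill_diagonal(A, 0.0)
    L = np.diag(A.sum(axis=1)) - A     # unnormalised graph Laplacian
    _, vecs = np.linalg.eigh(L)
    emb = vecs[:, 1:c + 1]             # c smallest non-trivial eigenvectors
    return fuzzy_c_means(emb, c)

# toy usage: 50 "feature-selected" ECG records with 8 features each (hypothetical data)
U, _ = spectral_fuzzy_clustering(np.random.rand(50, 8), c=3)
labels = U.argmax(axis=1)
```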

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 464
23607 LiDAR Based Real Time Multiple Vehicle Detection and Tracking

Authors: Zhongzhen Luo, Saeid Habibi, Martin v. Mohrenschildt

Abstract:

Self-driving vehicles require a high level of situational awareness in order to maneuver safely in real-world conditions. This paper presents a LiDAR-based real-time perception system that is able to process raw sensor data for multiple-target detection and tracking in dynamic environments. The proposed algorithm is nonparametric and deterministic; that is, no assumptions or prior knowledge about the input data and no initializations are required. Additionally, the proposed method works directly on the three-dimensional data generated by LiDAR without sacrificing the rich information contained in the 3D domain. Moreover, a fast and efficient real-time clustering algorithm based on a radially bounded nearest neighbor (RBNN) search is applied. The Hungarian algorithm and adaptive Kalman filtering are used for data association and tracking, respectively. The proposed algorithm runs in real time with an average run time of 70 ms per frame.
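A minimal Python sketch of the two building blocks named above, assuming a flood-fill style RBNN clustering over a k-d tree and centroid-distance data association via SciPy's Hungarian solver; the radius, minimum cluster size, and cost definition are illustrative choices rather than the authors' parameters.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.optimize import linear_sum_assignment

def rbnn_cluster(points, radius=0.5, min_size=5):
    # Radially bounded nearest neighbour clustering of an (n, 3) point cloud.
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)
    current = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        stack, labels[i] = [i], current
        while stack:                       # flood-fill every point reachable within `radius`
            j = stack.pop()
            for k in tree.query_ball_point(points[j], radius):
                if labels[k] == -1:
                    labels[k] = current
                    stack.append(k)
        current += 1
    keep = [c for c in range(current) if np.sum(labels == c) >= min_size]  # drop likely noise
    return labels, keep

def associate(track_centroids, detection_centroids):
    # Hungarian assignment of new detections to existing tracks by centroid distance.
    cost = np.linalg.norm(track_centroids[:, None, :] - detection_centroids[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols))
```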

Keywords: lidar, segmentation, clustering, tracking

Procedia PDF Downloads 410
23606 School Choice and Institutional or Familial Habitus: Reciprocity in Parents-School Relationships

Authors: Fatemeh Yazdani

Abstract:

This paper explores the student intake policies in high-performing private schools in Iran by studying both sides involved in the school choice process: parents and school leaders. It is based on in-depth interviews with 27 parents and private school staff and principals, supplemented by ethnographic observation in two private schools in Tehran. From a Bourdieusian point of view, this paper argues that the school leadership engineers the composition of private schools' students via different gatekeeping strategies, and these strategies represent and reconstruct the school's institutional habitus. It further explores the ways that parents who look for quality education among non-state education providers deal with the school's institutional habitus, drawing on their familial habitus and their economic, social, and cultural capital. The conclusion highlights that investigating school choice as a reciprocal process between family and school leadership can shed more light on the ways that an exclusive environment has been created in some high-performing private schools for certain class strata, maintaining a distance from 'others.' In a broader sense, this paper engages in an exploration of the reproduction of social inequality through private education.

Keywords: institutional habitus, private education, school choice, social inequality, student intake

Procedia PDF Downloads 101
23605 On-line Control of the Natural and Anthropogenic Safety in Krasnoyarsk Region

Authors: T. Penkova, A. Korobko, V. Nicheporchuk, L. Nozhenkova, A. Metus

Abstract:

This paper presents an approach to on-line control of the state of technosphere and environmental objects based on the integration of Data Warehouse, OLAP, and Expert System technologies. It describes the structure and content of a data warehouse that provides consolidation and storage of monitoring data. OLAP models that provide multidimensional analysis of monitoring data and dynamic analysis of the principal parameters of controlled objects are also described. The authors suggest criteria for emergency risk assessment using expert knowledge about danger levels. It is demonstrated how some of the proposed solutions could be adopted in territorial decision-making support systems. Operational control allows authorities to detect threats, prevent natural and anthropogenic emergencies, and ensure the comprehensive safety of a territory.

Keywords: decision making support systems, emergency risk assessment, natural and anthropogenic safety, on-line control, territory

Procedia PDF Downloads 398
23604 Composite Electrodes Containing Ni-Fe-Cr as an Activatable Oxygen Evolution Catalyst

Authors: Olga A. Krysiak, Grzegorz Cichowicz, Wojciech Hyk, Michal Cyranski, Jan Augustynski

Abstract:

Metal oxides are known electrocatalysts for the water oxidation reaction. Because it is desirable for an efficient oxygen evolution catalyst to contain numerous redox-active metal ions to support the four-electron water oxidation reaction, mixed metal oxides exhibit enhanced catalytic activity towards the oxygen evolution reaction compared to single metal oxide systems. A mixed metal oxide layer composed of nickel, iron, and chromium was deposited (doctor blade technique) on the surface of a fluorine-doped tin oxide coated glass slide (FTO). The oxide coating was obtained by heat treatment of aqueous precursor solutions of the corresponding salts. The as-prepared electrodes were photosensitive and acted as efficient oxygen evolution catalysts. Our results showed that electrodes obtained by this method can be activated, which leads to higher current densities. The recorded current and photocurrent associated with the oxygen evolution process were at least two orders of magnitude higher in the presence of the oxide layer compared to the bare FTO electrode. The overpotential of the process is low (ca. 0.2 V). We have also checked the activity of the catalyst on different known photoanodes used in sun-driven water splitting. Herein, we demonstrate that efficient oxygen evolution catalysts can be achieved using relatively cheap precursors consisting of earth-abundant metals and a simple method of preparation.

Keywords: chromium, electrocatalysis, iron, metal oxides, nickel, oxygen evolution

Procedia PDF Downloads 205
23603 Analysis of Particulate Matter Concentration, EC, OC Emission and Elemental Composition for Biodiesel-Fuelled Diesel Engine

Authors: A. M. Ashraful, H .H. Masjuki, M. A. Kalam

Abstract:

Comparative investigations were performed on the particulate matter emitted from a DI diesel engine utilizing palm biodiesel. In this experiment, the exhaust of palm biodiesel blends PB10 (90% diesel, 10% palm biodiesel) and PB20 (80% diesel, 20% palm biodiesel) and of diesel fuel was investigated at different working conditions (25% and 50% load at a constant speed of 1500 rpm). The observations clearly show that, at low load, the particulate matter concentration of palm biodiesel exhaust was lower than that of diesel fuel. At no load and 25% load, the PB10 blend exhibited a PM concentration 2.2 times lower than that of diesel fuel. On the other hand, elemental carbon (EC) and organic emissions for PB10 showed a decreasing trend, varying from 4.2% to 6.6% and from 32% to 39%, respectively, while the elemental carbon percentage increased by 0.85% to 10%. Similarly, the metal composition of the PB10 blend increased by 4.8% to 26.5%. SEM images for PB10 and PB20 showed particulates with a granular structure and larger grain sizes compared with diesel fuel. Finally, the experimental outcomes showed that the blend composition and the degree of unsaturation of the methyl esters present in biodiesel influence particulate matter formation.

Keywords: particulate matter, elemental carbon, organic carbon, biodiesel

Procedia PDF Downloads 383
23602 Anti Staphylococcus aureus and Methicillin Resistant Staphylococcus aureus Action of Thermophilic Fungi Acrophialophora levis IBSD19 and Determination of Its Mode of Action Using Electron Microscopy

Authors: Shivankar Agrawal, Indira Sarangthem

Abstract:

Staphylococcus aureus and Methicillin-resistant Staphylococcus aureus (MRSA) remain among the major causes of healthcare-associated and community-onset infections worldwide. Hence, the search for non-toxic natural compounds with antibacterial activity has intensified for future drug development. The exploration of less studied niches of Earth can greatly increase the possibility of discovering novel bioactive compounds. Therefore, in this study, the cultivable fraction of fungi from the sediments of natural hot springs has been studied to mine potential fungal candidates with antibacterial activity against the human pathogens Staphylococcus aureus and Methicillin-resistant Staphylococcus aureus. We isolated diverse strains of thermophilic fungi from a collection of sediment samples. Following a standard method, we isolated a promising thermophilic fungus, strain IBSD19, identified as Acrophialophora levis, possessing the potential to produce an anti-Staphylococcus aureus agent. The growth conditions were optimized and scaled up to fermentation, and the resulting extract was subjected to chemical extraction. The ethyl acetate fraction was found to display significant activity against Staphylococcus aureus and MRSA, with minimum inhibitory concentrations (MIC) of 0.5 mg/ml and 4 mg/ml, respectively. A cell membrane integrity assay and SEM suggested that the fungal metabolites cause bacterial clustering and subsequent cell lysis.

Keywords: antibacterial activity, antioxidant, fungi, Staphylococcus aureus, MRSA, thermophiles

Procedia PDF Downloads 129
23601 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process, and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 105
23600 Development and Application of an Intelligent Masonry Modulation in BIM Tools: Literature Review

Authors: Sara A. Ben Lashihar

Abstract:

Heritage building information modelling (HBIM) of historical masonry buildings has expanded lately to meet urgent needs for conservation and structural analysis. Masonry structures are unique features of ancient building architecture worldwide, with special cultural, spiritual, and historical significance. However, there is a research gap regarding the reliability of the HBIM modeling process for these structures. The HBIM modeling process for masonry structures faces significant challenges due to the inherent complexity and uniqueness of their structural systems. Most of these processes are based on tracing point clouds and rarely follow documents, archival records, or direct observation. The results of these techniques are highly abstracted models whose accuracy does not exceed LOD 200. Masonry assemblages, especially curved elements such as arches, vaults, and domes, are generally modeled with standard BIM components or in-place models, and the brick textures are input graphically. Hence, further investigation is necessary to establish a methodology for automatically generating parametric masonry components. Such components would be developed algorithmically according to mathematical and geometric accuracy and the validity of the survey data. The main aim of this paper is to provide a comprehensive review of the state of the art of existing research on HBIM modeling of masonry structural elements and the latest approaches for achieving parametric models that have both visual fidelity and high geometric accuracy. The paper reviewed more than 800 articles, proceedings papers, and book chapters focused on the keywords "HBIM and Masonry" from 2017 to 2021. The studies were downloaded from well-known, trusted bibliographic databases such as Web of Science, Scopus, Dimensions, and Lens. As a starting point, a scientometric analysis was carried out using the VOSviewer software. This software extracts the main keywords in these studies to retrieve the relevant works and calculates the strength of the relationships between these keywords. Subsequently, an in-depth qualitative review followed, covering the studies with the highest frequency of occurrence and the strongest links to the topic according to the VOSviewer results. The qualitative review focused on the latest approaches and the future directions proposed in this research. The findings of this paper can serve as a valuable reference for researchers and BIM specialists seeking to build more accurate and reliable HBIM models of historic masonry buildings.

Keywords: HBIM, masonry, structure, modeling, automatic, approach, parametric

Procedia PDF Downloads 162
23599 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ device is designed around a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research applied knowledge of data acquisition and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, together with an artificial intelligence approach (Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) to evaluate its performance. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions to collect data (temperature, relative humidity, and pressure). The acquired data were used to train ANN and ARIMA models in the MATLAB R2012b environment to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R²), and Mean Percentage Error (MPE) were deployed as standardized measures of the models' performance in predicting precipitation. The results show that the developed device has an efficiency of 96% and is compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) had a disparity error of 1.59%, while that of ARIMA was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
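For reference, the four evaluation measures named in the abstract can be computed as in the sketch below; the rainfall figures in the example are hypothetical, and the MPE formula assumes no zero observations.

```python
import numpy as np

def evaluate(observed, predicted):
    # Standard error measures used to compare the ANN and ARIMA rainfall predictions.
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = observed - predicted
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((observed - observed.mean()) ** 2)
    mpe = np.mean(err / observed) * 100.0      # assumes no zero rainfall observations
    return {"RMSE": rmse, "MAE": mae, "R2": r2, "MPE": mpe}

# hypothetical monthly rainfall totals (mm): observed vs. model output
print(evaluate([120.4, 98.7, 60.2], [122.3, 96.1, 63.0]))
```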

Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device

Procedia PDF Downloads 85
23598 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome

Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco

Abstract:

Today, a considerable segment of the world's population lives in urban areas, and this proportion will vastly increase in the coming decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will expand at an exponential rate over the following years. The analysis of various types of data sets and their derived applications has incredible potential across different crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogue techniques of data collection. This paper investigates the potential of the data science field, which appears to be a formidable resource for assisting city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of different new layers of information would definitely enhance planners' capabilities to comprehend urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues in greater depth. Specifically, the research results correlate economic, commercial, demographic, and housing data with the purpose of defining the youth economic discomfort index. The statistical composite index provides insights regarding the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly show that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground for the whole investigation. The methodology applies statistical and spatial analysis to construct a composite index supporting informed, data-driven decisions for urban planning.
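A minimal sketch of how a composite index of this kind is typically assembled: each indicator is standardized and the standardized scores are combined with equal weights. The indicator names, values, and weighting scheme below are hypothetical and are not taken from the Rome study.

```python
import pandas as pd

# hypothetical indicators per urban zone; names and values are illustrative only
zones = pd.DataFrame({
    "youth_unemployment_rate": [0.18, 0.25, 0.31],
    "median_rent_to_income":   [0.42, 0.35, 0.51],
    "neet_share":              [0.12, 0.20, 0.28],
}, index=["Zone A", "Zone B", "Zone C"])

# z-score standardisation so indicators on different scales become comparable
z = (zones - zones.mean()) / zones.std(ddof=0)

# equal-weight composite: higher values indicate greater youth economic discomfort
zones["discomfort_index"] = z.mean(axis=1)
print(zones.sort_values("discomfort_index", ascending=False))
```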

Keywords: data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index

Procedia PDF Downloads 127
23597 AI-Based Technologies in International Arbitration: An Exploratory Study on the Practicability of Applying AI Tools in International Arbitration

Authors: Annabelle Onyefulu-Kingston

Abstract:

One of the major purposes of AI today is to evaluate and analyze millions of micro and macro data points in order to determine what is relevant in a particular case and present it in an adequate manner. Microdata, as far as it relates to AI in international arbitration, consists of the millions of key issues specifically mentioned by one or both parties, or by their counsel, arbitrators, or arbitral tribunals, in arbitral proceedings. These can include the qualifications of expert witnesses and the admissibility of evidence, amongst others. Macro data, on the other hand, refers to data derived from the resolution of the dispute and, consequently, the final and binding award. Notable examples include the rationale of the award and the specific and general damages awarded, amongst others. This paper aims to critically evaluate and analyze the possibility of technological inclusion in international arbitration. This research employs the qualitative method by evaluating existing literature on the consequences of applying AI to both micro and macro data in international arbitration, and how this can be of assistance to parties, counsel, and arbitrators.

Keywords: AI-based technologies, algorithms, arbitrators, international arbitration

Procedia PDF Downloads 80
23596 Assessment of Records Management in Registry Department of Kebbi State University of Science and Technology, Aliero Nigeria

Authors: Murtala Aminu, Salisu Adamu Aliero, Adamu Muhammed

Abstract:

Records are a vital asset in ensuring that an institution is governed effectively and efficiently and is accountable to its staff, students, and the community that it serves. The major purpose of this study was to assess records management in the registry department of Kebbi State University of Science and Technology, Aliero. To achieve this objective, research questions were formulated and answered, centered on records creation, records management policy, and the challenges facing records management. The review of related literature revealed that records need to be properly managed and that, to this end, a good records management policy is required that clearly spells out the various programs needed for effective records management. A survey research method was used, involving a questionnaire and observation. The findings revealed that the registry department of the University still has a long way to go with respect to day-to-day records management. The study recommended the provision of adequate, modern, safe, and functional storage facilities; sufficient and regular funding; recruitment of trained personnel; on-the-job training for existing staff; computerization of all units' records; and an uninterrupted power supply to all parts of the unit as a means of ensuring proper records management.

Keywords: records, management, records management policy, registry

Procedia PDF Downloads 310
23595 A Virtual Grid Based Energy Efficient Data Gathering Scheme for Heterogeneous Sensor Networks

Authors: Siddhartha Chauhan, Nitin Kumar Kotania

Abstract:

Traditional Wireless Sensor Networks (WSNs) generally use static sinks to collect data from the sensor nodes via multiple forwarding. As a result, the network suffers from problems such as long message relay times and bottlenecks, which reduce its performance. Many approaches have been proposed to address these problems with the help of a mobile sink that collects data from the sensor nodes, but these approaches still suffer from the buffer overflow problem due to the limited memory of sensor nodes. This paper proposes an energy-efficient scheme for data gathering that overcomes the buffer overflow problem. The proposed scheme creates a virtual grid structure of heterogeneous nodes and is designed for sensor nodes with variable sensing rates. Every node determines its buffer overflow time, and cluster heads are elected on this basis. A controlled traversing approach is used by the proposed scheme in order to transmit data to the sink. The effectiveness of the proposed scheme is verified by simulation.
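The sketch below shows one plausible reading of the buffer-overflow-time idea: each node estimates how long until its buffer fills at its own sensing rate, and the node closest to overflowing in each virtual grid cell is elected cluster head. The election rule, the fields, and the numbers are assumptions for illustration, not the paper's exact scheme.

```python
from dataclasses import dataclass

@dataclass
class SensorNode:
    node_id: int
    grid_cell: tuple      # (row, col) of the virtual grid cell the node falls into
    buffer_size: int      # packets the node can hold
    buffered: int         # packets currently stored
    sensing_rate: float   # packets generated per second (heterogeneous across nodes)

    def overflow_time(self) -> float:
        # Seconds until the buffer overflows at the current sensing rate.
        return (self.buffer_size - self.buffered) / self.sensing_rate

def elect_cluster_heads(nodes):
    # One plausible rule: per grid cell, pick the node closest to overflowing as cluster head.
    heads = {}
    for node in nodes:
        best = heads.get(node.grid_cell)
        if best is None or node.overflow_time() < best.overflow_time():
            heads[node.grid_cell] = node
    return heads

nodes = [SensorNode(1, (0, 0), 100, 40, 2.0), SensorNode(2, (0, 0), 100, 10, 0.5)]
print({cell: n.node_id for cell, n in elect_cluster_heads(nodes).items()})
```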

Keywords: buffer overflow problem, mobile sink, virtual grid, wireless sensor networks

Procedia PDF Downloads 381
23594 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Today the world of research enjoys abundant data, available in virtually any field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supportive of an in-depth look at any decision-making process. When well used, big data affords its users the opportunity to produce substantially well-supported results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents and their casualties and fatalities, based on multiple factors including age, gender, and the locations where accidents occur. Multiple technologies were used in combination to produce an Information Communication Technology (ICT) based solution with embedded technology. These technologies principally include Geographic Information Systems (GIS), the Orange data mining software, and Bayesian statistics, to name a few. The study uses the Leeds 2016 accident dataset to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors optimistically believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing population centers, where motor-based mobility is increasing considerably.

Keywords: accident factors, geographic information system, information communication technology, mobility

Procedia PDF Downloads 203
23593 Analysis of ECGs Survey Data by Applying Clustering Algorithm

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for the detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm combined with fuzzy c-means. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed study and for many other purposes.

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 347
23592 The Impact of Motivation on Employee Performance in South Korea

Authors: Atabong Awung Lekeazem

Abstract:

The purpose of this paper is to identify the impact or role of incentives on employees' performance, with a particular emphasis on Korean workers. The process involves defining and explaining the different types of motivation and, in doing so, bringing out the difference between the two major types. The second phase of the paper involves gathering data from a sample population and then analyzing the data. The analysis reveals the broadly similar value that Koreans attach to motivation, with a slightly different view coming only from top management personnel. The last phase presents the data and draws conclusions on how managers and potential managers can bring out the best in their employees.

Keywords: motivation, employee’s performance, Korean workers, business information systems

Procedia PDF Downloads 401
23591 Decision-Making using Fuzzy Linguistic Hypersoft Set Topology

Authors: Muhammad Saqlain, Poom Kumam

Abstract:

Language, being an abstract system and a creative act, is quite complicated, as its meaning varies depending on the context. The context is determined by the empirical knowledge of a person, which is derived from observation and experience. With further subdivided attributes, decision-making challenges may entail both quantitative and qualitative factors. However, because there is no norm for putting a numerical value on language, existing approaches cannot carry out the operations of linguistic knowledge. Assigning mathematical values (fuzzy, intuitionistic, and neutrosophic) to any decision-making problem without considering any rule of linguistic knowledge is ambiguous and inaccurate. Thus, this paper aims to provide a generic model for these issues. It provides the linguistic set structure of the fuzzy hypersoft set (FLHSS) to solve decision-making issues. We have proposed definitions of basic operations such as AND, OR, NOT, complement, and negation, along with a topology, examples, and properties. Secondly, operational laws for the fuzzy linguistic hypersoft set have been proposed to deal with decision-making issues. The proposed aggregation operators and operational laws can be used to convert linguistic quantifiers into numerical values. This will increase the accuracy and precision of the fuzzy hypersoft set structure in dealing with decision-making issues.

Keywords: linguistic quantifiers, aggregate operators, multi-criteria decision making (MCDM), fuzzy topology

Procedia PDF Downloads 91
23590 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlapping in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (i.e., the majority class) heavily exceeds the number of observations of the other class (i.e., the minority class). An overlapped dataset is one in which many observations are shared between the two classes. Imbalanced and overlapped data are frequently found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is a challenging issue because it degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts: non-overlapping, lightly overlapping, and severely overlapping, and applying a classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
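A simplified two-region sketch of the idea, assuming the overlap region is identified by proximity to a linear SVM boundary and handled by its own classifier; the paper's actual procedure uses three regions determined by the Hausdorff distance and a modified SVM margin, which this sketch does not reproduce.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

def split_and_classify(X, y, margin_width=1.0):
    # Train a linear SVM, treat points within `margin_width` of its boundary as the
    # overlap region, and fit a separate classifier to each region (simplified variant).
    # Assumes binary labels and that both regions end up non-empty.
    svm = SVC(kernel="linear").fit(X, y)
    overlap = np.abs(svm.decision_function(X)) < margin_width
    clf_clear = RandomForestClassifier(random_state=0).fit(X[~overlap], y[~overlap])
    clf_overlap = RandomForestClassifier(random_state=0).fit(X[overlap], y[overlap])

    def predict(X_new):
        in_overlap = np.abs(svm.decision_function(X_new)) < margin_width
        return np.where(in_overlap, clf_overlap.predict(X_new), clf_clear.predict(X_new))

    return predict
```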

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 304
23589 The Use of Music Therapy to Improve Non-Verbal Communication Skills for Children with Autism

Authors: Maria Vinca Novenia

Abstract:

The number of school-aged children with autism in Indonesia has been increasing each year. Autism is a developmental disorder that can be diagnosed in childhood, and one of its symptoms is a lack of communication skills. Music therapy is known as an effective treatment for children with autism: musical elements and structures create a good space for children with autism to express their feelings and communicate their thoughts. School-aged children are expected to be able to communicate non-verbally very well, but children with autism experience difficulties in communicating non-verbally. The aim of this research is to analyze the significance of music therapy treatment in improving non-verbal communication skills in children with autism. This research informs teachers and parents on how music can be used as a medium to communicate with children with autism. A qualitative method is used, and the results are described with the microanalysis technique, measured across the whole experiment from the hours of every week and the minutes of every session down to the seconds of every moment. The samples are four school-aged children with autism in the age range of six to 11 years old. The research was conducted over four months, starting with observation, interviews, literature research, and direct experiments. The results demonstrate that music therapy can be used effectively to support non-verbal communication in children with autism, reflected in changes in body gesture, eye contact, and facial expression.

Keywords: autism, improvisation, microanalysis, music therapy, nonverbal communication, school-aged

Procedia PDF Downloads 212
23588 Mapping of Geological Structures Using Aerial Photography

Authors: Ankit Sharma, Mudit Sachan, Anurag Prakash

Abstract:

Rapid growth in data acquisition technologies such as drones has led to advances in, and interest in, collecting high-resolution images of geological fields. While drones are advantageous in capturing a high volume of data in short flights, a number of challenges have to be overcome for efficient analysis of these data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images are not used because of problems with resolution and availability, the fact that images may have been captured a year or more earlier, the difficulty of capturing the exact scene required, night-vision limitations, and so on. The method combines advanced automated image interpretation technology with human data interaction to model structures. First, geological structures are detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified from a digital elevation model. Dip and dip direction can then be calculated from this information. The structural map is generated by following a specified methodology: choosing the appropriate camera, its mounting system, and the UAV design (based on the area and application); addressing challenges in airborne systems such as errors in image orientation, payload limits, mosaicking, georeferencing, and registration of different images; and finally applying the DEM. The paper shows the potential of using our method for accurate and efficient modeling of geological structures, particularly from data captured at remote, inaccessible, and hazardous sites.
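As an illustration of the dip calculation mentioned above, the sketch below fits a plane through three DEM points given as (easting, northing, elevation) and returns the dip angle and dip direction (azimuth from north, clockwise); the sample coordinates are hypothetical.

```python
import numpy as np

def dip_and_direction(p1, p2, p3):
    # Dip angle and dip direction of the plane through three DEM points (easting, northing, elevation).
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    if n[2] < 0:                              # orient the normal upwards
        n = -n
    dip = np.degrees(np.arctan2(np.hypot(n[0], n[1]), n[2]))
    dip_direction = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip, dip_direction

# three hypothetical points sampled from a drone-derived DEM (metres):
# the plane drops 5 m over 50 m towards the east, so dip ~5.7 degrees towards 090
print(dip_and_direction((0, 0, 100.0), (50, 0, 95.0), (0, 50, 100.0)))
```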

Keywords: digital elevation model, mapping, photogrammetric data analysis, geological structures

Procedia PDF Downloads 681
23587 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields, and these artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management, and historical disaster information can be retrieved using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from occurrence through progress, response, and planning. However, information about status control, response, and recovery from natural and social disaster events is mainly managed in structured and unstructured reports that exist as handouts or hard copies. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data as disaster information. In this paper, an Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images, and reports, whether printed or produced by scanners, into electronic documents. The converted disaster data are then organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information from unstructured data using OCR is an important element of smart disaster management. In this work, a character recognition rate of over 90% was achieved for Korean characters by using an upgraded OCR. The recognition rate depends on the fonts, sizes, and special symbols of the characters, and we improved it through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system, through which structured information is stored and retrieved across the entire disaster cycle, including historical disaster progress, damages, response, and recovery. The expected outcome of this research is its application to smart disaster management and decision making by combining artificial intelligence technologies with historical big data.
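A minimal sketch of the OCR-to-disaster-code step, assuming Tesseract is available through pytesseract with Korean language data installed; the keyword-to-code mapping is a toy illustration, and the paper's own OCR engine, machine-learning improvements, and full disaster code system are not reproduced here.

```python
import pytesseract
from PIL import Image

# toy keyword-to-code mapping; the real disaster code system is far richer than this
DISASTER_CODES = {
    "태풍": "NAT-01 (typhoon)",
    "호우": "NAT-02 (heavy rain)",
    "지진": "NAT-03 (earthquake)",
}

def digitise_report(image_path):
    # OCR a scanned Korean disaster report and attach any matching disaster codes.
    text = pytesseract.image_to_string(Image.open(image_path), lang="kor")
    codes = [code for keyword, code in DISASTER_CODES.items() if keyword in text]
    return {"raw_text": text, "codes": codes}

# record = digitise_report("scanned_report.png")   # hypothetical scanned hard copy
```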

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 121
23586 Ground-Structure Interaction Analysis of Aged Tunnels

Authors: Behrang Dadfar, Hossein Bidhendi, Jimmy Susetyo, John Paul Abbatangelo

Abstract:

Finding the structural demand under the various conditions that a structure may experience during its service life is an important step towards structural life-cycle analysis. In this paper, the structural demand for the precast concrete tunnel lining (PCTL) segments of Toronto's 60-year-old subway tunnels is investigated. Numerical modelling was conducted using FLAC3D, finite difference-based software capable of simulating ground-structure interaction and ground material flow in three dimensions. The specific structural details of the segmental tunnel lining, such as the convex shape of the PCTL segments at radial joints and the PCTL segment pockets, were considered in the numerical modelling. The model was also developed to accommodate the flexibility required for simulating the various deterioration scenarios, shapes, and patterns that have been observed over more than 20 years. The soil behavior was simulated using the plastic-hardening constitutive model of FLAC3D. The effects of tunnel depth, the coefficient of lateral earth pressure, and the deterioration patterns of the segments were studied. The structural capacity under various deterioration patterns and the existing loading conditions was evaluated using axial-flexural interaction curves developed for each deterioration pattern. The results were used to provide recommendations for the next phase of the tunnel lining rehabilitation program.

Keywords: precast concrete tunnel lining, ground-structure interaction, numerical modelling, deterioration, tunnels

Procedia PDF Downloads 157
23585 Engineered Reactor Components for Durable Iron Flow Battery

Authors: Anna Ivanovskaya, Alexandra E. L. Overland, Swetha Chandrasekaran, Buddhinie S. Jayathilake

Abstract:

Iron-based redox flow batteries (IRFBs) are promising for grid-scale storage because of their low cost and environmental safety. Earth-abundant iron can enable affordable grid storage that meets DOE's targets of a material cost below $20/kWh and a levelized cost of storage of $0.05/kWh. In conventional redox flow batteries, energy is stored in external electrolyte tanks, and electrolytes are circulated through the cell units to achieve electrochemical energy conversion. IRFBs, however, are hybrid battery systems in which metallic iron deposition at the negative side of the battery controls the storage capacity. This adds complexity to the design of the porous structure of 3D electrodes needed to achieve the desired high storage capacity. In addition, there is a need to control the parasitic hydrogen evolution reaction, which accompanies the metal deposition process, increases the pH, lowers the energy efficiency, and limits the durability. To achieve sustainable operation of IRFBs, the electrolyte pH, which affects the solubility of the reactants and the rate of parasitic reactions, needs to be dynamically readjusted. In the present study, we explore the impact of complexing agents on maintaining the solubility of the reactants, find the optimal electrolyte conditions and battery operating regime, which are specific to IRFBs with additives, and demonstrate robust operation.

Keywords: flow battery, iron-based redox flow battery, IRFB, energy storage, electrochemistry

Procedia PDF Downloads 73
23584 Predicting Seoul Bus Ridership Using Artificial Neural Network Algorithm with Smartcard Data

Authors: Hosuk Shin, Young-Hyun Seo, Eunhak Lee, Seung-Young Kho

Abstract:

Currently, in Seoul, users can avoid riding crowded buses thanks to the installation of the Bus Information System (BIS). The BIS provides three levels of on-board ridership information (spacious, normal, and crowded). However, the system has flaws because it reports only real-time conditions, which can provide incomplete information to the user. For example, a bus approaching a station may be shown as crowded on the BIS, but many people may get off at the stop where the user is waiting, meaning that for this station the information should instead show normal or spacious. To address this problem, this study predicts the bus ridership level using smart card data in order to provide more accurate information about passenger ridership on the bus. An Artificial Neural Network (ANN) is an interconnected group of nodes inspired by the human brain. Forecasting has been one of the major applications of ANNs due to the data-driven, self-adaptive nature of the algorithm. According to the results, the ANN algorithm was stable and robust with a fairly small error ratio, so the results were rational and reasonable.
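A minimal sketch of a ridership-level classifier of this kind, using scikit-learn's MLPClassifier on hypothetical smart-card-derived features; the feature set, network size, and random data are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# hypothetical features per bus arrival built from smart card transactions:
# [boardings so far, alightings so far, hour of day, day of week] - random stand-in data
rng = np.random.default_rng(0)
X = rng.random((1000, 4))
y = rng.choice(["spacious", "normal", "crowded"], size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))
```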

Keywords: smartcard data, ANN, bus, ridership

Procedia PDF Downloads 162
23583 Combination of Artificial Neural Network Model and Geographic Information System for Prediction Water Quality

Authors: Sirilak Areerachakul

Abstract:

Water quality has initiated serious management efforts in many countries. Artificial Neural Network (ANN) models have been developed as forecasting tools for predicting water quality trends based on historical data. This study endeavors to classify water quality automatically. The water quality classes are evaluated using six factor indices: pH value (pH), Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Nitrate Nitrogen (NO3N), Ammonia Nitrogen (NH3N), and Total Coliform (T-Coliform). The methodology involves applying data mining techniques using multilayer perceptron (MLP) neural network models. The data consist of 11 sites along the Saen Saep canal in Bangkok, Thailand, obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, during 2007-2011. The multilayer perceptron neural network achieves a high classification accuracy of 94.23% for the water quality of the Saen Saep canal in Bangkok. Combining this encouraging result with GIS data could subsequently improve the classification accuracy significantly.
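For illustration, a small scikit-learn pipeline that classifies water quality from the six indices listed above; the sample records, class labels, and network size are hypothetical and are not the Saen Saep data.

```python
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# hypothetical monitoring records: the six indices from the abstract plus a class label
df = pd.DataFrame({
    "pH":         [7.2, 6.8, 7.9, 6.5],
    "DO":         [5.1, 2.3, 6.7, 1.8],
    "BOD":        [3.0, 8.5, 1.9, 12.4],
    "NO3N":       [0.8, 2.1, 0.4, 3.3],
    "NH3N":       [0.2, 1.4, 0.1, 2.2],
    "T_Coliform": [900, 5200, 300, 11000],
    "class":      ["good", "poor", "good", "poor"],
})

pipe = make_pipeline(StandardScaler(),
                     MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
scores = cross_val_score(pipe, df.drop(columns="class"), df["class"], cv=2)
print("cross-validated accuracy:", scores.mean())
```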

Keywords: artificial neural network, geographic information system, water quality, computer science

Procedia PDF Downloads 335
23582 Reentrant Spin-Glass State Formation in Polycrystalline Er₂NiSi₃

Authors: Santanu Pakhira, Chandan Mazumdar, R. Ranganathan, Maxim Avdeev

Abstract:

Magnetically frustrated systems are of great interest and among the most appealing topics for researchers in condensed matter physics, due to their various interesting properties, viz., ground state degeneracy, finite entropy at zero temperature, lowering of the ordering temperature, etc. Ternary intermetallics with the composition RE₂TX₃ (RE = rare-earth element, T = d-electron transition metal, and X = p-electron element) crystallize in the hexagonal AlB₂-type crystal structure (space group P6/mmm). In a hexagonal crystal structure with antiferromagnetic interaction between the moments, the central moment is geometrically frustrated. Magnetic frustration, along with the disordered arrangement of non-magnetic ions, is the building block for metastable spin-glass ground state formation in most compounds of this stoichiometry. The newly synthesized compound Er₂NiSi₃ forms as a single phase in the AlB₂-type structure with space group P6/mmm. The compound orders antiferromagnetically below 5.4 K, and spin freezing of the frustrated magnetic moments occurs below 3 K. The compound shows magnetic relaxation behavior and a magnetic memory effect below its freezing temperature. Neutron diffraction patterns for temperatures below the spin freezing temperature have been analyzed using the FULLPROF software package. Diffuse magnetic scattering at low temperatures confirms spin-glass state formation in the compound.

Keywords: antiferromagnetism, magnetic frustration, spin-glass, neutron diffraction

Procedia PDF Downloads 256
23581 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm

Authors: Ping Bo, Meng Yunshan

Abstract:

Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, the disadvantage of SST data is a high percentage of missing data, mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing missing data and has been widely used in the oceanographic field. The reconstruction of SST images within a long time series using DINEOF can cause large discontinuities, and one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Building on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous research, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the presented algorithm also takes into account the temporal relationship between the two consecutive images used in the filter; for example, two images from the same season are more likely to be correlated than images from different seasons, so the latter are weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning 1989 to 2006. The results obtained from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
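The sketch below illustrates the seasonal weighting idea: a Laplacian-style smoothing pass over the temporal covariance matrix in which the coupling between consecutive time steps is scaled by a month-distance weight. The weighting function, filter strength, and toy matrix are assumptions for illustration, not the paper's exact filter.

```python
import numpy as np

def seasonal_weight(month_i, month_j):
    # Down-weight pairs of images from different seasons (illustrative scheme only).
    d = min(abs(month_i - month_j), 12 - abs(month_i - month_j))   # cyclic month distance
    return np.exp(-(d / 3.0) ** 2)

def filter_temporal_covariance(C, months, alpha=0.1):
    # One Laplacian-style smoothing pass over the temporal covariance matrix C, where the
    # coupling of each time step with its neighbours is scaled by seasonal similarity.
    Cf = C.copy()
    for t in range(1, C.shape[0] - 1):
        w_prev = seasonal_weight(months[t], months[t - 1])
        w_next = seasonal_weight(months[t], months[t + 1])
        Cf[t] = C[t] + alpha * (w_prev * (C[t - 1] - C[t]) + w_next * (C[t + 1] - C[t]))
    return 0.5 * (Cf + Cf.T)          # keep the filtered matrix symmetric

months = np.arange(1, 13)             # one image per month (hypothetical)
C = np.eye(12) + 0.1                  # toy temporal covariance matrix
C_filtered = filter_temporal_covariance(C, months)
```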

Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter

Procedia PDF Downloads 321
23580 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency

Authors: Fanqiang Kong, Chending Bian

Abstract:

In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint sparsity is the first property, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank deficiency, whereby the number of endmembers present in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms state-of-the-art algorithms with better spectral unmixing accuracy.
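For context, the two proximal operators that a variable-splitting / augmented-Lagrangian solver of this kind would typically alternate are sketched below: singular value thresholding for the nuclear norm (rank deficiency) and row-wise shrinkage for joint sparsity (here with p = 1 in the l2,p-norm). This is not the paper's full algorithm.

```python
import numpy as np

def singular_value_thresholding(M, tau):
    # Proximal operator of the nuclear norm: shrink the singular values, enforcing low rank.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def row_soft_thresholding(M, tau):
    # Row-wise l2 shrinkage (p = 1): whole abundance rows shared by neighbouring pixels
    # are kept or discarded together, encouraging joint sparsity.
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return M * scale

# toy abundance matrix (endmembers x pixels), hypothetical values
A = np.random.rand(6, 20)
A_low_rank = singular_value_thresholding(A, tau=0.5)
A_joint_sparse = row_soft_thresholding(A, tau=0.5)
```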

Keywords: hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation

Procedia PDF Downloads 248