Search results for: data source
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28358

25628 Social Work Advocacy Regarding Equitable Hiring Of Latinos

Authors: Roberto Lorenzo

Abstract:

Much has been said about the dynamics of the Latin American experience in the United States; however, there is very little data regarding the perception of career identity. Although some Latinos are represented within the professional ranks, their numbers are not nearly enough to claim that sufficient cultural competence has been practiced to create equity in the professional sphere in the United States. This thesis provides labor force statistics highlighting the industries in which Latin Americans are concentrated, and cites data suggesting a further need for cultural competence within the professional realm regarding Latin Americans. In addition, methods discussed over the course of our social work education are examined in order to connect them to possible solutions to this issue.

Keywords: hiring, Latinos, professional equity, cultural competence

Procedia PDF Downloads 20
25627 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ device is designed around a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research applied knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device that uses an LM35 sensor to measure weather parameters, together with an artificial intelligence approach (Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard instrument in the Department of Meteorology, Federal University of Technology, Akure (FUTA), to evaluate its performance. Both devices (standard and designed) were exposed to the same atmospheric conditions for 180 days to collect data (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), R-squared (R²), and Mean Percentage Error (MPE) were used as standard evaluation metrics to assess the performance of the models in predicting precipitation. The results show that the developed device has an efficiency of 96% and is compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN precipitation (rainfall) predictions for two months (May and June 2017) had a disparity error of 1.59%, while the ARIMA error was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
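
The following is a minimal sketch of the ARIMA-plus-error-metrics workflow described above, written in Python rather than the authors' MATLAB environment; the ARIMA order (1, 1, 1) and the synthetic rainfall series are assumptions for illustration only.

```python
# Hedged sketch: fit an ARIMA model to a rainfall series and score it with RMSE,
# MAE, and MPE, analogous to the evaluation described in the abstract.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
rainfall = 5 + np.sin(np.linspace(0, 12, 180)) + rng.normal(0, 0.5, 180)  # 180 synthetic daily values

train, test = rainfall[:150], rainfall[150:]
model = ARIMA(train, order=(1, 1, 1)).fit()      # statistical model (assumed order)
forecast = model.forecast(steps=len(test))       # predict the held-out days

rmse = mean_squared_error(test, forecast) ** 0.5
mae = mean_absolute_error(test, forecast)
mpe = 100 * np.mean((test - forecast) / test)    # mean percentage error
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}  MPE={mpe:.2f}%")
```

An ANN model would be evaluated on the same held-out period with the same metrics, so the two approaches can be compared directly.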

Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device

Procedia PDF Downloads 92
25626 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome

Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco

Abstract:

Today, a consistent segment of the world’s population lives in urban areas, and this proportion will increase vastly in the coming decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will expand at an exponential rate during the following years. The analysis of various types of data sets and their derived applications has incredible potential across crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogue techniques of data collection. This paper investigates the prospects of the data science field, which appears to be a formidable resource to assist city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of different new layers of information would definitely enhance planners' capabilities to comprehend urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues in more depth. Specifically, the research correlates economic, commercial, demographic, and housing data with the purpose of defining a youth economic discomfort index. The statistical composite index provides insights regarding the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly show that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground of the whole investigation. The methodology applies statistical and spatial analysis to construct a composite index supporting informed, data-driven decisions for urban planning.
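
As a minimal sketch of how such a statistical composite index can be assembled, the snippet below standardizes several indicators per urban zone and averages them; the indicator names, zone names, and equal weights are hypothetical, not the index actually used in the study.

```python
# Hedged sketch of a composite index: z-score each indicator and combine with
# equal weights per zone. The paper's actual indicators and weights may differ.
import pandas as pd

zones = pd.DataFrame({
    "zone": ["Centro", "Tiburtina", "EUR"],
    "youth_unemployment": [0.18, 0.24, 0.12],   # share of 18-29-year-olds unemployed
    "rent_to_income": [0.45, 0.30, 0.35],       # housing cost burden
    "neet_rate": [0.20, 0.26, 0.15],            # not in education/employment/training
})

indicators = zones.drop(columns="zone")
z_scores = (indicators - indicators.mean()) / indicators.std(ddof=0)  # standardize
zones["discomfort_index"] = z_scores.mean(axis=1)                     # equal-weight composite
print(zones[["zone", "discomfort_index"]].sort_values("discomfort_index", ascending=False))
```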

Keywords: data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index

Procedia PDF Downloads 135
25625 AI-Based Technologies in International Arbitration: An Exploratory Study on the Practicability of Applying AI Tools in International Arbitration

Authors: Annabelle Onyefulu-Kingston

Abstract:

One of the major purposes of AI today is to evaluate and analyze millions of micro and macro data points in order to determine what is relevant in a particular case and present it in an adequate manner. Microdata, as it relates to AI in international arbitration, consists of the millions of key issues specifically mentioned by one or both parties, or by their counsel, the arbitrators, or the arbitral tribunals in arbitral proceedings; examples include the qualifications of expert witnesses and the admissibility of evidence, amongst others. Macro data, on the other hand, refer to data derived from the resolution of the dispute and, consequently, from the final and binding award; notable examples include the rationale of the award and the specific and general damages awarded, amongst others. This paper aims to critically evaluate and analyze the possibility of technological inclusion in international arbitration. The research employs a qualitative method, evaluating existing literature on the consequences of applying AI to both micro and macro data in international arbitration and on how this can assist parties, counsel, and arbitrators.

Keywords: AI-based technologies, algorithms, arbitrators, international arbitration

Procedia PDF Downloads 95
25624 A Virtual Grid Based Energy Efficient Data Gathering Scheme for Heterogeneous Sensor Networks

Authors: Siddhartha Chauhan, Nitin Kumar Kotania

Abstract:

Traditional Wireless Sensor Networks (WSNs) generally use a static sink to collect data from the sensor nodes via multi-hop forwarding. As a result, the network suffers from problems such as long message relay times and bottlenecks, which reduce its performance. Many approaches have been proposed to address this problem with the help of a mobile sink that collects data from the sensor nodes, but these approaches still suffer from buffer overflow due to the limited memory of sensor nodes. This paper proposes an energy-efficient data gathering scheme that overcomes the buffer overflow problem. The proposed scheme creates a virtual grid structure of heterogeneous nodes and is designed for sensor nodes with variable sensing rates. Every node computes its buffer overflow time, and cluster heads are elected on this basis. The proposed scheme uses a controlled traversal approach to transmit data to the sink. The effectiveness of the proposed scheme is verified by simulation.
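
The sketch below illustrates the two quantities the abstract relies on: a node's buffer overflow time derived from its free buffer and sensing rate, and a per-grid-cell cluster-head election. The election rule used here (the most urgent node, i.e., earliest overflow, becomes head) is an assumption for illustration; the paper's actual rule may differ.

```python
# Hedged sketch: buffer overflow time and per-cell cluster-head election.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    cell: tuple              # (row, col) of the virtual grid cell
    buffer_free_bytes: int
    sensing_rate_bps: float  # bytes generated per second

    def overflow_time(self) -> float:
        return self.buffer_free_bytes / self.sensing_rate_bps

nodes = [
    Node(1, (0, 0), 4096, 16.0),
    Node(2, (0, 0), 2048, 32.0),
    Node(3, (0, 1), 4096, 8.0),
]

cluster_heads = {}
for n in nodes:
    best = cluster_heads.get(n.cell)
    if best is None or n.overflow_time() < best.overflow_time():
        cluster_heads[n.cell] = n   # most urgent node in the cell becomes head

for cell, head in cluster_heads.items():
    print(cell, "-> head node", head.node_id, f"(overflow in {head.overflow_time():.0f}s)")
```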

Keywords: buffer overflow problem, mobile sink, virtual grid, wireless sensor networks

Procedia PDF Downloads 391
25623 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Today the world of research enjoys abundant data, available in virtually any field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supporting an in-depth look at any decision-making process, and, when well used, big data affords its users the opportunity to produce substantially well-supported results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents and their casualties and fatalities, based on multiple factors including age, gender, and the locations at which accidents occur. Multiple technologies were combined to produce an Information Communication Technology (ICT) based solution with embedded technology, principally Geographic Information Systems (GIS), the Orange data mining software, and Bayesian statistics, to name a few. The study uses the Leeds 2016 accident data to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing, densely populated areas, where motor-based mobility is increasing considerably.
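
As a minimal sketch of the Bayesian-statistics ingredient mentioned above, the snippet below trains a Naive Bayes classifier on categorical accident attributes to predict severity; the column names and the stand-in records are assumptions, since the Leeds 2016 dataset uses its own field names.

```python
# Hedged sketch: a Naive Bayes classifier over accident records.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder

df = pd.DataFrame({  # stand-in for accident records (hypothetical fields)
    "age_band": ["18-25", "26-45", "46-65", "18-25", "26-45", "46-65"] * 20,
    "sex":      ["M", "F", "M", "F", "M", "F"] * 20,
    "light":    ["day", "night", "day", "night", "day", "night"] * 20,
    "severity": ["slight", "serious", "slight", "serious", "slight", "slight"] * 20,
})

X = OrdinalEncoder().fit_transform(df[["age_band", "sex", "light"]])
y = df["severity"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = CategoricalNB().fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 2))
```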

Keywords: accident factors, geographic information system, information communication technology, mobility

Procedia PDF Downloads 208
25622 Emotions Triggered by Children’s Literature Images

Authors: Ana Maria Reis d'Azevedo Breda, Catarina Maria Neto da Cruz

Abstract:

Images and illustrations play an increasingly relevant role in communicating meaning and triggering emotions in contemporary texts, regardless of the age group for which they are intended or the nature of the texts that host them. It is no coincidence that children's books are full of illustrations and that the image-to-text ratio decreases as the target age group grows. The vast majority of children's books can be considered multimodal texts containing text and images/illustrations that interact with each other to provide the young reader with a broader and more creative understanding of the book's narrative. This interaction is very diverse, ranging from images/illustrations that are not essential for understanding the story to those that contribute significantly to its meaning. Usually, these books are also read by adults, namely by parents, educators, and teachers, who act as mediators between the book and the children, explaining aspects that are, or seem to be, too complex for the child's context. It should be noted that there are books labeled as children's books that are clearly intended for both children and adults. In this work, following a qualitative and interpretative methodology based on written productions, participant observation, and field notes, we describe the perceptions of future teachers of the 1st cycle of basic education, attending a master's degree at a Portuguese university, about the role of images in literary and non-literary texts, namely in mathematical texts, and how these can constitute precious resources for emotional regulation and for the design of creative didactic situations. The analysis of the collected data provided evidence of the evolution of the participants' perception of the crucial role of images in children's literature, not only as an emotional regulator for young readers but also as a creative source for the design of meaningful didactic situations that cross scientific areas other than the mother tongue, namely mathematics.

Keywords: children’s literature, emotions, multimodal texts, soft skills

Procedia PDF Downloads 94
25621 Disclosure Extension of Oil and Gas Reserve Quantum

Authors: Ali Alsawayeh, Ibrahim Eldanfour

Abstract:

This paper examines the extent of disclosure of oil and gas reserve quantum in the annual reports of international oil and gas exploration and production companies, particularly companies in untested international markets such as Canada, the UK, and the US, and seeks to determine the underlying factors that affect the level of disclosure on oil reserve quantum. The study is concerned with the usefulness of the disclosure of oil and gas reserve quantum to investors and other users. Given the primacy of the annual report (10-K) as a source of supplemental reserves data about the company and as the channel through which companies disseminate information about their performance, the annual reports for one year (2009) were the central focus of the study. This comparative study seeks to establish whether differences exist between the sample companies, based on the new disclosure requirements of the Securities and Exchange Commission (SEC) in respect of reserves classification and definition. The extent of reserve disclosure is measured and compared among the selected companies, and statistical analysis is performed to determine whether any differences exist in the extent of disclosure under the determinant variables. This study shows that some factors affect the extent of disclosure of reserve quantum in the above-mentioned countries, namely company size, leverage, and quality of auditor. Companies that provide reserve quantum in detail tend to be larger, the level of leverage affects companies' reserve quantum disclosure, and companies that provide detailed reserve quantum disclosure tend to employ a high-quality auditor. In addition, the study found Profit Sharing Contracts (PSC) to be a significant independent variable; this factor could explain variations in the level of disclosure of oil reserve quantum between the contractor and host governments. The implementation of the SEC oil and gas reporting requirements does not enhance companies' valuation, because the new rules are based only on past and present reserves information (proven reserves); hence, the future valuation of oil and gas companies is missing for the market.

Keywords: comparison, company characteristics, disclosure, reserve quantum, regulation

Procedia PDF Downloads 405
25620 Analysis of ECGs Survey Data by Applying Clustering Algorithm

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing a detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are first filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix, which selects relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
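
The sketch below shows a basic fuzzy c-means clustering of ECG-derived feature vectors, the fuzzy half of the clustering step described above; the feature values are synthetic stand-ins, and the paper additionally combines this with spectral clustering and discernibility-matrix feature selection.

```python
# Hedged sketch: fuzzy c-means on toy ECG features.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)             # membership degrees sum to 1
    for _ in range(n_iter):
        um = u ** m
        centroids = (um.T @ X) / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        u = 1.0 / (dist ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)          # re-normalize memberships
    return centroids, u

# toy ECG features: [heart rate (bpm), QRS duration (ms)]
X = np.array([[62, 90], [65, 95], [70, 88], [110, 140], [115, 150], [108, 135]], float)
centroids, memberships = fuzzy_c_means(X, c=2)
print("centroids:\n", centroids.round(1))
print("hard labels:", memberships.argmax(axis=1))
```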

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 351
25619 The Impact of Motivation on Employee Performance in South Korea

Authors: Atabong Awung Lekeazem

Abstract:

The purpose of this paper is to identify the impact of incentives on employee performance, with particular emphasis on Korean workers. The process involves defining and explaining the different types of motivation and, in doing so, bringing out the difference between the two major types. The second phase of the paper involves gathering data from a sample population and then analyzing the data. The analysis reveals the broadly similar value that Koreans attach to motivation, with a slightly different view coming only from top management personnel. The last phase presents the data and draws conclusions about how managers and potential managers can bring out the best in their employees.

Keywords: motivation, employee’s performance, Korean workers, business information systems

Procedia PDF Downloads 414
25618 Fabrication and Analysis of Vertical Double-Diffused Metal Oxide Semiconductor (VDMOS)

Authors: Deepika Sharma, Bal Krishan

Abstract:

In this paper, the structure of an N-channel VDMOS was designed and analyzed using Silvaco TCAD tools by varying the N+ source doping concentration, the P-body doping concentration, the gate oxide thickness, and the diffusion time. VDMOS devices are considered ideal power switches due to their high input impedance and fast switching speed. The performance of the device was analyzed from the Ids versus Vgs curve, and electrical characteristics such as the threshold voltage, gate oxide thickness, and breakdown voltage were extracted for the proposed device structures. The effect of the epitaxial layer on various parameters is also observed.
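
One generic way to extract a threshold voltage from an Ids-Vgs sweep, such as the one analyzed above, is linear extrapolation at the point of maximum transconductance; the sketch below demonstrates the idea on synthetic data, and is not the specific extraction routine used with the TCAD tools.

```python
# Hedged sketch: threshold-voltage extraction by linear extrapolation at max gm.
import numpy as np

vgs = np.linspace(0, 5, 501)
vth_true, k, a = 2.0, 1e-3, 0.1
ids = k * a * np.log1p(np.exp((vgs - vth_true) / a))   # toy device: smooth turn-on, ~linear above Vth

gm = np.gradient(ids, vgs)            # transconductance dIds/dVgs
i_max = np.argmax(gm)                 # bias point of maximum gm
# tangent at the max-gm point: Ids = gm*(Vgs - Vth)  =>  Vth = Vgs - Ids/gm
vth_extracted = vgs[i_max] - ids[i_max] / gm[i_max]
print(f"extracted Vth ~ {vth_extracted:.2f} V")
```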

Keywords: on-resistance, threshold voltage, epitaxial layer, breakdown voltage

Procedia PDF Downloads 327
25617 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (the major class) heavily exceeds the number of observations of the other class (the minor class). An overlapped dataset is one in which many observations are shared between the two classes. Imbalanced and overlapped data are frequently found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is challenging because it degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts (non-overlapping, lightly overlapping, and severely overlapping) and applying the classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
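
The sketch below illustrates the two ingredients named above on a toy imbalanced dataset: the Hausdorff distance between the two class point sets, and an SVM margin used to flag observations that sit in the overlap region. The margin threshold and the binary "overlapping" flag are simplifications of the paper's three-way split.

```python
# Hedged sketch: Hausdorff distance between classes and SVM-margin overlap flagging.
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, weights=[0.9, 0.1],  # imbalanced classes
                           class_sep=0.8, random_state=0)
major, minor = X[y == 0], X[y == 1]

h = max(directed_hausdorff(major, minor)[0], directed_hausdorff(minor, major)[0])
print("symmetric Hausdorff distance between classes:", round(h, 2))

svm = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
margin = np.abs(svm.decision_function(X))
overlapping = margin < 1.0                     # points inside the soft margin
print("share of observations flagged as overlapping:", round(overlapping.mean(), 2))
```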

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 308
25616 Effect of Mixture of Flaxseed and Pumpkin Seeds Powder on Hypercholesterolemia

Authors: Zahra Ashraf

Abstract:

Flax and pumpkin seeds are a rich source of unsaturated fatty acids, antioxidants, and fiber, known to have anti-atherogenic properties. Hypercholesterolemia is a state characterized by an elevated level of cholesterol in the blood. This research was designed to study the effect of a flax and pumpkin seed powder mixture on hypercholesterolemia and body weight. Albino rats were selected as a representative model for humans. Thirty male albino rats were divided into three groups: a control group, a CD-chol group (control diet + cholesterol) fed 1.5% cholesterol, and an FP-chol group (flaxseed and pumpkin seed powder + cholesterol) fed 1.5% cholesterol. The flax and pumpkin seed powders were mixed at a proportion of 5/1 (omega-3 to omega-6). Blood samples were collected to examine the lipid profile, and body weight was also measured; the data were subjected to analysis of variance. In the CD-chol group, body weight, plasma total cholesterol (TC), triacylglycerides (TG), LDL-C, and the LDL-to-HDL ratio increased significantly, with a decrease in plasma HDL (good cholesterol). In the FP-chol group, lipid parameters and body weight decreased significantly, with an increase in HDL and a decrease in LDL (bad cholesterol). The mean values of body weight, total cholesterol, triglycerides, low-density lipoprotein, and high-density lipoprotein in the FP-chol group were 240.66±11.35 g, 59.60±2.20 mg/dl, 50.20±1.79 mg/dl, 36.20±1.62 mg/dl, and 36.40±2.20 mg/dl, respectively. The flaxseed and pumpkin seed powder mixture, when given to hypercholesterolemic rats, reduced body weight, serum cholesterol, low-density lipoprotein, and triglycerides, while significantly increasing high-density lipoprotein. Our results suggest that the flax and pumpkin seed mixture has hypocholesterolemic effects, probably mediated by the polyunsaturated fatty acids (omega-3 and omega-6) present in the seed mixture.

Keywords: hypercolesterolemia, omega 3 and omega 6 fatty acids, cardiovascular diseases

Procedia PDF Downloads 420
25615 Mapping of Geological Structures Using Aerial Photography

Authors: Ankit Sharma, Mudit Sachan, Anurag Prakash

Abstract:

Rapid growth in data acquisition technologies such as drones has led to advances and interest in collecting high-resolution images of geological fields. While such platforms are advantageous in capturing a high volume of data in short flights, a number of challenges have to be overcome for efficient analysis of these data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images were not used because of their limitations: inadequate resolution, imagery that may be up to a year old, limited availability, difficulty in capturing the exact scene of interest, and poor night-time capability. The method combines advanced automated image interpretation technology with human interaction to model structures. Geological structures are first detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified from a digital elevation model; dip and dip direction can be calculated from this information. The structural map is generated by following a specified methodology: choosing the appropriate camera and camera mounting system, designing the UAV for the area and application, and addressing the challenges of airborne systems such as errors in image orientation, payload limits, mosaicking, georeferencing and registration of different images, and applying the DEM. The paper shows the potential of using our method for accurate and efficient modeling of geological structures, captured in particular from remote, inaccessible, and hazardous sites.
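
The sketch below shows one common way to derive dip and dip direction from a digital elevation model grid, as mentioned above: dip from the magnitude of the elevation gradient and dip direction as the azimuth of steepest descent. The tiny planar DEM and the 10 m cell size are assumptions for illustration.

```python
# Hedged sketch: dip (slope) and dip direction (aspect) from a DEM grid.
import numpy as np

cell = 10.0                                    # grid spacing in metres
x = np.arange(0, 100, cell)
y = np.arange(0, 100, cell)
X, Y = np.meshgrid(x, y)
dem = 0.2 * X + 0.1 * Y                        # a planar surface rising toward the NE

dz_dy, dz_dx = np.gradient(dem, cell)          # elevation gradients along rows (y) and columns (x)
dip = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))           # dip angle in degrees
dip_dir = (np.degrees(np.arctan2(dz_dx, dz_dy)) + 180) % 360  # azimuth of steepest descent, from north

print("dip (deg):", round(dip.mean(), 1))
print("dip direction (deg from north):", round(dip_dir.mean(), 1))
```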

Keywords: digital elevation model, mapping, photogrammetric data analysis, geological structures

Procedia PDF Downloads 686
25614 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the Fourth Industrial Revolution, various intelligent technologies have been developed in many fields, and these artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management and makes historical disaster information accessible through artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from occurrence through progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc., is mainly managed in structured and unstructured reports that exist as handouts or hard copies. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data as disaster information. In this paper, an Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images, and reports, whether printed or generated by scanners, into electronic documents. The converted disaster data are then organized as disaster information under a disaster code system and stored in a disaster database system. Gathering and creating disaster information based on OCR for unstructured data is an important element of smart disaster management. In this work, a character recognition rate of over 90% for Korean characters was achieved using an upgraded OCR; since the recognition rate depends on the font, size, and special symbols of the characters, it was improved through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system, which allows the structured information to be stored and retrieved across the entire disaster cycle, covering historical disaster progress, damage, response, and recovery. The expected effect of this research is that it can be applied to smart disaster management and decision-making by combining artificial intelligence technologies and historical big data.
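
A minimal sketch of the OCR-to-structured-record pipeline described above is shown below; the disaster-code values, report layout, and regular expressions are hypothetical, and the paper's actual code system and recognition engine will differ.

```python
# Hedged sketch: OCR a scanned disaster report and map it to a structured record.
import re
import pytesseract
from PIL import Image

def report_to_record(image_path: str) -> dict:
    text = pytesseract.image_to_string(Image.open(image_path), lang="kor+eng")
    record = {"raw_text": text, "disaster_code": None, "date": None, "damage_cost": None}

    if re.search(r"flood", text, re.IGNORECASE):
        record["disaster_code"] = "NAT-FLOOD"            # hypothetical code value
    date = re.search(r"\d{4}[.-]\d{1,2}[.-]\d{1,2}", text)
    if date:
        record["date"] = date.group()
    cost = re.search(r"([\d,]+)\s*KRW", text)
    if cost:
        record["damage_cost"] = int(cost.group(1).replace(",", ""))
    return record

# record = report_to_record("scanned_report.png")  # then insert into the disaster database
```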

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 129
25613 A Study on the Usage of Library versus the Internet as Sources of Information with Reference to the Undergraduate Students in the Faculties of Humanities, Social Sciences, Science and Commerce and Management in the University of Kelaniya

Authors: Dilini Bodhinayaka, Aunsha Sajeewanie Rubasinghe

Abstract:

The library of the University of Kelaniya plays a significant role in supporting the academic work of the university. As at July 2016, the library of the University of Kelaniya held 250,301 printed books, 2,157 CD-ROMs, 1,203 theses, and 800 non-book materials. Furthermore, the library subscribes to about 60 local journals and provides access to over 12,500 full-text academic journals and around 100,000 e-books. The library provides the services and resources that support teaching, research, and learning. On the other hand, undergraduate students have adopted and continue to use online information retrieval for their academic and research work. This study aims to compare the usage of the internet and the library among undergraduates in the faculties of Humanities, Social Sciences, Science, and Commerce & Management at the University of Kelaniya. The research also attempts to determine the factors behind students' enthusiasm for, or disinterest in, using the library and the internet. All undergraduate students in the university (8,440 students at the time of the study) were taken as the population, and a 15% sample was selected using the stratified sampling method. A total of 1,266 questionnaires were distributed among undergraduates of the above-mentioned faculties. The qualitative data were analyzed using descriptive statistical methods. The findings indicate that undergraduate students of the faculties of Humanities, Social Sciences, Science, and Commerce & Management use both the library and the internet to fulfill their information needs, but students in the faculties of Science and Commerce & Management use internet sources more than the library, whereas undergraduates in the faculties of Humanities and Social Sciences use the university library more frequently than the internet. Although the majority agreed that the internet is their most preferred source of information, they do not have adequate awareness of the internet resources available in the e-library of the University of Kelaniya.

Keywords: university libraries, University of Kelaniya, online resources, undergraduates in Sri Lanka

Procedia PDF Downloads 238
25612 Long-Range Transport of Biomass Burning Aerosols over South America: A Case Study in the 2019 Amazon Rainforest Wildfires Season

Authors: Angel Liduvino Vara-Vela, Dirceu Luis Herdies, Debora Souza Alvim, Eder Paulo Vendrasco, Silvio Nilo Figueroa, Jayant Pendharkar, Julio Pablo Reyes Fernandez

Abstract:

Biomass-burning episodes are quite common in the central Amazon rainforest and represent a dominant source of aerosols during the dry season, between August and October. The increase in the occurrence of fires in 2019 in the world’s largest biomes has captured the attention of the international community. In particular, a rare and extreme smoke-related event occurred in the afternoon of Monday, August 19, 2019, in the most populous city in the Western Hemisphere, the São Paulo Metropolitan Area (SPMA), located in southeastern Brazil. The sky over the SPMA suddenly blackened, with the day turning into night, as reported by several news media around the world. In order to clarify whether or not the smoke that plunged the SPMA into sudden darkness was related to wildfires in the Amazon rainforest region, a set of 48-hour simulations over South America were performed using the Weather Research and Forecasting with Chemistry (WRF-Chem) model at 20 km horizontal resolution, on a daily basis, during the period from August 16 to August 19, 2019. The model results were satisfactorily compared against satellite-based data products and in situ measurements collected from air quality monitoring sites. Although a very strong smoke transport coming from the Amazon rainforest was observed in the middle of the afternoon on August 19, its impact on air quality over the SPMA took place in upper levels far above the surface, where, conversely, low air pollutant concentrations were observed.

Keywords: Amazon rainforest, biomass burning aerosols, São Paulo metropolitan area, WRF-Chem model

Procedia PDF Downloads 139
25611 Predicting Seoul Bus Ridership Using Artificial Neural Network Algorithm with Smartcard Data

Authors: Hosuk Shin, Young-Hyun Seo, Eunhak Lee, Seung-Young Kho

Abstract:

Currently, in Seoul, users can avoid riding crowded buses thanks to the Bus Information System (BIS), which displays three levels of on-board ridership (spacious, normal, and crowded). However, because the system works in real time, it can provide incomplete information to the user. For example, a bus approaching a station may be shown as crowded on the BIS, yet many passengers may alight at the stop where the user is waiting, meaning the information at that station should instead show normal or spacious. To address this problem, this study predicts the bus ridership level using smart card data in order to provide more accurate information about passenger ridership on the bus. An Artificial Neural Network (ANN) is an interconnected group of nodes inspired by the human brain, and forecasting has been one of the major applications of ANNs due to the data-driven, self-adaptive nature of the algorithm. According to the results, the ANN was stable and robust with a fairly small error ratio, so the predictions were rational and reasonable.
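
The sketch below shows an ANN predicting the three BIS crowding levels from smart-card-derived counts; the feature set (boardings, alightings, hour of day), the synthetic data, and the network size are assumptions, and the study's actual inputs and architecture may differ.

```python
# Hedged sketch: an MLP classifying bus crowding level from smart-card counts.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
boardings = rng.integers(0, 40, n)       # smart-card tap-ins at preceding stops
alightings = rng.integers(0, 40, n)      # tap-outs
hour = rng.integers(5, 24, n)
onboard = np.clip(boardings - alightings + rng.integers(0, 20, n), 0, None)
levels = np.digitize(onboard, [15, 30])  # 0=spacious, 1=normal, 2=crowded

X = np.column_stack([boardings, alightings, hour])
X_tr, X_te, y_tr, y_te = train_test_split(X, levels, test_size=0.3, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", round(ann.score(X_te, y_te), 2))
```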

Keywords: smartcard data, ANN, bus, ridership

Procedia PDF Downloads 167
25610 Combination of Artificial Neural Network Model and Geographic Information System for Prediction Water Quality

Authors: Sirilak Areerachakul

Abstract:

Water quality has prompted serious management efforts in many countries. Artificial Neural Network (ANN) models have been developed as forecasting tools for predicting water quality trends based on historical data. This study endeavors to classify water quality automatically. The water quality classes are evaluated using six factor indices: pH value (pH), Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Nitrate Nitrogen (NO3-N), Ammonia Nitrogen (NH3-N), and Total Coliform (T-Coliform). The methodology involves applying data mining techniques using multilayer perceptron (MLP) neural network models. The data come from 11 sites along the Saen Saep canal in Bangkok, Thailand, obtained from the Department of Drainage and Sewerage of the Bangkok Metropolitan Administration during 2007-2011. The multilayer perceptron neural network achieved a high classification accuracy of 94.23% for the water quality of the Saen Saep canal in Bangkok. This encouraging result could subsequently be combined with GIS data to improve the classification accuracy significantly.
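
Below is a minimal sketch of an MLP classifier over the six indices listed above; the synthetic samples, the toy labeling rule, and the network size are assumptions, since the study trained on monitoring data from the Saen Saep canal.

```python
# Hedged sketch: an MLP classifying water quality from six indices.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 300
X = np.column_stack([
    rng.normal(7.0, 0.5, n),    # pH
    rng.normal(4.0, 1.5, n),    # DO (mg/L)
    rng.normal(10.0, 4.0, n),   # BOD (mg/L)
    rng.normal(1.0, 0.5, n),    # NO3-N (mg/L)
    rng.normal(2.0, 1.0, n),    # NH3-N (mg/L)
    rng.normal(5e4, 2e4, n),    # Total Coliform (MPN/100 mL)
])
y = (X[:, 1] < 4.0) & (X[:, 2] > 10.0)   # toy rule separating "poor" from "fair" quality

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
clf.fit(X, y)
print("training accuracy:", round(clf.score(X, y), 2))
```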

Keywords: artificial neural network, geographic information system, water quality, computer science

Procedia PDF Downloads 343
25609 Prospect for Peace: Criticism to Over-Focusing on Religion in Conflicts

Authors: Leyi Wang

Abstract:

The effect of religion on conflicts is usually over-emphasized. Religion is not the root cause of conflicts; there are always social, political, or economic factors driving their escalation. Meanwhile, religion's power to call adherents together is often utilized by political leaders as a tool to lend legitimacy to the initiation of violence and to mobilize the public during conflicts. The connections people perceive between religion and conflicts are largely illusory. Politicians use certain strategies to escalate conflicts into violence, and, consequently, there are proposals that try to limit religion's role in accelerating conflicts. This essay discusses the roles of religion in international relations and argues that religious difference is not the real source of conflicts around the globe, reviewing the relevant literature to establish the research background and the gap on this topic. The essay also suggests some measures for dealing with regional conflicts.

Keywords: religion, conflicts, criticism, international relations

Procedia PDF Downloads 185
25608 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm

Authors: Ping Bo, Meng Yunshan

Abstract:

Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, a disadvantage of SST data is the high percentage of missing values, mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing the missing data and has been widely used in the oceanographic field. The reconstruction of SST images within a long time series using DINEOF can cause large discontinuities, and one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Based on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous research, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the presented algorithm also considers the temporal relationship between the two images used in the filter: for example, two images in the same season are more likely to be correlated than two images in different seasons, so the latter pair is weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning 1989 to 2006. The results obtained from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
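
The sketch below illustrates the idea of smoothing a temporal covariance matrix with a Laplacian-style filter whose weights depend on seasonal proximity; the weighting function, the one-dimensional smoothing, and the toy anomaly matrix are assumptions for illustration rather than the paper's exact filter.

```python
# Hedged sketch: season-weighted temporal filtering of a covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
n_months = 48
sst_anom = rng.normal(size=(n_months, 500))          # toy time-by-space anomaly matrix
cov = sst_anom @ sst_anom.T / sst_anom.shape[1]      # temporal covariance (n_months x n_months)

months = np.arange(n_months) % 12
def season_weight(i: int, j: int) -> float:
    # same calendar month -> weight 1, opposite season -> weight near 0
    d = abs(months[i] - months[j])
    return 0.5 * (1 + np.cos(2 * np.pi * min(d, 12 - d) / 12))

filtered = cov.copy()
for i in range(1, n_months - 1):                     # smooth along one time dimension only, for brevity
    for j in range(n_months):
        w_prev, w_next = season_weight(i, i - 1), season_weight(i, i + 1)
        neighbours = w_prev * cov[i - 1, j] + w_next * cov[i + 1, j]
        filtered[i, j] = (cov[i, j] + 0.5 * neighbours) / (1 + 0.5 * (w_prev + w_next))

print("max change after filtering:", round(float(np.abs(filtered - cov).max()), 3))
```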

Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter

Procedia PDF Downloads 324
25607 Sparse Unmixing of Hyperspectral Data by Exploiting Joint-Sparsity and Rank-Deficiency

Authors: Fanqiang Kong, Chending Bian

Abstract:

In this work, we exploit two assumed properties of the abundances of the observed signatures (endmembers) in order to reconstruct the abundances from hyperspectral data. Joint sparsity is the first property of the abundances, which assumes that adjacent pixels can be expressed as different linear combinations of the same materials. The second property is rank deficiency: the number of endmembers participating in the hyperspectral data is very small compared with the dimensionality of the spectral library, which means that the abundance matrix of the endmembers is a low-rank matrix. These assumptions lead to an optimization problem for the sparse unmixing model that requires minimizing a combined l2,p-norm and nuclear norm. We propose a variable splitting and augmented Lagrangian algorithm to solve the optimization problem. Experimental evaluation carried out on synthetic and real hyperspectral data shows that the proposed method outperforms state-of-the-art algorithms with better spectral unmixing accuracy.
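
To illustrate the two regularizers above, the sketch below shows the two proximal operators that a variable-splitting / augmented Lagrangian scheme typically alternates between for this kind of model: row-wise group soft-thresholding for joint sparsity (the l2,1 special case of the l2,p norm) and singular value soft-thresholding for the nuclear norm. The parameter values and matrix sizes are illustrative only, and this is not the full iteration proposed in the paper.

```python
# Hedged sketch: proximal operators for joint sparsity and low rank.
import numpy as np

def prox_l21(X, tau):
    """Row-wise group soft-thresholding (promotes joint sparsity across pixels)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1 - tau / np.maximum(norms, 1e-12), 0)
    return X * scale

def prox_nuclear(X, tau):
    """Singular value soft-thresholding (promotes a low-rank abundance matrix)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0)) @ Vt

rng = np.random.default_rng(0)
abundances = rng.random((20, 50))       # 20 library endmembers x 50 pixels
print("rows kept after l2,1 prox:",
      int((np.linalg.norm(prox_l21(abundances, 3.0), axis=1) > 0).sum()))
print("rank after nuclear prox:", np.linalg.matrix_rank(prox_nuclear(abundances, 3.0)))
```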

Keywords: hyperspectral unmixing, joint-sparse, low-rank representation, abundance estimation

Procedia PDF Downloads 261
25606 Electronic Physical Activity Record (EPAR): Key for Data Driven Physical Activity Healthcare Services

Authors: Rishi Kanth Saripalle

Abstract:

Medical experts highly recommend including physical activity in everyone's daily routine, irrespective of gender or age, as it helps to improve or prevent various medical issues. At the same time, experts are diligently trying to provide healthcare services (interventions, plans, exercise routines, etc.) that promote healthy living and increase physical activity despite ever more hectic schedules. With the introduction of wearables, individuals are able to track, analyze, and visualize their daily physical activities. However, there seems to be no commonly agreed standard, in either structure or semantics, for representing, gathering, aggregating, and analyzing an individual's physical activity data from disparate sources (exercise plans, multiple wearables, etc.). This makes it highly impractical to develop data-driven physical activity applications and healthcare programs. Furthermore, the ability to integrate physical activity data into an individual's Electronic Health Record, to provide a holistic picture of that individual's health, still eludes the experts. This article identifies three primary reasons for this issue. First, there is no agreed standard, structural or semantic, for representing and sharing physical activity data across disparate systems. Second, many organizations (e.g., LA Fitness, Gold's Gym, etc.) and research-backed interventions and programs still rely primarily on paper or unstructured formats (such as text or notes) to keep track of the data generated from physical activities. Finally, most wearable devices operate in silos. This article identifies the underlying problem, explores the idea of reusing existing standards, and identifies the essential modules required to move forward.
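
As one possible illustration of what a shared structured representation could look like, the sketch below defines a record type that disparate sources (wearables, exercise plans) could map into; the field names and coding are hypothetical and are not a published standard or the article's proposed schema.

```python
# Hedged sketch: a hypothetical structured physical activity record.
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import Optional

@dataclass
class PhysicalActivityRecord:
    person_id: str
    source: str                # e.g., "wearable:acme-watch", "plan:gym-program"
    activity_code: str         # e.g., "walking", "running" (would map to a code system)
    start: datetime
    duration_min: float
    steps: Optional[int] = None
    heart_rate_avg: Optional[int] = None
    energy_kcal: Optional[float] = None

record = PhysicalActivityRecord(
    person_id="p-001", source="wearable:acme-watch", activity_code="walking",
    start=datetime(2023, 5, 1, 7, 30), duration_min=42.0, steps=4800, heart_rate_avg=96,
)
print(asdict(record))   # serializable form that could be aggregated or attached to an EHR
```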

Keywords: electronic physical activity record, physical activity in EHR EIM, tracking physical activity data, physical activity data standards

Procedia PDF Downloads 282
25605 A Practical Approach Towards Disinfection Challenges in Sterile Manufacturing Area

Authors: Doris Lacej, Eni Bushi

Abstract:

Cleaning and disinfection procedures are essential for maintaining the cleanliness of the pharmaceutical manufacturing environment, particularly of the cleanrooms and the sterile unit area. The Good Manufacturing Practice (GMP) Annex 1 guideline strictly requires the implementation of standard, validated cleaning and disinfection protocols. However, environmental monitoring has shown that even a validated cleaning method with certified agents may result in colonies of atypical microorganisms that exceed GMP limits for a specific cleanroom area. In response to this issue, this case study aims to arrive at the root cause of the microbial contamination observed in the sterile production environment at the Profarma pharmaceutical industry in Albania by applying a practical problem-solving approach that ensures the appropriate sterility grade. The guidelines and literature emphasize the importance of several factors in preventing possible microbial contamination in grade A and grade C cleanrooms. These factors are integrated into a practical framework to identify the root cause of the presence of an Aspergillus niger colony in the sterile production environment at Profarma. In addition, the application of a semi-automatic disinfecting system such as H2O2 FOG in sterile grade A and grade C cleanrooms has been an effective solution for eliminating the atypical colony of Aspergillus niger. Selecting the appropriate detergents and disinfectants at the right concentration, frequency, and combination; maintaining updated and standardized guidelines for cleaning and disinfection; and continuously training operators on these practices in accordance with the updated GMP guidelines are some of the identified factors that influence the success of achieving the required sterility grade. However, to ensure environmental sustainability, it is important to be prepared to identify the source of contamination and make appropriate decisions. The proposed case-based practical approach may help pharmaceutical companies achieve sterile production and environmental cleanliness in challenging situations. Apart from integrating valid agents and standardized cleaning and disinfection protocols according to GMP Annex 1, pharmaceutical companies must carefully investigate the source and all the steps that can influence the outcome of an abnormal situation. Subsequently, apart from identifying the root cause, it is important to solve the problem with a successful alternative approach.

Keywords: cleanrooms, disinfectants, environmental monitoring, GMP Annex 1

Procedia PDF Downloads 216
25604 An Investigation of the Effects of Gripping Systems in Geosynthetic Shear Testing

Authors: Charles Sikwanda

Abstract:

The use of geosynthetic materials in geotechnical engineering projects has increased rapidly over the past several years. These materials have resulted in improved performance and reduced cost of geotechnical structures compared to conventional materials. However, working with geosynthetics requires knowledge of interface parameters for design. These parameters are typically determined with a large direct shear device in accordance with the ASTM D5321 and ASTM D6243 standards. Although these laboratory tests are standardized, the quality of the results can be largely affected by several factors, including the shearing rate, the applied normal stress, the gripping mechanism, and the type of geosynthetic specimen tested. Amongst these factors, poor surface gripping of the specimen is the major source of discrepancy. If the specimen is inadequately secured to the shearing blocks, it experiences progressive failure, and the measured shear strength deviates from the true field performance of the tested material. This leads to inaccurate, unsafe, and cost-ineffective designs. Currently, the ASTM D5321 and ASTM D6243 standards do not prescribe a standardized gripping system for geosynthetic shear strength testing. Over the years, researchers have come up with different gripping systems, such as glue, textured metal surfaces, sandblasting, and sandpaper. However, these gripping systems are often not adequate to secure the tested specimens to the shearing device, which has led to large variability in test results and difficulties in interpreting them. Therefore, this study aimed to determine the effects of gripping systems in geosynthetic interface shear strength testing using a 300 x 300 mm direct shear box. The results of the research will contribute to easier data interpretation and increased result accuracy and reproducibility.

Keywords: geosynthetics, shear strength parameters, gripping systems, gripping

Procedia PDF Downloads 203
25603 Developing Pavement Structural Deterioration Curves

Authors: Gregory Kelly, Gary Chai, Sittampalam Manoharan, Deborah Delaney

Abstract:

A Structural Number (SN) can be calculated for a road pavement from the properties and thicknesses of the surface, base course, sub-base, and subgrade. Historically, the cost of collecting structural data has been very high; data were initially collected using Benkelman Beams and are now collected with the Falling Weight Deflectometer (FWD). The structural strength of pavements weakens over time due to environmental and traffic loading factors, but, due to a lack of data, no structural deterioration curve for pavements has been implemented in a Pavement Management System (PMS). The International Roughness Index (IRI) is a measure of the road longitudinal profile and has been used as a proxy for a pavement's structural integrity. This paper offers two conceptual methods to develop Pavement Structural Deterioration Curves (PSDC). In the first, structural data are grouped into sets by design Equivalent Standard Axles (ESA), and an Initial SN (ISN), Intermediate SNs (SNI), and a Terminal SN (TSN) are used to develop the curves. Using FWD data, the ISN is the SN after the pavement is rehabilitated (the financial accounting 'modern equivalent'), intermediate SNIs are SNs other than the ISN and TSN, and the TSN is defined as the SN of the pavement when it was approved for pavement rehabilitation. The second method uses Traffic Speed Deflectometer (TSD) data. The road network, already divided into road blocks, is grouped by traffic loading. For each traffic loading group, road blocks that have had a recent pavement rehabilitation are used to calculate the ISN, and those planned for pavement rehabilitation are used to calculate the TSN. The remaining SNs are used to complete the age-based or, if available, the historical traffic-loading-based SNIs.
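
As a minimal sketch of how a deterioration curve might be fitted for one ESA traffic-loading group once ISN, SNI, and TSN observations are available, the snippet below fits a simple curve of SN against pavement age; the exponential form, the SN values, and the ages are assumptions for illustration, not data from the study.

```python
# Hedged sketch: fitting an SN-vs-age deterioration curve for one ESA group.
import numpy as np
from scipy.optimize import curve_fit

age = np.array([0, 3, 6, 10, 14, 18, 22])            # years since rehabilitation
sn = np.array([5.2, 5.0, 4.7, 4.3, 3.9, 3.5, 3.1])   # ISN at age 0 declining toward the TSN

def deterioration(t, isn, k):
    return isn * np.exp(-k * t)                       # assumed curve shape

(isn_fit, k_fit), _ = curve_fit(deterioration, age, sn, p0=(5.0, 0.02))
print(f"fitted ISN={isn_fit:.2f}, decay rate k={k_fit:.3f}/yr")
print("predicted SN at 25 years:", round(deterioration(25, isn_fit, k_fit), 2))
```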

Keywords: conceptual, pavement structural number, pavement structural deterioration curve, pavement management system

Procedia PDF Downloads 544
25602 The Basics of Cognitive Behavioral Family Therapy and the Treatment of Various Physical and Mental Diseases

Authors: Mahta Mohamadkashi

Abstract:

The family is the most important source of security and health for the people of a society and, at the same time, the main setting in which all kinds of social and psychological problems arise. On the one hand, the family is a natural group with many goals and roles that are important and necessary for all family members. On the other hand, the family is a strong and organized group that engages the therapist because of the goals concealed in its policies and procedures. The relationship of the environment and family background with mental illness has long been a focus of researchers, and the research and experiments conducted so far show that family functioning is related to the mental health of family members. Currently, several theoretical perspectives with different approaches seek to explain and resolve psychological problems and family conflicts. This research investigates cognitive-behavioral family therapy using a descriptive-analytical method and library-based information gathering, relying especially on Persian and Latin books and articles, in order to examine one of the important approaches to family therapy together with its requirements and limitations. To this end, a brief background and introduction to the family and family therapy are first presented, and then the basics of cognitive-behavioral family therapy, its implementation process, and its various techniques are discussed in detail. After that, the approach is applied to the treatment of various physical and mental diseases in the context of related research, and the strengths and weaknesses of the implementation procedures, limitations, and future directions in this field are examined. In general, this study emphasizes the role of the family system in the occurrence of psychological diseases and disorders and also validates the role of the family system in their treatment. Cognitive-behavioral family therapy has been shown to be an effective treatment approach for a variety of mental disorders.

Keywords: cognitive-behavioral, family, family therapy, cognitive-behavioral family therapy

Procedia PDF Downloads 101
25601 Nilsson Model Performance in Estimating Bed Load Sediment, Case Study: Tale Zang Station

Authors: Nader Parsazadeh

Abstract:

The variety of bed sediment load relationships, insufficient information and data, and the influence of river conditions make the selection of an optimum relationship for a given river extremely difficult. Hence, in order to select the best formulae, the bed load equations should be evaluated, the affecting factors scrutinized, and the equations verified; re-evaluation may also be needed. In this research, the sediment bed load of the Dez Dam at Tal-e Zang Station has been studied. After reviewing the available references, the most common formulae were selected, including Meyer-Peter and Müller, and MS Excel was used to compute and evaluate the data. Then, 52 series of data already measured at the station were re-examined, and the sediment bed load was determined. (1) The bed loads calculated by the different equations showed a great difference from the measured data. (2) The proportion of results with a discrepancy ratio r between 0.5 and 2.00 was 0% for all equations except the Nilsson and Shields equations, for which it was 61.5% and 59.6%, respectively. (3) After reviewing the results and discarding probably erroneous measurements (human or instrumental), the Nilsson equation, due to its r value higher than 1, may be used as an effective equation for estimating bed load at Tal-e Zang Station in order to support activities that depend on bed sediment load estimates. Since only a few studies have been conducted so far, these results may be of assistance to operators and consulting companies.
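
For context on what such empirical relationships look like, the sketch below evaluates the classic Meyer-Peter and Müller formula, one of the relationships mentioned above; the flow depth, energy slope, and grain size are illustrative values, not measurements from Tal-e Zang Station, and the Nilsson equation itself is not reproduced here.

```python
# Hedged sketch: Meyer-Peter and Mueller bed-load estimate for illustrative hydraulics.
import math

rho, rho_s, g = 1000.0, 2650.0, 9.81   # water density, sediment density (kg/m^3), gravity
h, S, d = 2.0, 0.001, 0.01             # flow depth (m), energy slope (-), grain size (m)

tau = rho * g * h * S                                   # bed shear stress (Pa)
theta = tau / ((rho_s - rho) * g * d)                   # Shields parameter
theta_c = 0.047                                         # critical Shields value used by MPM

phi = 8.0 * max(theta - theta_c, 0.0) ** 1.5            # dimensionless transport rate
qb = phi * math.sqrt((rho_s / rho - 1) * g * d ** 3)    # volumetric bed load per unit width (m^2/s)
print(f"Shields parameter = {theta:.3f}, bed load qb = {qb:.2e} m^2/s")
```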

Keywords: bed load, empirical relation ship, sediment, Tale Zang Station

Procedia PDF Downloads 362
25600 Hierarchical Filtering Method of Threat Alerts Based on Correlation Analysis

Authors: Xudong He, Jian Wang, Jiqiang Liu, Lei Han, Yang Yu, Shaohua Lv

Abstract:

Nowadays, internet threats are enormous and increasing; however, the classification of the huge numbers of alert messages generated in this environment is relatively crude. This affects the accuracy of network situation assessment and also makes it harder for security managers to deal with emergencies. In order to deal with potential network threats effectively and provide more useful data for network situation awareness, it is essential to build a hierarchical filtering method. This paper establishes a model for data monitoring that filters the original data systematically to obtain the grade of each threat and stores the results for reuse. Firstly, it filters by the vulnerable resources, open ports, and services of the host devices. Then, entropy theory is used to calculate the performance changes of the host devices at the time a threat occurs, and the data are filtered again. Finally, the performance changes at the time of the threat are sorted. Alerts and performance data collected in a real network environment are used for evaluation and analysis. The comparative experimental analysis shows that the threat filtering method can filter threat alerts effectively.
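
The sketch below shows one ingredient of the entropy step described above: Shannon entropy of a host performance metric computed before and during an alert period, with the change used to decide whether the alert is kept for grading. The metric, binning, and threshold are assumptions for illustration.

```python
# Hedged sketch: entropy change of a host performance metric around an alert.
import numpy as np

def entropy(samples, bins=10):
    hist, _ = np.histogram(samples, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
cpu_baseline = rng.normal(30, 3, 600)                                             # % CPU, normal operation
cpu_incident = np.concatenate([rng.normal(30, 3, 300), rng.normal(85, 10, 300)])  # % CPU during alerts

change = abs(entropy(cpu_incident) - entropy(cpu_baseline))
print(f"entropy change = {change:.2f} bits",
      "-> keep alert for grading" if change > 0.5 else "-> filter out")
```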

Keywords: correlation analysis, hierarchical filtering, multisource data, network security

Procedia PDF Downloads 201
25599 Improving the Optoacoustic Signal by Monitoring the Changes of Coupling Medium

Authors: P. Prasannakumar, L. Myoung Young, G. Seung Kye, P. Sang Hun, S. Chul Gyu

Abstract:

In this paper, we discuss the coupling medium in optoacoustic imaging. The coupling medium is placed between the scanned object and the ultrasound transducers. Water at varying temperature was used as the coupling medium, with the temperature gradually varied between 25 and 40 degrees Celsius. The heating process was carried out with care in order to avoid bubble formation. A rise in the photoacoustic signal, recorded through an unfocused transducer with a frequency of 2.25 MHz, was noted as the temperature increased. The temperature rise was monitored using an NTC thermistor, and the values in degrees were calculated using an embedded evaluation kit. The temperature was also transmitted to a PC through serial communication. All of these processes were synchronized using a trigger signal from the laser source.
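
The sketch below shows one common way an embedded evaluation kit can convert an NTC thermistor reading into degrees Celsius, using the Beta-parameter equation; the voltage-divider configuration, R0 = 10 kOhm, and Beta = 3950 K are assumed component values, not the actual hardware used in the study.

```python
# Hedged sketch: ADC reading -> NTC resistance -> temperature via the Beta equation.
import math

VCC, R_FIXED = 3.3, 10_000.0              # supply voltage and fixed divider resistor (ohms)
R0, T0, BETA = 10_000.0, 298.15, 3950.0   # thermistor nominal values (25 degC reference)

def adc_to_celsius(adc_value: int, adc_max: int = 4095) -> float:
    v = VCC * adc_value / adc_max                    # measured divider voltage
    r_ntc = R_FIXED * v / (VCC - v)                  # thermistor resistance (NTC on low side)
    inv_t = 1.0 / T0 + math.log(r_ntc / R0) / BETA   # Beta equation: 1/T = 1/T0 + ln(R/R0)/B
    return 1.0 / inv_t - 273.15

for adc in (1500, 1800, 2048):
    print(adc, "->", round(adc_to_celsius(adc), 1), "degC")
```

The computed values could then be sent to the PC over the serial link alongside the photoacoustic acquisition trigger.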

Keywords: embedded, optoacoustic, ultrasound, unfocused transducer

Procedia PDF Downloads 349