Search results for: Traveling Salesman Problem
131 Text Mining Technique for Data Mining Application
Authors: M. Govindarajan
Abstract:
Text mining applies knowledge discovery techniques to unstructured text, and is therefore also termed knowledge discovery in text (KDT) or text data mining. The decision tree approach is among the most useful for classification problems. With this technique, a tree is constructed to model the classification process; there are two basic steps: building the tree and applying it to the database. This paper describes a proposed C5.0 classifier that adds rulesets, cross-validation and boosting to the original C5.0 in order to reduce the error ratio. The feasibility and benefits of the proposed approach are demonstrated on a medical data set, the hypothyroid data. It is shown that the performance of a classifier on the training cases from which it was constructed gives a poor estimate of its accuracy; by sampling or by using a separate test file, the classifier is instead evaluated on cases that were not used to build it. If the cases in hypothyroid.data and hypothyroid.test were shuffled and divided into a new 2772-case training set and a 1000-case test set, C5.0 might construct a different classifier with a lower or higher error rate on the test cases. An important feature of See5 is its ability to generate classifiers called rulesets; the ruleset has an error rate of 0.5% on the test cases. The standard errors of the means provide an estimate of the variability of results. One way to get a more reliable estimate of predictive accuracy is f-fold cross-validation: the error rate of a classifier produced from all the cases is estimated as the ratio of the total number of errors on the hold-out cases to the total number of cases. The Boost option with x trials instructs See5 to construct up to x classifiers in this manner.
Trials over numerous datasets, large and small, show that on average 10-classifier boosting reduces the error rate for test cases by about 25%.
Keywords: C5.0, error ratio, text mining, training data, test data.
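The f-fold cross-validation estimate described above (total errors on the hold-out folds divided by the total number of cases) can be sketched in a few lines of Python. This is a generic illustration, not See5/C5.0 itself; `majority_trainer` is a hypothetical stand-in for the tree/ruleset induction step.

```python
import random

def kfold_error_rate(cases, labels, train_fn, f=10, seed=0):
    """Estimate a classifier's error rate by f-fold cross-validation:
    the ratio of total errors on the hold-out folds to the total cases."""
    idx = list(range(len(cases)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::f] for i in range(f)]
    errors = 0
    for hold_out in folds:
        ho = set(hold_out)
        train = [i for i in idx if i not in ho]
        classify = train_fn([cases[i] for i in train], [labels[i] for i in train])
        errors += sum(classify(cases[i]) != labels[i] for i in hold_out)
    return errors / len(cases)

def majority_trainer(X, y):
    # Toy "classifier" that always predicts the majority training label,
    # standing in for the actual decision-tree/ruleset induction.
    majority = max(set(y), key=y.count)
    return lambda x: majority
```

With a 10%-minority-class toy dataset, the cross-validated error of the majority classifier comes out at exactly the minority fraction, as expected.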
130 Eco-Agriculture for Effective Solid Waste Management in Minna, Nigeria
Authors: A. Abdulkadir, Y. M. Bello, A. A. Okhimamhe, H. Ibrahim, M. B. Matazu, L. S. Barau
Abstract:
The increasing volume of solid waste generated, collected and disposed of daily complicates adequate solid waste management by the relevant agency, the Niger State Environmental Protection Agency (NISEPA). In addition, the impacts of solid waste on the natural environment and human livelihoods require the identification of cost-effective ways for sustainable municipal waste management in Nigeria. These signal the need to identify environment-friendly initiatives and local solutions to the problem of municipal solid waste. A research field was secured at Pago, Minna, Niger State, which is located in the guinea savanna belt of Nigeria, within longitude 6°36′43″–45″ and latitude 9°29′37.61″–.62″ N. Poultry droppings, decomposed household waste manure and NPK treatments were used. The experimental field was divided into three replications with four (4) treatments in each replication, making a total of twelve (12) plots. The treatments were allotted using a Randomized Complete Block Design (RCBD), and the data collected were analyzed using SPSS software. The results depict variation in plant height and number of leaves at 50% flowering; poultry droppings record the highest height, while for number of leaves the waste manure competes fairly well with the NPK treatment. Similarly, the varying treatments significantly increase vegetable yield, as the control (no treatment) records the least yield for the three vegetable samples. Adoption of this organic manure for cultivation not only enhances environmental quality and the attainment of food security, but will also contribute to local economic development, poverty alleviation and social inclusion.
Keywords: Environmental issues, food security, NISEPA, solid waste.
129 Biological Methods to Control Parasitic Weed Phelipanche ramosa L. Pomel in the Field Tomato Crop
Authors: F. Lops, G. Disciglio, A. Carlucci, G. Gatta, L. Frabboni, A. Tarantino, E. Tarantino
Abstract:
Phelipanche ramosa L. Pomel is a root holoparasitic weed of many crops, particularly of tomato (Lycopersicum esculentum L.). In Italy, the Phelipanche problem is increasing, both in density and in acreage. Biological control of this parasitic weed involves the use of living organisms, including numerous fungi and bacteria, that can infect the parasitic weed while improving crop growth. This paper deals with biocontrol using microorganisms, including arbuscular mycorrhizal (AM) fungi and fungal pathogens such as Fusarium oxysporum spp. Colonization of crop roots by AM fungi can protect crops against parasitic weeds by reducing their seed germination and attachment, while F. oxysporum, isolated from diseased broomrape tubercles, proved to be highly virulent on P. ramosa. The experimental trial was carried out in the open field in Foggia province (Apulia Region, Southern Italy) during the spring-summer season of 2016, in order to evaluate the effect of four biological treatments: AM fungi and Fusarium oxysporum applied in the soil alone or combined, and the Rizosum Max® product, compared with an untreated control, on reducing P. ramosa infestation in the processing tomato crop. The principal result to be drawn from this study under field conditions, in contrast to those reported previously under laboratory and greenhouse conditions, is that neither AM fungi nor F. oxysporum reduced the number of emerged shoots of P. ramosa. This probably arises from the low field efficacy of the pathogenic agents against this parasite. On the contrary, the Rizosum Max® product, containing AM fungi and some rhizosphere bacteria combined with several minerals and organic substances, appears to be the most effective for reducing P. ramosa infestation.
Keywords: Arbuscular mycorrhizal fungi, biocontrol methods, Phelipanche ramosa, F. oxysporum spp.
128 A Novel GNSS Integrity Augmentation System for Civil and Military Aircraft
Authors: Roberto Sabatini, Terry Moore, Chris Hill
Abstract:
This paper presents a novel Global Navigation Satellite System (GNSS) Avionics Based Integrity Augmentation (ABIA) system architecture suitable for civil and military air platforms, including Unmanned Aircraft Systems (UAS). Building on previous research on high-accuracy Differential GNSS (DGNSS) systems design, integration and experimental flight test activities conducted at the Italian Air Force Flight Test Centre (CSV-RSV), our research focused on the development of a novel approach to the problem of GNSS ABIA for mission- and safety-critical air vehicle applications and for multi-sensor avionics architectures based on GNSS. Detailed mathematical models were developed to describe the main causes of GNSS signal outages and degradation in flight, namely: antenna obscuration, multipath, fading due to adverse geometry, and Doppler shift. Adopting these models in association with suitable integrity thresholds and guidance algorithms, the ABIA system is able to generate integrity cautions (predictive flags) and warnings (reactive flags), as well as providing steering information to the pilot and electronic commands to the aircraft/UAS flight control systems. These features allow real-time avoidance of safety-critical flight conditions and fast recovery of the required navigation performance in case of GNSS data losses. In other words, this novel ABIA system addresses all three cornerstones of GNSS integrity augmentation in mission- and safety-critical applications: prediction (caution flags), reaction (warning flags) and correction (alternate flight path computation).
Keywords: Global Navigation Satellite Systems (GNSS), Integrity Augmentation, Unmanned Aircraft Systems, Aircraft Based Augmentation, Avionics Based Integrity Augmentation, Safety-Critical Applications.
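The predictive-versus-reactive distinction between caution and warning flags can be illustrated with a minimal sketch. All function names, parameters and thresholds below are assumptions for illustration; the actual ABIA integrity thresholds and guidance algorithms described in the paper are far more elaborate.

```python
def integrity_flags(predicted_outage_in_s, current_error_m, alert_limit_m,
                    caution_horizon_s=30.0):
    """Illustrative ABIA-style flag logic (hypothetical names/thresholds):
    - caution (predictive flag): a degradation event (e.g. antenna
      obscuration on the planned trajectory) is forecast within the
      look-ahead horizon;
    - warning (reactive flag): the measured navigation error already
      exceeds the alert limit for the current flight phase."""
    caution = (predicted_outage_in_s is not None
               and predicted_outage_in_s <= caution_horizon_s)
    warning = current_error_m > alert_limit_m
    return caution, warning
```

A forecast outage 10 s ahead raises only the caution flag; an error above the alert limit raises the warning flag regardless of prediction.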
127 The Design of Multiple Detection Parallel Combined Spread Spectrum Communication System
Authors: Lixin Tian, Wei Xue
Abstract:
Much work in society takes place underground, such as mining, tunnel construction and subways, which are vital to the development of society. Once accidents occur in these places, the interruption of traditional wired communication is not conducive to rescue work. In order to realize positioning, early-warning and command functions for underground personnel and to improve rescue efficiency, it is necessary to develop and design an emergency communication system for the underground environment. Conventional underground communication is easily subjected to narrowband interference; spread spectrum communication can be used to address this problem. However, general spread spectrum methods such as direct-sequence spreading are inefficient, so parallel combined spread spectrum (PCSS) communication is proposed to improve efficiency. PCSS communication not only has the anti-interference ability and good concealment of a traditional spread spectrum system, but also a relatively high frequency-band utilization rate and a strong information transmission capability, so the technique has been widely used in practice. This paper presents a PCSS communication model: the multiple detection parallel combined spread spectrum (MDPCSS) communication system. The principle of the MDPCSS communication system is described: the sequence at the transmitting end is processed in blocks and cyclically shifted to facilitate multiple detection at the receiving end. Block diagrams of the transmitter and receiver of the MDPCSS communication system are introduced. The calculation formula for the system bit error rate (BER) is also given, and simulation and analysis of the system BER are completed. Comparison with common parallel PCSS communication shows that the proposed system does indeed reduce the BER and improve system performance. Furthermore, the influence of different selected pseudo-code lengths on the system BER is simulated and analyzed; the conclusion is that the larger the pseudo-code length, the smaller the system error rate.
Keywords: Cyclic shift, multiple detection, parallel combined spread spectrum, PN code.
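The despreading gain behind the pseudo-code-length result can be illustrated with a small Monte-Carlo sketch. This simulates plain direct-sequence spreading over an AWGN channel, not the MDPCSS scheme itself, and all parameters are illustrative: correlating against a longer PN code averages out more noise, so the BER falls as the code length grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def pn_code(n_chips):
    # Random ±1 chips standing in for a pseudo-noise (PN) sequence.
    return rng.choice([-1.0, 1.0], size=n_chips)

def ber_for_code_length(n_chips, n_bits=2000, noise_std=2.0):
    """Monte-Carlo BER estimate: spread each bit over n_chips chips,
    add Gaussian noise, then despread by correlating with the local code."""
    code = pn_code(n_chips)
    bits = rng.choice([-1.0, 1.0], size=n_bits)
    tx = np.outer(bits, code)                       # spreading
    rx = tx + rng.normal(0.0, noise_std, tx.shape)  # AWGN channel
    decisions = np.sign(rx @ code)                  # correlation despreading
    return float(np.mean(decisions != bits))
```

For the same channel noise, a 63-chip code yields a markedly lower BER than a 7-chip code, mirroring the trend reported for longer pseudo-codes.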
126 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and is a very active research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management and environmental services. Unfortunately, to the best of our knowledge, object detection under varying illumination with shadow consideration has not been well solved yet. Furthermore, this problem is one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination: across varying image capture conditions, the algorithms need to consider a variety of possible environmental factors, as colour information, lighting and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies and colour ranges. To overcome these limitations, we propose an object detection algorithm with pre-processing methods that reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show the promising results of the proposed approach in comparison with existing methods.
Keywords: Image processing, illumination equalization, shadow filtering, object detection, colour models, image segmentation.
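The morphological erosion and dilation used to clean segmented object regions can be sketched in plain NumPy. This is a simplified version with a square structuring element; a production pipeline would typically use an image-processing library.

```python
import numpy as np

def dilate(mask, k=3):
    """Binary dilation with a k x k square structuring element:
    a pixel becomes foreground if any neighbour in the window is."""
    pad = k // 2
    padded = np.pad(mask.astype(bool), pad)
    out = np.zeros(mask.shape, dtype=bool)
    for dy in range(k):
        for dx in range(k):
            out |= padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def erode(mask, k=3):
    """Binary erosion via duality: erosion of the mask is the
    complement of the dilation of the complement."""
    return ~dilate(~mask.astype(bool), k)

def open_mask(mask, k=3):
    """Morphological opening (erosion then dilation), as used to remove
    small speckle noise from a segmented object mask."""
    return dilate(erode(mask, k), k)
```

Opening removes an isolated noise pixel while restoring a solid 3x3 object block to its original extent.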
125 Risk in the South African Sectional Title Industry: An Assurance Perspective
Authors: Leandi Steenkamp
Abstract:
The sectional title industry has been a part of the property landscape in South Africa for almost half a century, and plays a significant role in addressing the housing problem in the country. Stakeholders such as owners and investors in sectional title property are in most cases not directly involved in the management thereof, and place reliance on the audited annual financial statements of bodies corporate for decision-making purposes. Although the industry seems to be highly regulated, the legislation regarding accounting and auditing of sectional title is vague and ambiguous. Furthermore, there are no industry-specific auditing and accounting standards to guide accounting and auditing practitioners in performing their work and industry financial benchmarks are not readily available. In addition, financial pressure on sectional title schemes is often very high due to the fact that some owners exercise unrealistic pressure to keep monthly levies as low as possible. All these factors have an impact on the business risk as well as audit risk of bodies corporate. Very little academic research has been undertaken on the sectional title industry in South Africa from an accounting and auditing perspective. The aim of this paper is threefold: Firstly, to discuss the findings of a literature review on uncertainties, ambiguity and confusing aspects in current legislation regarding the audit of a sectional title property that may cause or increase audit and business risk. Secondly, empirical findings of risk-related aspects from the results of interviews with three groups of body corporate role-players will be discussed. The role-players were body corporate trustee chairpersons, body corporate managing agents and accounting and auditing practitioners of bodies corporate. Specific reference will be made to business risk and audit risk. 
Thirdly, practical recommendations will be made on possibilities of closing the audit expectation gap, and further research opportunities in this regard will be discussed.
Keywords: Assurance, audit, audit risk, body corporate, corporate governance, sectional title.
124 The Influence of Travel Experience within Perceived Public Transport Quality
Authors: Armando Cartenì, Ilaria Henke
Abstract:
The perceived quality of public transport is an important driver of both customer satisfaction and mobility choices. Competition among transport operators requires improving the quality of services and identifying which attributes are perceived as relevant by passengers. Among the “traditional” public transport quality attributes are, for example, travel and waiting time, regularity of service, and ticket price. By contrast, there are some “non-conventional” attributes that could significantly influence customer satisfaction jointly with the “traditional” ones. Among these, the beauty/aesthetics of transport terminals (e.g. rail stations and bus terminals) is probably one of the most impactful on user perception. Starting from these considerations, the point stressed in this paper is whether (and how much) the experience of the overall trip (e.g. how long the trip is, how many transport modes must be used) influences the perception of public transport quality. The aim of this paper is to investigate the weight of terminal quality (e.g. aesthetics, comfort and services offered) within the overall travel experience. The case study is the extra-urban Italian bus network. Passengers at a major Italian bus terminal were interviewed, and the analysis of the results shows that about 75% of travelers are willing to pay up to 30% more on the ticket price for a high-quality terminal. A travel experience effect was observed: the average perceived transport quality varies with the characteristics of the overall trip. Passengers on a “long trip” (travel time greater than 2 hours) perceive the overall quality of the trip as “low” even if they pass through a high-quality terminal; the opposite occurs for “short trip” passengers. This means that if a traveler passes through a high-quality station, the overall perception of that terminal can be significantly reduced if he is tired from a long trip. This result is important and, if confirmed by other case studies, will allow the conclusion that the “travel experience impact” must be considered an explicit design variable for public transport services and planning.
Keywords: Transportation planning, sustainable mobility, decision support system, discrete choice model, design problem.
123 Representation of Memory of Forced Displacement in Central and Eastern Europe after World War II in Polish and German Cinemas
Authors: Ilona Copik
Abstract:
The aim of this study is to analyze the representation of memories of the forced displacement of Poles and Germans from the eastern territories in 1945 as depicted by Polish and German feature films between the years 1945-1960. The aftermath of World War II and the Allied agreements concluded at Yalta and Potsdam (1945) resulted in changes in national borders in Central and Eastern Europe and the large-scale transfer of civilians. The westward migration became a symbol of the new post-war division of Europe, new spheres of influence separated by the Iron Curtain. For years it was a controversial topic in both Poland and Germany due to the geopolitical alignment (the socialist East and capitalist West of Europe), as well as the unfinished debate between the victims and perpetrators of the war. The research premise is to take a comparative view of the conflicted cultures of Polish and German memory, to reflect on the possibility of an international dialogue about the past recorded in film images, and to discover the potential of film as a narrative warning against totalitarian inclinations. Until now, films made between 1945 and 1960 in Poland and the German occupation zones have been analyzed mainly in the context of artistic strategies subordinated to ideology and historical politics. In this study, the intention is to take a critical approach leading to the recognition of how films work as collective memory media, how they reveal the mechanisms of memory/forgetting, and what settlement topoi and migration myths they contain. The main hypothesis is that feature films about forced displacement, in addition to the politics of history - separate in each country - reveal comparable transnational individual experiences: the chaos of migration, the trauma of losing one's home, the conflicts accompanying the familiar/foreign, the difficulty of cultural adaptation, the problem of lost identity, etc.
Keywords: Forced displacement, Polish and German cinema, war victims, World War II.
122 Application of Various Methods for Evaluation of Heavy Metal Pollution in Soils around Agarak Copper-Molybdenum Mine Complex, Armenia
Authors: K. A. Ghazaryan, H. S. Movsesyan, N. P. Ghazaryan
Abstract:
The present study aimed to assess the heavy metal pollution of the soils around the Agarak copper-molybdenum mine complex and the related environmental risks. The mine complex is located in the south-east of Armenia, and the study was conducted in 2013. The soils of the five riskiest sites of this region were studied: the surroundings of the open mine, the sites adjacent to the processing plant of the Agarak copper-molybdenum mine complex, the surroundings of the Darazam active tailing dump, the recultivated tailing dump of “ravine-2”, and the recultivated tailing dump of “ravine-3”. Mountain cambisol was the main soil type in the study sites. The level of soil contamination by heavy metals was assessed by contamination factor (Cf), degree of contamination (Cd), geoaccumulation index (I-geo) and enrichment factor (EF). The distribution pattern of trace metals in the soil profile according to Cf, Cd, I-geo and EF values shows that the soil is heavily polluted. In almost all studied sites, Cu, Mo, Pb and Cd were the main polluting heavy metals, conditioned by the activity of the Agarak copper-molybdenum mine complex. It should be stated that the pollution problem is pressing, as parts of this highly polluted region are inhabited and agriculture is highly developed there; therefore, heavy metals can be transferred into human bodies through food chains and directly influence public health. Since the induced pollution can pose serious threats to public health, further investigations of soil and vegetation pollution are recommended. Finally, calculating Cf based on distance from the pollution source and the wind direction can provide more reasonable results.
Keywords: Agarak copper-molybdenum mine complex, heavy metals, soil contamination, enrichment factor, Armenia.
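The four indices can be computed directly from measured and geochemical background concentrations. The formulas below are the commonly used definitions of these indices; the specific background values and reference element used in the study are not reproduced here.

```python
import math

def contamination_factor(c_sample, c_background):
    """Cf = measured concentration / background concentration."""
    return c_sample / c_background

def degree_of_contamination(cf_values):
    """Cd = sum of the contamination factors of all analysed metals."""
    return sum(cf_values)

def geoaccumulation_index(c_sample, c_background):
    """I-geo = log2(Cn / (1.5 * Bn)); the factor 1.5 allows for
    natural fluctuations of the background."""
    return math.log2(c_sample / (1.5 * c_background))

def enrichment_factor(c_sample, ref_sample, c_background, ref_background):
    """EF = (Cn/Cref)_sample / (Cn/Cref)_background, normalising by a
    conservative reference element (commonly Fe or Al)."""
    return (c_sample / ref_sample) / (c_background / ref_background)
```

For example, a Cu concentration three times its background gives Cf = 3, and a sample at eight times 1.5x background gives I-geo = 3.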
121 Indoor Air Quality Analysis for Renovating Building: A Case Study of Student Studio, Department of Landscape, Chiangmai, Thailand
Authors: Warangkana Juangjandee
Abstract:
The rapidly increasing population within a limited area creates pressure to improve that area to suit the environment and the needs of people. The Faculty of Architecture, Chiang Mai University, is also expanding, in both the variety of its fields of study and the quality of its education. In 2020, a new department, the Department of Landscape Architecture, will be introduced in the faculty. Given the limited area of the existing building, the faculty plans to renovate some parts of its school in anticipation of the number of students who will join the program in the next two years. As a result, the old wooden workshop area was selected to be renovated as student studio space. Under such conditions, it is necessary to study the restrictions and the distinctive environment of the site prior to the improvement, in order to find ways to manage the existing space, because the primary function previously practiced on the site, an old wooden workshop, and the new function, studio space, are very different. 72.9% of annual hours in the room fall outside the thermal comfort condition, with high relative humidity. This causes discomfort for occupants and could promote mould growth. This study aims to analyze the thermal comfort condition in the Landscape Learning Studio Area in order to find solutions that improve indoor air quality and respond to local conditions. The research methodology is in two parts: 1) field data gathering on the case study, and 2) analysis and identification of solutions for improving indoor air quality. The results of the survey indicate that the room's discomfort problem needs to be solved. This can be approached in two ways, increasing ventilation and raising indoor temperature, e.g. improving the building design and stack-driven ventilation, and using fans to enhance internal ventilation.
Keywords: Relative humidity, renovation, temperature, thermal comfort.
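The 72.9% out-of-comfort figure is, in essence, the fraction of hourly readings falling outside a comfort envelope. A minimal sketch follows; the envelope bounds used here are illustrative assumptions, not the comfort standard applied in the study.

```python
def fraction_outside_comfort(readings, t_range=(22.0, 29.0), rh_max=70.0):
    """Fraction of (temperature degC, relative humidity %) readings that
    fall outside a comfort envelope. t_range and rh_max are hypothetical
    bounds chosen only to illustrate the calculation."""
    outside = sum(
        1 for t, rh in readings
        if not (t_range[0] <= t <= t_range[1] and rh <= rh_max)
    )
    return outside / len(readings)
```

Applied to a year of hourly (temperature, humidity) logs, this yields the annual out-of-comfort fraction reported in the abstract.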
120 Toward Indoor and Outdoor Surveillance Using an Improved Fast Background Subtraction Algorithm
Authors: A. El Harraj, N. Raissouni
Abstract:
The detection of moving objects in video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most used approach for moving object detection/tracking is background subtraction. Many background subtraction approaches have been suggested, but these are sensitive to illumination changes, and the solutions proposed to bypass this problem are time consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and mainly focus on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: establishing invariance to illumination changes, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum. Thus, it mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence. Then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K=5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation transformations to remove possible noise. For the experimental test, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
Keywords: Video surveillance, background subtraction, Contrast Limited Adaptive Histogram Equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes.
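The background/foreground modeling phase can be illustrated with a much-simplified temporal-median background model. The paper itself uses CLAHE preprocessing and a per-pixel mixture of K=5 Gaussians; the sketch below only conveys the basic subtract-and-threshold principle.

```python
import numpy as np

def median_background(frames):
    """Temporal median of a stack of grayscale frames: a simple background
    model that is robust to briefly appearing moving objects."""
    return np.median(np.stack(frames), axis=0)

def foreground_mask(frame, background, thresh=25.0):
    """Pixels far from the background model are flagged as moving objects;
    a real pipeline would follow this with erosion/dilation to remove noise."""
    return np.abs(frame.astype(float) - background) > thresh
```

With mostly static frames and one frame containing a bright moving blob, the median stays at the background level and only the blob is flagged.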
119 GridNtru: High Performance PKCS
Authors: Narasimham Challa, Jayaram Pradhan
Abstract:
Cryptographic algorithms play a crucial role in the information society by protecting sensitive data from unauthorized access. It is clear that information technology will become increasingly pervasive; hence we can expect the emergence of ubiquitous or pervasive computing and ambient intelligence. These new environments and applications will present new security challenges, and there is no doubt that cryptographic algorithms and protocols will form a part of the solution. The efficiency of a public key cryptosystem is mainly measured in computational overhead, key size and bandwidth. In particular, the RSA algorithm is used in many applications for providing security. Although the security of RSA is beyond doubt, the evolution in computing power has caused a growth in the necessary key length. The fact that most chips on smart cards cannot process keys exceeding 1024 bits shows that there is a need for an alternative. NTRU is such an alternative: a collection of mathematical algorithms based on manipulating lists of very small integers and polynomials. This allows NTRU to achieve high speeds with minimal computing power. NTRU (Nth degree Truncated polynomial Ring Unit) is the first secure public key cryptosystem not based on the factorization or discrete logarithm problems. This means that, even given substantial computational resources and time, an adversary should not be able to break the key. Multi-party communication and the requirement of optimal resource utilization have created present-day demand for applications that need security enforcement techniques and can be enhanced with high-end computing. This has prompted us to develop high-performance NTRU schemes using approaches such as high-end computing hardware. Peer-to-peer (P2P) and enterprise grids are proven approaches for developing high-end computing systems, and by utilizing them one can improve the performance of NTRU through parallel execution. In this paper we propose and develop an application for NTRU using the enterprise grid middleware Alchemi. An analysis and comparison of its performance for various text files is presented.
Keywords: Alchemi, GridNtru, Ntru, PKCS.
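The arithmetic at the heart of NTRU, multiplication in the truncated polynomial ring Z_q[x]/(x^N - 1), is a cyclic convolution of small-integer coefficient lists. A minimal sketch of that primitive follows; key generation, encryption and the parallel grid execution are omitted.

```python
def ring_multiply(a, b, n, q):
    """Multiply two polynomials (coefficient lists of length n) in
    Z_q[x]/(x^n - 1): ordinary polynomial multiplication where exponents
    wrap around modulo n and coefficients are reduced modulo q."""
    c = [0] * n
    for i in range(n):
        for j in range(n):
            k = (i + j) % n          # x^n wraps to x^0
            c[k] = (c[k] + a[i] * b[j]) % q
    return c
```

For example, (1 + x)(1 + x^2) = 1 + x + x^2 + x^3, and with x^3 = 1 in the ring this reduces to 2 + x + x^2.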
118 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have become a major security threat; hence it is important to impede such intrusions. The hindrance of such intrusions entirely relies on their detection, which is the primary concern of any security tool like an intrusion detection system (IDS). Therefore, it is imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance, which can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw dataset for classification: the classifier may get confused due to redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Local Binary Pattern (LBP) can be applied to transform raw features into a principal feature space and select features based on their sensitivity, with eigenvalues used to determine the sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is used to perform classification. For classification purposes, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) Cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms the existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP).
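The PCA step, projecting raw features onto directions ranked by eigenvalue "sensitivity", can be sketched with NumPy. This is a generic PCA via eigen-decomposition of the covariance matrix, not the exact feature pipeline of the paper.

```python
import numpy as np

def pca_reduce(X, k):
    """Project raw features onto the k principal directions with the
    largest eigenvalues of the covariance matrix, i.e. the directions
    of highest variance ('sensitivity')."""
    Xc = X - X.mean(axis=0)                 # center the data
    cov = np.cov(Xc, rowvar=False)          # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: symmetric matrix
    order = np.argsort(eigvals)[::-1][:k]   # largest eigenvalues first
    return Xc @ eigvecs[:, order], eigvals[order]
```

On data where only one feature varies, the single retained component recovers that feature's variance, and redundant (constant) features are discarded.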
117 Sustainability Analysis and Quality Assessment of Rainwater Harvested from Green Roofs: A Review
Authors: Mst. Nilufa Sultana, Shatirah Akib, Muhammad Aqeel Ashraf, Mohamed Roseli Zainal Abidin
Abstract:
Most people today are aware that global climate change is not just a scientific theory but also a fact with worldwide consequences. Global climate change is due to rapid urbanization, industrialization, high population growth and the current vulnerability of climatic conditions. Water is becoming scarce as a result of global climate change. To mitigate the problems arising from global climate change and its drought effects, harvesting rainwater from green roofs, an environmentally friendly and versatile technology, is becoming one of the best options and is gaining attention in Malaysia. This paper addresses the sustainability of green roofs and examines the quality of water harvested from green roofs in comparison to rainwater, including the factors that affect the quality of such water, for example roofing materials, climatic conditions, the frequency of rainfall and the first flush. A green roof was installed on the Humid Tropic Centre (HTC), the site of a monitoring program for the urban Stormwater Management Manual for Malaysia (MSMA) Eco-Hydrological Project in Kuala Lumpur, and the rainwater was harvested and evaluated on the basis of four parameters, i.e., conductivity, dissolved oxygen (DO), pH and temperature. These parameters were found to fall between Class I and Class III of the Interim National Water Quality Standards (INWQS) and the Water Quality Index (WQI). Some preliminary treatment, such as disinfection and filtration, would likely improve these parameters to Class I. This review clearly indicates that more research is needed to address other microbiological and chemical quality parameters to ensure that the harvested water is suitable for use as potable water for domestic purposes. The change in all physical, chemical and microbiological parameters with respect to storage time will be a major focus of future studies in this field.
Keywords: Green roofs, INWQS, MSMA-SME, Rainwater harvesting.
Mathematical Modeling of the AMCs Cross-Contamination Removal in the FOUPs: Finite Element Formulation and Application in FOUP’s Decontamination
Authors: N. Santatriniaina, J. Deseure, T.Q. Nguyen, H. Fontaine, C. Beitia, L. Rakotomanana
Abstract:
Nowadays, with increasing wafer sizes and decreasing critical dimensions of integrated circuits in modern high-tech manufacturing, the microelectronics industry must pay maximum attention to contamination control. The move to 300 mm wafers is accompanied by the use of Front Opening Unified Pods (FOUPs) for wafer transport and storage. In these pods, airborne molecular cross-contamination may occur between the wafers and the pod. A predictive approach using modeling and computational methods is a powerful way to understand and qualify AMC cross-contamination processes. This work investigates the numerical tools required to study AMC cross-contamination transfer phenomena between wafers and FOUPs. Numerical optimization and a finite element formulation in transient analysis were established. An analytical solution of the one-dimensional problem was developed, and the physical constants were calibrated by minimizing the least-squares distance between the model (the analytical 1D solution) and the experimental data. The transient behavior of the AMCs was determined. The model framework preserves the classical forms of the diffusion and convection-diffusion equations and yields a consistent form of Fick's law. The adsorption process and the surface roughness effect were also expressed as boundary conditions, using a Dirichlet-to-Neumann switch condition and an interface condition. The methodology is applied, first, using optimization methods with the analytical solution to define the physical constants and, second, using the finite element method including adsorption kinetics and the Dirichlet-to-Neumann switch condition.
Keywords: AMCs, FOUP, cross-contamination, adsorption, diffusion, numerical analysis, wafers, Dirichlet to Neumann, finite elements methods, Fick’s law, optimization.
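The calibration step described above can be illustrated with a minimal sketch: an explicit finite-difference solver for Fick's second law in 1D, and a least-squares search over candidate diffusivities. All names, grid sizes and boundary values here are illustrative assumptions, not the authors' actual model:

```python
import numpy as np

def diffuse_1d(D, n=50, L=1.0, dt=5e-5, steps=200, c_left=1.0):
    """Explicit finite-difference solution of Fick's second law
    dc/dt = D d2c/dx2 on [0, L], with Dirichlet boundary conditions
    c(0) = c_left and c(L) = 0."""
    dx = L / (n - 1)
    c = np.zeros(n)
    c[0] = c_left
    r = D * dt / dx**2          # explicit scheme is stable for r <= 0.5
    for _ in range(steps):
        c[1:-1] += r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0], c[-1] = c_left, 0.0
    return c

def calibrate_D(data, candidates, **kw):
    """Pick the diffusivity minimising the least-squares distance
    between the model profile and the measured profile -- the
    calibration step described in the abstract."""
    errs = [np.sum((diffuse_1d(D, **kw) - data) ** 2) for D in candidates]
    return candidates[int(np.argmin(errs))]
```

In practice the authors fit against an analytical 1D solution; the grid search here simply makes the least-squares idea concrete.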
Ergonomics and Its Applicability in the Design Process in Egypt: Challenges and Prospects
Authors: Mohamed Moheyeldin Mahmoud
Abstract:
Egypt suffers from a severe shortage of data and charts concerning the physical dimensions, measurements, qualities and consumer behavior of its population. This shortage of needed information and appropriate methods has forced Egyptian designers to rely on foreign standards when designing products for Egyptian consumers, which has led to many problems. The research problem is the urgently needed database of the physical specifications and measurements of Egyptian consumers, as well as the need to support the ergonomics courses given in many colleges and institutes with the latest technologies. The researcher uses a descriptive analytical method, relying on compiling, comparing and analyzing information and facts in order to reach acceptable perceptions, ideas and considerations. The research concludes that: 1. A good interaction relationship between users and products shows the success of that product. 2. Integration among the most relevant fields of science, especially ergonomics, interaction design and ethnography, should be encouraged to provide a continuously updated database on the nature, specifications and environment of the Egyptian consumer, in order to achieve higher benefit for both user and product. 3. The Chinese economic policy of studying market requirements long before any market activities should be emulated. 4. Ethnography supports design activities that create new products or update existing ones by measuring the compatibility of products with their environment and user expectations. The researcher also recommends establishing joint cooperation between military colleges and sports education institutes on one side and design institutes on the other, to provide an annually updated database of specifications of students of both sexes applying to those institutes (height, weight, etc.), giving industrial designers the information needed when creating a new product or updating an existing one for that category.
Keywords: Adapt ergonomics, ethnography, interaction design.
The Prevalence of Organized Retail Crime in Riyadh, Saudi Arabia
Authors: Saleh Dabil
Abstract:
This study investigates the extent of organized retail crime in supermarkets in Riyadh, Saudi Arabia. Store managers, security managers and general employees were asked about the types of retail crime that occur in their stores. Three independent variables were related to reports of organized retail theft: 1) the supermarket profile (volume, location, standard and type of store); 2) the social and physical environment of the store (maintenance, cleanliness and overall organizational cooperation); and 3) the security techniques and electronic loss-prevention techniques used. The theoretical framework of this study is based on social disorganization theory. The study concludes that organized retail theft is moderately apparent in Riyadh stores. The general results show that the store environment affects the prevalence of organized retail theft in relation to the gender of thieves, age groups, working shift, type of stolen items, and the number of thieves involved in a single case. Among other reasons, one factor in organized theft is the economic pressure on customers, which depends on the location of the store. How stores deal with theft was also investigated, to obtain a clear picture of their handling of organized retail theft. The results show that thieves are mostly sent away without any action and are sometimes given a written warning; very few cases are referred to the police. Other factors examined in the study can be found in the full text. To address the problem of organized theft, the study suggests, first, distributing duties and responsibilities well among employees, especially for security purposes; second, installing strong security systems and designing store layouts well; and third, training general employees and providing periodic security-skills training. Other suggestions can be found in the full text.
Keywords: Organized Crime, Retail, Theft, Loss prevention, Store environment.
Effective Planning of Public Transportation Systems: A Decision Support Application
Authors: Ferdi Sönmez, Nihal Yorulmaz
Abstract:
Sound decision making in planning public transportation systems to serve potential users is a must for metropolitan areas. To attract travelers to the projected modes of transport, adequately competitive overall travel times should be provided. In this way, other benefits such as lower traffic congestion, improved road safety, and lower noise and atmospheric pollution may be obtained. The congestion that comes with the increasing demand for public transportation is becoming part of daily life and making residents’ lives difficult, so regulations are needed to reduce it. To provide constructive and balanced regulation of public transportation systems, the right stations must be located in the right places. This study aims to design and implement a Decision Support System (DSS) application to determine optimal bus stop locations for public transport in Istanbul, one of the biggest and oldest cities in the world. The required information was gathered from IETT (Istanbul Electricity, Tram and Tunnel) Enterprises, which manages all public transportation services in the Istanbul Metropolitan Area. Cost assignments were made using the most realistic values available, with costs calculated from equations produced by a bi-level optimization model. The study uses 300 buses, 300 drivers, 10 lines and 110 stops. The user cost of each station and the operator cost of each line are calculated. Components such as cost, security and noise pollution are considered significant factors in the solution of the set covering problem, which is used to identify and locate the minimum number of possible bus stops. Preliminary research and model development for this study refer to a previously published article by the corresponding author. Model results are presented to support the decisions of specialists on locating stops effectively.
Keywords: User cost, bi-level optimization model, decision support, operator cost, transportation.
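The set covering step mentioned above admits a standard greedy approximation, sketched below; the stop identifiers and coverage sets are invented for illustration and do not come from the IETT data:

```python
def greedy_set_cover(demand_points, candidate_stops):
    """Greedy approximation for the set covering problem: repeatedly
    pick the candidate stop that covers the most still-uncovered
    demand points. candidate_stops maps stop id -> set of demand
    points within reach of that stop."""
    uncovered = set(demand_points)
    chosen = []
    while uncovered:
        stop = max(candidate_stops,
                   key=lambda s: len(candidate_stops[s] & uncovered))
        if not candidate_stops[stop] & uncovered:
            raise ValueError("some demand points cannot be covered")
        chosen.append(stop)
        uncovered -= candidate_stops[stop]
    return chosen
```

The greedy rule gives a logarithmic approximation guarantee; the paper's full model additionally weights stops by user and operator cost, which this sketch omits.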
Performance Evaluation and Plugging Characteristics of Controllable Self-Aggregating Colloidal Particle Profile Control Agent
Authors: Zhiguo Yang, Xiangan Yue, Minglu Shao, Yang Yue, Tianqi Yue
Abstract:
In low-permeability reservoirs, the pore throats are small and micro-heterogeneity is prominent. Conventional microsphere profile control agents generally have good injectability but a poor plugging effect, while agents with a good plugging effect generally have poor injectability, making it difficult for an agent to achieve deep profile control of the reservoir. To solve this problem, styrene and acrylamide were used as monomers in the laboratory, and emulsion polymerization was used to prepare a Controllable Self-Aggregating Colloidal Particle (CSA) rich in amide groups. A CSA microsphere dispersion with a particle diameter smaller than the pore throat diameter is injected into the reservoir to ensure that the profile control agent has good injectability. After the CSA microspheres disperse into the deep part of the reservoir and rest statically for a certain period, they self-aggregate into large particle clusters that plug the high-permeability channels. The CSA microspheres exhibit low expansion and avoid shear fracture during migration. Transmission electron microscopy shows that CSA microspheres still maintain a regular, uniform spherical core-shell heterogeneous structure after aging at 100 ºC for 35 days, demonstrating good thermal stability. Bottle tests showed that the aggregation time of CSA microspheres shortens as cation concentration increases, with divalent cations having a greater influence than monovalent ions. Physical simulation experiments show that CSA microspheres have good injectability and that the aggregated CSA particle clusters can produce effective plugging while migrating to the deep part of the reservoir for profile control.
Keywords: Heterogeneous reservoir, deep profile control, emulsion polymerization, colloidal particles, plugging characteristic.
Multistage Condition Monitoring System of Aircraft Gas Turbine Engine
Authors: A. M. Pashayev, D. D. Askerov, C. Ardil, R. A. Sadiqov, P. S. Abdullayev
Abstract:
Research shows that the application of purely probability-statistical methods is unfounded at the early stages of diagnosing the technical condition of an aviation Gas Turbine Engine (GTE), when the flight information is fuzzy, limited and uncertain. Hence, the efficiency of applying Soft Computing technology, using Fuzzy Logic and Neural Network methods, at these diagnosing stages is considered. For this purpose, fuzzy multiple linear and nonlinear models (fuzzy regression equations) obtained from statistical fuzzy data are trained with high accuracy. To build a more adequate model of GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analyzed. These analyses show that the distributions of GTE operating parameters have a fuzzy character, so consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes in the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engines' technical condition. Analysis of changes in correlation coefficient values also shows their fuzzy character; therefore, the results of Fuzzy Correlation Analysis are offered for model selection. When sufficient information is available, a recurrent algorithm for aviation GTE technical condition identification (using Hard Computing technology) is offered, based on measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical condition. As an application of the technique, the technical condition of a new operating aviation engine was estimated.
Keywords: Aviation gas turbine engine, technical condition, fuzzy logic, neural networks, fuzzy statistics.
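As a small illustration of the skewness and kurtosis coefficients whose dynamics the abstract analyzes, the crisp (non-fuzzy) moment-based definitions can be computed as follows; this is textbook material, not the authors' fuzzy variant:

```python
import math

def skewness(xs):
    """Third standardized moment: asymmetry of the distribution
    of an engine operating parameter."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

def kurtosis(xs):
    """Fourth standardized moment: tail weight / peakedness
    (3.0 for a normal distribution)."""
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 4 for x in xs) / (n * s2 ** 2)
```

Tracking how these two coefficients drift over successive flights is what motivates the fuzzy treatment in the paper.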
Production Process for Diesel Fuel Components Polyoxymethylene Dimethyl Ethers from Methanol and Formaldehyde Solution
Authors: Xiangjun Li, Huaiyuan Tian, Wujie Zhang, Dianhua Liu
Abstract:
Polyoxymethylene dimethyl ethers (PODEn) as a clean diesel additive can improve the combustion efficiency and quality of diesel fuel and alleviate the problem of atmospheric pollution. Among the synthetic routes, PODE production from methanol and formaldehyde is regarded as the most economical and promising. However, the methanol used for synthesizing PODE produces water, which causes the loss of the catalyst's active centers and the hydrolysis of PODEn in the production process. A macroporous, strongly acidic cation exchange resin catalyst was prepared, which has comparative advantages over other common solid acid catalysts in terms of stability and catalytic efficiency for synthesizing PODE. Catalytic reactions were carried out at 353 K, 1 MPa and 3 mL·gcat-1·h-1 in a fixed bed reactor. Methanol conversion and PODE3-6 selectivity reached 49.91% and 23.43%, respectively. Catalyst lifetime evaluation showed that the resin catalyst retained its catalytic activity for 20 days without significant changes, and the catalytic activity of a completely deactivated resin catalyst can essentially be restored to its previous level by simple acid regeneration. The acid exchange capacities of the original and deactivated catalysts were 2.5191 and 0.0979 mmol·g-1, respectively, while the regenerated catalyst reached 2.0430 mmol·g-1, indicating that the main reason for resin catalyst deactivation is that the Brønsted acid sites of the original resin catalyst were temporarily replaced by non-hydrogen-ion cations. A separation process consisting of extraction and distillation was designed to separate water and unreacted formaldehyde from the reaction mixture and to purify the PODE3-6 product. The concentration of PODE3-6 in the final product can reach up to 97%. These results indicate that scale-up production of PODE3-6 from methanol and formaldehyde solution is feasible.
Keywords: Inactivation, polyoxymethylene dimethyl ethers, separation process, sulfonic cation exchange resin.
Study on Optimization of Air Infiltration at Entrance of a Commercial Complex in Zhejiang Province
Authors: Yujie Zhao, Jiantao Weng
Abstract:
In the past decade, with the rapid development of China's economy, the purchasing power and material demands of residents have increased, resulting in the widespread emergence of public buildings such as large shopping malls. However, architects usually focus on the internal functions and circulation of these buildings, ignoring the impact of the environment on the subjective experience of building users. In Zhejiang province alone, the infiltration of cold air in winter frequently occurs at the entrances of sizeable commercial complex buildings in operation, affecting the environmental comfort of the building lobby and internal public spaces. At present, these adverse effects are usually reduced by adding active equipment, such as air curtains to block air exchange or heating air conditioners. From the perspective of energy consumption, the infiltration of cold air at the entrance increases the heat consumption of indoor heating equipment, indirectly causing considerable economic losses over the whole winter heating season. It is therefore of considerable significance to explore entrance forms suitable for improving the environmental comfort of commercial buildings and saving energy. In this paper, a commercial complex in Hangzhou with an apparent cold air infiltration problem is selected as the research object for modeling. The environmental parameters of the building entrance, including temperature, wind speed, and infiltration air volume, are obtained by Computational Fluid Dynamics (CFD) simulation, from which the heat consumption caused by natural air infiltration in winter and its potential economic loss are estimated as the objective metric. The study finally obtains the optimization direction for the entrance form of the commercial complex by comparing the simulation results with those of other local commercial complex projects with different entrance forms.
The conclusions will guide the entrance design of the same type of commercial complex in this area.
Keywords: Air infiltration, commercial complex, heat consumption, CFD simulation.
Wasting Human and Computer Resources
Authors: Mária Csernoch, Piroska Biró
Abstract:
The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This approach has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement. (1) The lack of a definition of correctly edited, formatted documents: consequently, end-users do not know whether their methods and results are correct or not; they are not aware of their ignorance, and their ignorance does not allow them to realize their lack of knowledge. (2) The end-users’ problem-solving methods: we have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical document, the text-based document. We have provided a definition of correctly edited text and, based on this definition, adapted the debugging method known from programming. According to the method, before any real text editing, a thorough debugging of already existing texts and a categorization of errors are carried out. In this way, before real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.
Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.
Development System for Emotion Detection Based on Brain Signals and Facial Images
Authors: Suprijanto, Linda Sari, Vebi Nadhira, IGN. Merthayasa, Farida I. M.
Abstract:
Detection of human emotions has many potential applications. One application is to quantify audience attentiveness in order to evaluate the acoustic quality of a concert hall, using the subjective audio preferences of the audience. To obtain a fair evaluation of acoustic quality, this research proposes a system for multimodal emotion detection: one modality based on brain signals measured using electroencephalography (EEG), and a second modality based on sequences of facial images. In the experiment, a customized audio signal consisting of normal and disordered sounds was played to stimulate positive or negative emotional feedback from volunteers. EEG signals from the temporal lobes, i.e., electrodes T3 and T4, were used to measure the brain response, and sequences of facial images were used to monitor facial expression while volunteers listened to the audio signal. From the EEG signal, features were extracted from changes in the brain waves, particularly the alpha and beta bands. Facial expression features were extracted from motion analysis of the images: an advanced optical flow method detects the most active facial muscles as the face moves from a neutral to another emotional expression, represented as vector flow maps. To reduce the complexity of detecting the emotional state, the vector flow maps are transformed into a compass mapping that represents the major directions and velocities of facial movement. The results showed that beta-wave power increases when the disordered sound stimulus is given, although each volunteer gave different emotional feedback. Based on features derived from the facial images, optical flow compass mapping is promising as additional information for deciding on emotional feedback.
Keywords: Multimodal Emotion Detection, EEG, Facial Image, Optical Flow, compass mapping, Brain Wave
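The transformation of vector flow maps into a compass mapping can be sketched as below, assuming the mapping simply accumulates flow magnitude per major direction; the binning scheme is an illustrative guess, not the authors' exact method:

```python
import math

def compass_map(flow_vectors, n_bins=8):
    """Bin optical-flow vectors (dx, dy) into compass directions,
    accumulating velocity magnitude per major direction. The result
    summarizes which way, and how strongly, the facial muscles moved."""
    bins = [0.0] * n_bins
    width = 2 * math.pi / n_bins
    for dx, dy in flow_vectors:
        mag = math.hypot(dx, dy)
        if mag == 0:
            continue                              # no motion at this pixel
        ang = math.atan2(dy, dx) % (2 * math.pi)  # angle in [0, 2*pi)
        bins[int(ang // width) % n_bins] += mag
    return bins
```

The dominant bin then gives the major direction of facial movement used as a feature.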
Automatic Distance Compensation for Robust Voice-Based Human-Computer Interaction
Authors: Randy Gomez, Keisuke Nakamura, Kazuhiro Nakadai
Abstract:
Distant-talking voice-based HCI systems suffer from performance degradation due to the mismatch between the acoustic speech at runtime and the acoustic model used in training. The mismatch is caused by the change in the power of the speech signal as observed at the microphones. This change is greatly influenced by the change in distance, which affects the speech dynamics inside the room before the signal reaches the microphones. Moreover, as the speech signal is reflected, its acoustic characteristics are also altered by the room properties. In general, power mismatch due to distance is a complex problem. This paper presents a novel approach to dealing with distance-induced mismatch by intelligently sensing instantaneous voice power variation and compensating the model parameters. First, the distant-talking speech signal is processed through microphone array processing, and the corresponding distance information is extracted. Distance-sensitive Gaussian Mixture Models (GMMs), pre-trained to capture both speech power and room properties, are used to predict the optimal distance of the speech source. Consequently, pre-computed statistical priors corresponding to the optimal distance are selected to correct the statistics of the generic model, which was frozen during training. The model parameters are thus post-conditioned to match the power of the instantaneous speech acoustics at runtime. This results in an improved likelihood of predicting the correct speech command at farther distances. We experiment using real data recorded inside two rooms. Experimental evaluation shows that voice recognition using our method is more robust to changes in distance than the conventional approach. Under the most acoustically challenging environment in our experiment (Room 2, at 2.5 meters), our method achieved a 24.2% improvement in recognition performance over the best-performing conventional method.
Keywords: Human Machine Interaction, Human Computer Interaction, Voice Recognition, Acoustic Model Compensation, Acoustic Speech Enhancement.
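The selection of the optimal distance from pre-trained, distance-sensitive models can be sketched as follows, with single Gaussians standing in for the paper's GMMs; the distances and model parameters below are made-up examples, not the authors' trained values:

```python
import math

def gauss_loglik(x, mean, var):
    """Log-likelihood of x under a 1D Gaussian N(mean, var)."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def select_distance(observed_power, distance_models):
    """Pick the training distance whose power model best explains the
    observed speech power. distance_models maps distance (m) to the
    (mean, variance) of speech power observed at that distance."""
    return max(distance_models,
               key=lambda d: gauss_loglik(observed_power, *distance_models[d]))
```

Once the best-matching distance is chosen, the corresponding pre-computed priors would be used to post-condition the acoustic model, as the abstract describes.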
The Influence of Organic Waste on Vegetable Nutritional Components and Healthy Livelihood, Minna, Niger State, Nigeria
Authors: A. Abdulkadir, A. A. Okhimamhe, Y. M. Bello, H. Ibrahim, D. H. Makun, M. T. Usman
Abstract:
Household waste forms a large proportion of the waste generated across the state; the accumulation of organic waste is an apparent problem, and the existing dump sites could become overstressed. Niger State has abundant arable land and water resources and thus should be one of the country's largest producers of agricultural crops. However, the major challenges to the agricultural sector today are the loss of soil nutrients coupled with the high cost of fertilizer. These have continued to increase the use of fertilizer and decomposed solid waste to enhance agricultural yield, which has varying effects on the soil as well as posing a threat to human livelihood. Consequently, vegetable yield samples from poultry droppings, decomposed household waste manure, NPK treatments and a control from each replication were subjected to proximate analysis to determine their nutritional and anti-nutritional components as well as heavy metal concentrations. The data collected were analyzed using SPSS software, and Randomized Complete Block Design means were compared. The results show that none of the treatments depletes any nutritional component, while the anti-nutritional analysis showed that NPK produced a higher oxalate content than the control and organic treatments. The concentrations of lead and cadmium are within safe permissible levels, while the mercury level exceeded the FAO/WHO maximum permissible limit for all treatments, indicating the need for urgent intervention to minimize mercury levels in soil and manure in order to mitigate its toxic effects. Eco-agriculture should therefore be widely accepted and promoted by stakeholders for soil amendment, higher yields, sustainable environmental protection, food security, poverty eradication, the attainment of sustainable development, and healthy livelihoods.
Keywords: Anti-nutritional, healthy livelihood, nutritional waste, organic waste.
Improving Subjective Bias Detection Using Bidirectional Encoder Representations from Transformers and Bidirectional Long Short-Term Memory
Authors: Ebipatei Victoria Tunyan, T. A. Cao, Cheol Young Ock
Abstract:
Detecting subjectively biased statements is a vital task. This kind of bias, when present in text or other information dissemination media such as news, social media, scientific texts, and encyclopedias, can weaken trust in the information and stir conflicts amongst consumers. Subjective bias detection is also critical for many Natural Language Processing (NLP) tasks, such as sentiment analysis, opinion identification, and bias neutralization. A system that can adequately detect subjectivity in text would boost research in the above-mentioned areas significantly; it can also be useful for platforms like Wikipedia, where the use of neutral language is important. The goal of this work is to identify subjectively biased language in text at the sentence level. With machine learning, complex AI problems can be solved, making it a good fit for the problem of subjective bias detection. A key step in this approach is to train a classifier based on BERT (Bidirectional Encoder Representations from Transformers) as the upstream model. BERT by itself can be used as a classifier; in this study, however, we use BERT as a data preprocessor and embedding generator for a Bi-LSTM (Bidirectional Long Short-Term Memory) network with an attention mechanism. This approach produces a deeper and better classifier. We evaluate the effectiveness of our model on the Wiki Neutrality Corpus (WNC), a benchmark dataset compiled from Wikipedia edits that removed various biased instances from sentences, on which we also compare our model with existing approaches. Experimental analysis indicates improved performance, as our model achieved state-of-the-art accuracy in detecting subjective bias. This study focuses on the English language, but the model can be fine-tuned to accommodate other languages.
Keywords: Subjective bias detection, machine learning, BERT–BiLSTM–Attention, text classification, natural language processing.
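The attention mechanism over the BiLSTM hidden states typically reduces to a weighted pooling of timestep vectors. A minimal NumPy sketch of that pooling, with an assumed single learned scoring vector rather than the authors' exact architecture, is:

```python
import numpy as np

def attention_pool(hidden, w):
    """Attention pooling over BiLSTM hidden states: score each
    timestep, softmax the scores, and return the weighted sum --
    the sentence vector fed to the bias classifier.
    Shapes: hidden is (T, d), w is (d,)."""
    scores = hidden @ w                       # one relevance score per timestep
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ hidden                   # (d,) sentence representation
```

In the full model, `hidden` would come from a BiLSTM over BERT embeddings and `w` would be learned end-to-end.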
Multidimensional Performance Tracking
Authors: C. Ardil
Abstract:
In this study, a model, together with a software tool that implements it, has been developed to determine the performance ratings of employees of an organization operating in the information technology sector, using indicators obtained from the employees' online study data. The Weighted Sum (WS) method and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), based on a multidimensional decision-making (MDDM) approach, were used in the study. WS and TOPSIS allow all dimensions to be evaluated together with specific weights, enabling an objective treatment of the online performance tracking problem. The application of these mathematical methods, which can combine alternatives with a large number of dimensions and reach a simultaneous solution, has been implemented through online performance tracking software. Objective dimension weights were calculated using the entropy information (EI) and standard deviation (SD) methods from the data obtained by the online performance tracking method; a decision matrix was formed from the performance scores of each employee, and a single performance score was calculated for each employee. Based on this score, each employee was given a performance evaluation decision. Pareto-set evidence and comparative mathematical analysis validate that the employees' performance preference rankings produced by WS and TOPSIS are closely related, which suggests the compatibility, applicability and validity of the proposed method for MDDM problems in which a large number of alternatives and dimension types are taken into account. With this study, an objective, realistic, feasible and understandable mathematical method, together with a software tool implementing it, has been demonstrated.
This is considered preferable given the subjectivity, limitations and high cost of the methods traditionally used for measurement and performance appraisal in the information technology sector.
Keywords: Weighted sum, entropy information, standard deviation, online performance tracking, performance evaluation, performance management, multidimensional decision making.
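A minimal sketch of the entropy-weight calculation and the TOPSIS score follows, assuming all dimensions are benefit criteria (an assumption not stated in the abstract) and using a tiny invented decision matrix:

```python
import numpy as np

def entropy_weights(X):
    """Objective dimension weights from entropy information (EI):
    dimensions with more dispersed values carry more weight."""
    P = X / X.sum(axis=0)                         # column-wise proportions
    E = -(P * np.log(P, where=P > 0,
                     out=np.zeros_like(P))).sum(axis=0) / np.log(len(X))
    d = 1 - E                                     # degree of divergence
    return d / d.sum()

def topsis(X, weights):
    """TOPSIS closeness scores for a benefit-criteria decision matrix
    X (rows = employees, columns = dimensions). Higher is better."""
    V = X / np.sqrt((X ** 2).sum(axis=0)) * weights   # weighted normalized matrix
    best, worst = V.max(axis=0), V.min(axis=0)        # ideal / anti-ideal points
    d_best = np.sqrt(((V - best) ** 2).sum(axis=1))
    d_worst = np.sqrt(((V - worst) ** 2).sum(axis=1))
    return d_worst / (d_best + d_worst)
```

An employee whose scores dominate every dimension gets a closeness score of 1; one dominated in every dimension gets 0.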
102 Biosynthesis and In vitro Studies of Silver Bionanoparticles Synthesized from Aspergillus Species and Its Antimicrobial Activity against Multi Drug Resistant Clinical Isolates
Authors: M. Saravanan
Abstract:
Antimicrobial resistance is becoming a major factor in virtually all hospital-acquired infections, some of which may soon be untreatable; this is a serious public health problem. These concerns have led to a major research effort to discover alternative strategies for the treatment of bacterial infections. Nanobiotechnology is an emerging and fast-developing field with potential applications for human welfare. An important area of nanotechnology is the development of reliable and environmentally friendly processes for the synthesis of nanoscale particles through biological systems. The present study reports on the use of a fungal strain of Aspergillus species for the extracellular synthesis of bionanoparticles from 1 mM silver nitrate (AgNO3) solution. The work focuses on the synthesis of metallic silver bionanoparticles through the reduction of aqueous Ag+ ions by the culture supernatants of the microorganism. The bio-reduction of the Ag+ ions in solution was monitored in the aqueous component, and the spectrum of the solution was measured with a UV-visible spectrophotometer. The bionanoscale particles were further characterized by Atomic Force Microscopy (AFM), Fourier Transform Infrared Spectroscopy (FTIR), and thin-layer chromatography. The synthesized bionanoscale particles showed a maximum absorption at 385 nm in the visible region. Atomic Force Microscopy investigation of the silver bionanoparticles showed that they ranged in size from 250 nm to 680 nm. The work also analyzed the antimicrobial efficacy of the silver bionanoparticles against various multi-drug-resistant clinical isolates. The present study emphasizes the applicability of synthesizing metallic nanostructures and of understanding the biochemical and molecular mechanisms of nanoparticle formation by the cell filtrate, in order to achieve better control over the size and polydispersity of the nanoparticles.
This would help to develop nanomedicine against various multi-drug-resistant human pathogens. Keywords: Bionanoparticles, UV-visible spectroscopy, Atomic Force Microscopy, extracellular synthesis, multi drug resistant, antimicrobial activity, nanomedicine.