Search results for: Neural Processing Element (NPE)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7894

2434 Detaching the ‘Criminal Justice Conveyor Belt’: Diversion as a Responsive Mechanism for Children in Kenya

Authors: Sarah Kinyanjui, Mahnaaz Mohamed

Abstract:

The child justice system in Kenya is organically departing from a managerial and retributive model towards one that espouses restorative justice. Notably, the Children Act 2001 and, most recently, the Children Act 2022 signalled an aspiration to facilitate meaningful interventions as opposed to ‘processing’ children through the justice system. In this vein, the Children Act 2022 formally recognises diversion and provides modalities for its implementation. This paper interrogates the diversion promise and reflects on the implementation of diversion as envisaged by the 2022 Act. Using restorative justice, labelling and differential association theories, as well as the value-of-care lens, the paper discusses diversion as a meaningful response to child offending. It further argues that while diversion presents a strong platform for the realisation of restorative and rehabilitative ideals, in the absence of a well-planned, coordinated and resourced framework, diversion may remain a mere alternative ‘conveyor belt’. Strategic multi-agency planning, capacity building and cooperation are highlighted as essential minimums for the realisation of the goals of diversion.

Keywords: diversion for child offenders, restorative justice, responsive criminal justice system, children act 2022 kenya

Procedia PDF Downloads 42
2433 Multi-Level Attentional Network for Aspect-Based Sentiment Analysis

Authors: Xinyuan Liu, Xiaojun Jing, Yuan He, Junsheng Mu

Abstract:

Aspect-based Sentiment Analysis (ABSA) has attracted much attention due to its capacity to determine the sentiment polarity of a given aspect in a sentence. Previous work on ABSA has shown the great significance of the interaction between aspect and sentence. In consequence, a Multi-Level Attentional Network (MLAN) is proposed. MLAN consists of four parts: an Embedding Layer, an Encoding Layer, Multi-Level Attentional (MLA) Layers and a Final Prediction Layer. Among these parts, the MLA Layers, comprising an Aspect Level Attentional (ALA) Layer and an Interactive Attentional (ILA) Layer, are the innovation of MLAN; their function is to focus on the important information and obtain attentional weighted representations of aspect and sentence at multiple levels. In the experiments, MLAN is compared with the classical TD-LSTM, MemNet, RAM, ATAE-LSTM, IAN, AOA, LCR-Rot and AEN-GloVe models on the SemEval 2014 dataset. The experimental results show that MLAN greatly outperforms these state-of-the-art models. In a case study, the ALA Layer and the ILA Layer are shown to be effective and interpretable.
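The aspect-level attention described above can be illustrated with a minimal sketch (a generic illustration of the mechanism, not the authors' exact architecture): each word vector in the sentence is scored against the aspect vector, the scores are normalized with a softmax, and the weighted sum yields an aspect-aware sentence representation.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def aspect_attention(sentence_vecs, aspect_vec):
    """Weight each word vector by its dot-product similarity to the
    aspect vector, then return the weights and the weighted sum."""
    scores = [sum(w * a for w, a in zip(vec, aspect_vec)) for vec in sentence_vecs]
    weights = softmax(scores)
    dim = len(aspect_vec)
    pooled = [sum(weights[i] * sentence_vecs[i][d] for i in range(len(sentence_vecs)))
              for d in range(dim)]
    return weights, pooled
```

Words similar to the aspect receive higher weights and so dominate the pooled representation, which is the intuition behind layers such as ALA.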

Keywords: deep learning, aspect-based sentiment analysis, attention, natural language processing

Procedia PDF Downloads 124
2432 A Study on Sentiment Analysis Using Various ML/NLP Models on Historical Data of Indian Leaders

Authors: Sarthak Deshpande, Akshay Patil, Pradip Pandhare, Nikhil Wankhede, Rushali Deshmukh

Abstract:

Sentiment analysis, a key area of NLP, is among the most significant tasks for any language and has recently made impressive strides. Several models and datasets are available for this task in popular and widely used languages such as English, Russian, and Spanish. While sentiment analysis research has been performed extensively for such languages, it lags behind for low-resource regional languages such as Hindi and Marathi. Marathi is one of the languages included in the 8th Schedule of the Indian Constitution; it is the third most widely spoken language in the country, spoken primarily in the Deccan region, which encompasses Maharashtra and Goa. Due to the lack of available resources and information, there is insufficient study of sentiment analysis methods for Marathi text. This project therefore proposes the use of different ML/NLP models for the analysis of Marathi data drawn from comments below YouTube content, tweets, and Instagram posts. We aim to achieve a short and precise analysis and summary of the related data using our dataset (dates, names, root words) and lexicons to locate exact information.
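A lexicon-based approach of the kind mentioned in the keywords can be sketched in a few lines (a toy illustration with placeholder lexicon entries, not the project's actual Marathi lexicon): each token is looked up in a polarity lexicon and the text is labeled by the sign of the total score.

```python
def lexicon_sentiment(tokens, lexicon):
    """Sum polarity scores for tokens found in the lexicon and
    label the text by the sign of the total. Tokens absent from
    the lexicon contribute zero."""
    score = sum(lexicon.get(t, 0) for t in tokens)
    if score > 0:
        return "positive", score
    if score < 0:
        return "negative", score
    return "neutral", score
```

For a real Marathi system, the lexicon would map Marathi root words (after stemming) to polarity values, which is where the dataset of root words mentioned above comes in.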

Keywords: multilingual sentiment analysis, Marathi, natural language processing, text summarization, lexicon-based approaches

Procedia PDF Downloads 53
2431 Sustainability in the Purchase of Airline Tickets: Analysis of Digital Communication from the Perspective of Neuroscience

Authors: Rodríguez Sánchez Carla, Sancho-Esper Franco, Guillen-Davo Marina

Abstract:

Tourism is one of the most important sectors worldwide, since it is a major economic engine for today's society. Due to this expansion, it is also one of the sectors that most negatively affects the environment in terms of CO₂ emissions. In light of this, airlines are developing Voluntary Carbon Offset (VCO) programs. A substantial body of evidence has analyzed the features of these VCO programs and their efficacy in reducing CO₂ emissions, with mixed findings and no clear consensus. Different research approaches have centered on analyzing the factors and consequences of VCO programs, such as economic modelling based on panel data, survey research based on traveler responses, or experimental research analyzing customer decisions in a simulated context. This study belongs to the latter group because it tries to understand how different characteristics of an online ticket purchase website affect a traveler's willingness to choose a sustainable option. The proposed behavioral model is based on several theories: nudge theory, the dual-processing Elaboration Likelihood Model (ELM) and cognitive dissonance theory. This randomized experiment aims to overcome the limitations of previous studies based on self-reported measures, which mainly study sustainable behavioral intention rather than actual decision-making. It also complements traditional self-reported independent variables by gathering objective information from an eye-tracking device. The experiment analyzes the influence of two characteristics of the online purchase website: i) the type of information regarding flight CO₂ emissions (quantitative vs. qualitative) and ii) the comparison framework of the sustainable purchase decision (negative: an alternative with more emissions than the route's average flight vs. positive: an alternative with fewer emissions than the route's average flight); it is therefore a 2x2 experiment with four alternative scenarios.
A pretest was run before the actual experiment to refine its features and check the manipulations. Afterward, a different sample of students answered the pre-test questionnaire, aimed at recruiting participants and measuring several pre-stimulus variables. One week later, the students came to the university neurolab to take part in the experiment, made their online purchase decision and answered the post-test survey. A final sample of 21 students was gathered. The institution's ethics committee approved the experiment. The results show that qualitative information generates more sustainable decisions (the less contaminant alternative) than quantitative information. Moreover, the evidence shows that subjects are more willing to make the sustainable decision in order to be more ecological (comparison of the average with the less contaminant alternative) than in order to be less contaminant (comparison of the average with the more contaminant alternative). There are also interesting differences in the information-processing variables from the eye tracker: both the total time to make the choice and the dwell times by area of interest (AOI) differ depending on the assigned scenario. These results allow a better understanding of the factors that condition a traveler's decision to join a VCO program and provide useful information for airline managers seeking to promote these programs to reduce environmental impact.

Keywords: voluntary carbon offset, airline, online purchase, carbon emission, sustainability, randomized experiment

Procedia PDF Downloads 54
2430 Expansive-Restrictive Style: Conceptualizing Knowledge Workers

Authors: Ram Manohar Singh, Meenakshi Gupta

Abstract:

Various terms, such as ‘learning style’, ‘cognitive style’, ‘conceptual style’, ‘thinking style’ and ‘intellectual style’, are used in the literature to refer to an individual’s characteristic and consistent approach to organizing and processing information. However, style concepts are criticized for mutually overlapping definitions and confusing classifications. This confusion should be addressed at the conceptual as well as the empirical level. This paper attempts to bridge this gap in the literature by proposing a new concept, the expansive-restrictive intellectual style, based on a phenomenological analysis of an auto-ethnography and interviews of 26 information technology (IT) professionals working in knowledge-intensive organizations (KIOs) in India. Expansive style is an individual’s preference to expand his/her horizon of knowledge and understanding by grasping the real meaning and structure of his/her work. On the contrary, restrictive style is characterized by an individual’s preference for a minimalist approach at work, reflected in executing a job efficiently without an attempt to understand its real meaning and structure. The analysis suggests that expansive-restrictive style has three dimensions: (1) field dependence-independence, (2) cognitive involvement and (3) epistemological beliefs.

Keywords: expansive, knowledge workers, restrictive, style

Procedia PDF Downloads 407
2429 An Overview on Aluminum Matrix Composites: Liquid State Processing

Authors: S. P. Jordan, G. Christian, S. P. Jeffs

Abstract:

Modern composite materials are increasingly chosen to replace heavier metallic material systems in many engineering fields, including the aerospace and automotive industries. The increasing push towards satisfying environmental targets is fuelling new material technologies and manufacturing processes. This paper introduces materials and manufacturing processes using metal matrix composites, along with manufacturing processes optimized at Alvant Ltd., based in Basingstoke in the UK, which offers modern, cost-effective, selectively reinforced composites for light-weighting applications within engineering. An overview and introduction are offered of modern, optimized manufacturing methods capable of producing viable replacements for heavier metallic materials and for polymer composites with lower temperature capability. A review of the capabilities and future applications of this material is presented to highlight the potential of further optimizing established manufacturing techniques, so as to fully realize the potential of light-weighting materials using cost-effective methods.

Keywords: aluminium matrix composites, light-weighting, hybrid squeeze casting, strategically placed reinforcements

Procedia PDF Downloads 84
2428 Facility Layout Improvement: Based on Safety and Health at Work and Standards of Food Production Facility

Authors: Asifa Fitriani, Galih Prakoso

Abstract:

This study aims to improve the facility layout of a micro, small and medium enterprise (MSME) so as to minimize material handling, and to redesign the layout of production facilities based on occupational safety and health requirements and food production facility standards. The layout problems at one of the mushroom chip producers in Indonesia are cross movement between work stations, work accidents, and facilities that do not conform to food industry standards. The layout design was improved using the CORELAP and 5S methods to provide recommendations for implementing occupational health and safety standards and food production facility standards. From the analysis, the layout improved with CORELAP provides a smaller displacement distance of 155.84 meters, down from the initial 335.9 meters, and a shorter processing time of 102.831 seconds, down from the original 112.726 seconds. The 5S method also provided recommendations for resolving occupational health and safety issues and meeting food production facility standards by improving the working environment.
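The displacement-distance comparison above comes down to summing flow-weighted distances between stations; a minimal evaluation sketch (hypothetical station names and flows, not the study's data — CORELAP itself constructs the layout from closeness ratings, and this is only the scoring step used to compare candidate layouts):

```python
def total_handling_distance(flows, coords):
    """Total material-handling distance for a layout: for each pair of
    stations with material flow, multiply the number of trips by the
    rectilinear (aisle-style) distance between the stations."""
    total = 0.0
    for (a, b), trips in flows.items():
        (xa, ya), (xb, yb) = coords[a], coords[b]
        total += trips * (abs(xa - xb) + abs(ya - yb))
    return total
```

Evaluating the same flow matrix against the coordinates of the original and the improved layout gives the kind of before/after comparison reported above (335.9 m vs. 155.84 m).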

Keywords: layout design, CORELAP, 5S

Procedia PDF Downloads 521
2427 Typical Emulsions as Probiotic Food Carrier: Effect of Cells Position on Its Viability

Authors: Mengfan Li, Filip Van Bockstaele, Wenyong Lou, Frank Devlighere

Abstract:

The development of probiotics-encapsulated emulsions that maintain the viability of probiotics during processing, storage and passage through the human gastrointestinal (GI) tract receives great scientific and commercial interest. In this study, typical W/O and O/W emulsions, with and without oil gelation, were used to encapsulate L. plantarum. The effects of emulsion type on the viability of L. plantarum during storage and in the GI tract were investigated. In addition, the position of L. plantarum in the emulsion system was correlated with its number of viable cells under adverse conditions, in order to determine which type of emulsion is more suitable as a food carrier for probiotic encapsulation and protection. The results show that probiotics tend to migrate from the oil to the water phase due to their natural hydrophilicity; however, being surrounded by water for a long time is harmful to cell viability. Gelation of the oil phase is one of the promising strategies for inhibiting cell mobility and decreasing contact with adverse factors (e.g., water, exogenous enzymes and gastric acid), thus keeping enough cells viable to exert their beneficial effects in the host.

Keywords: emulsion, gelation, encapsulation, probiotics

Procedia PDF Downloads 84
2426 Electron Beam Processing of Ethylene-Propylene-Terpolymer-Based Rubber Mixtures

Authors: M. D. Stelescu, E. Manaila, G. Craciun, D. Ighigeanu

Abstract:

The goal of this paper is to present results regarding the influence of irradiation dose and of the amount of the multifunctional monomer trimethylolpropane trimethacrylate (TMPT) on ethylene-propylene-diene terpolymer (EPDM) rubber mixtures irradiated by electron beam. Blends, molded on an electrically heated laboratory roller mill and compressed in an electrically heated hydraulic press, were irradiated using the ALID 7 linear accelerator of 5.5 MeV in the dose range of 22.6 kGy to 56.5 kGy, in atmospheric conditions and at a room temperature of 25 °C. The shares of the cross-linking and degradation reactions were evaluated by means of sol-gel analysis, cross-linking density measurements, FTIR studies and calculation of the Charlesby-Pinner parameter (p0/q0). The blends containing different concentrations of TMPT (3 phr and 9 phr) and irradiated with doses in the mentioned range show increased gel content and cross-linking density. Modified and new bands appeared in the FTIR spectra as a result of both cross-linking and chain scission reactions.
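The Charlesby-Pinner parameter p0/q0 mentioned above is conventionally extracted from sol-gel data via the relation s + √s = p0/q0 + 1/(q0·u1·D), where s is the sol fraction and D the dose; a sketch of the linear fit (assuming this standard form of the equation, with illustrative data, not the paper's measurements):

```python
import math

def charlesby_pinner(doses_kGy, sol_fractions):
    """Fit s + sqrt(s) = p0/q0 + 1/(q0*u1*D) by least squares:
    regress y = s + sqrt(s) on x = 1/D; the intercept estimates the
    scission/crosslinking ratio p0/q0, the slope estimates 1/(q0*u1)."""
    xs = [1.0 / d for d in doses_kGy]
    ys = [s + math.sqrt(s) for s in sol_fractions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope
```

An intercept below 1 indicates that crosslinking dominates scission, which is the regime the gel-content results above point to.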

Keywords: electron beam irradiation, EPDM rubber, crosslinking density, gel fraction

Procedia PDF Downloads 145
2425 Radar Track-based Classification of Birds and UAVs

Authors: Altilio Rosa, Chirico Francesco, Foglia Goffredo

Abstract:

In recent years, the number of Unmanned Aerial Vehicles (UAVs) has increased significantly. The rapid development of commercial and recreational drones makes them an important part of our society. Despite the growing list of their applications, these vehicles pose a huge threat to civil and military installations: detection, classification and neutralization of such flying objects have become an urgent need. Radar is an effective remote sensing tool for detecting and tracking flying objects, but scenarios characterized by a high number of tracks related to flying birds make the drone detection task especially challenging: the operator's PPI is cluttered with a huge number of potential threats, and reaction time can be severely affected. Flying birds show velocity, radar cross-section and, in general, characteristics similar to those of UAVs. Starting from the absence of any single feature able to distinguish UAVs from birds, this paper uses a multiple-feature approach in which an original feature selection technique is developed to feed binary classifiers trained to distinguish birds from UAVs. Radar tracks acquired in the field for different UAVs and birds performing various trajectories were used to extract specifically designed target-movement features based on velocity, trajectory and signal strength. An optimization strategy based on a genetic algorithm is also introduced to select the optimal subset of features and to estimate the performance of several classification algorithms (neural network, SVM, logistic regression, etc.), both in terms of the number of selected features and the misclassification error. Results show that the proposed methods are able to reduce the dimension of the data space and to remove almost all non-drone false targets with a suitable classification accuracy (higher than 95%).
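Genetic-algorithm feature selection of the kind described above can be sketched compactly (a generic illustration, not the authors' implementation; in practice the fitness callable would wrap classifier accuracy on the track features, possibly penalized by subset size):

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=30,
                      mutation_rate=0.05, seed=0):
    """Evolve binary masks over the feature set. fitness(mask) must
    return a score to maximize, e.g. validation accuracy minus a
    penalty on the number of selected features."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                                # elitism: keep the best two
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(scored[:pop_size // 2], 2)   # truncation selection
            cut = rng.randrange(1, n_features)
            child = p1[:cut] + p2[cut:]                      # one-point crossover
            child = [b ^ (rng.random() < mutation_rate) for b in child]  # bit-flip mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)
```

The returned mask indicates which movement-related features (velocity, trajectory, signal-strength statistics) the classifier should keep.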

Keywords: birds, classification, machine learning, UAVs

Procedia PDF Downloads 202
2424 Resonant Auxetic Metamaterial for Automotive Applications in Vibration Isolation

Authors: Adrien Pyskir, Manuel Collet, Zoran Dimitrijevic, Claude-Henri Lamarque

Abstract:

During the last decades, great efforts have been made to reduce acoustic and vibrational disturbances in transportation, as this has become a key comfort feature. Today, isolation and design have neutralized most of the troublesome vibrations, so that cars are quieter and more comfortable than ever. However, some problems remain unsolved, in particular concerning low-frequency isolation and the frequency-dependent stiffening of materials like rubber. In short, a balance has to be found between a high static stiffness, to sustain the vibration source's mass, and a low dynamic stiffness over as wide a band as possible. Systems meeting these criteria are yet to be designed. We thus investigated solutions inspired by metamaterials to efficiently control low-frequency wave propagation. Structures exhibiting a negative Poisson ratio, also called auxetic structures, are known to influence the propagation of waves through beaming or damping. Their stiffness can be quite peculiar as well, as they can present regions of zero stiffness on the compressive stress-strain curve. In addition, auxetic materials can be easily adapted in many ways, giving them great tuning potential. Using the finite element software COMSOL Multiphysics, a resonant design has been tested through static and dynamic simulations, and these results are compared to experimental results. In particular, the bandgaps featured by these structures are analyzed as a function of the design parameters. Great stiffness properties can be observed, including low-frequency dynamic stiffness loss and broadband transmission loss. Such features are very promising for practical isolation purposes, and we hope to develop this kind of metamaterial into an effective industrial damper.

Keywords: auxetics, metamaterials, structural dynamics, vibration isolation

Procedia PDF Downloads 134
2423 Determining the Information Technologies Usage and Learning Preferences of Construction

Authors: Naci Büyükkaracığan, Yıldırım Akyol

Abstract:

Information technology is the technology that enables the transmission of information elsewhere, regardless of time, location or distance. Today, information technology is bringing about groundbreaking changes in all areas of our daily lives: with its help, information can quickly reach millions of people. In this study, the effects of information technology on students' education and their learning preferences were examined using data obtained from questionnaires administered to 2015-2016 academic year students of the Construction Department at Selcuk University Kadınhanı Faik İçil Vocational School. The data were obtained through a questionnaire of 30 questions prepared by the researchers. The SPSS 21.00 package programme was used for the statistical analysis of the data: chi-square tests, the Mann-Whitney U test, and Kruskal-Wallis and Kolmogorov-Smirnov tests were used alongside descriptive statistics. In the study, conducted with the participation of 61 students, 93.4% of the students were found to own an information and communication device of their own (computer, smartphone, etc.) and to have Internet access at the same rate, while 45.90% of the students owned a computer of their own. The students' main reported reasons for using the Internet were social networking sites (85.24%) and following news sites (13.11%). All of the students stated that they use information technology in the preparation of assignments and projects. When the students' preferred sources for acquiring scientific knowledge regarding their profession were evaluated, the Internet was again seen to be their first preference. Male students' daily use of information technology was statistically significantly lower than that of female students.
In addition, 72.13% of the students were found to be eager to learn construction package programs, and 91.80% agreed that information technology is an indispensable element of professional advancement.

Keywords: information technologies, computer, construction, internet, learning systems

Procedia PDF Downloads 285
2422 The Performance of Natural Light by Roof Systems in Cultural Buildings

Authors: Ana Paula Esteves, Diego S. Caetano, Louise L. B. Lomardo

Abstract:

This paper presents an approach to the performance of natural lighting when appropriate solar lighting systems on the roof are applied in cultural buildings such as museums and foundations. Roofs, as a surface of contact between the building and the external environment, require special attention in projects that aim at energy efficiency: they are an important element for capturing natural light in greater quantity, and also the most important location for generating photovoltaic solar energy, even with semitransparent panels that allow the partial passage of light. Transparent elements in roofs, besides providing the building's upper protection, can also play other roles, such as meeting the need for natural light to accomplish internal tasks, attending to visual comfort, and benefiting human perception and the interior experience of a building. When these resources are well dimensioned, they also contribute to energy efficiency and hence to the sustainable character of the building. Therefore, when properly designed and executed, a roof lighting system can bring higher-quality natural light to the interior of the building, which relates to the dimension of human health and well-being. Furthermore, it can meet technological, economic and environmental aspirations, making possible a more efficient use of that primordial resource, the light of the Sun. The article presents the analysis of buildings that used zenithal light systems in search of better lighting performance in museums and foundations: the Solomon R. Guggenheim Museum in the United States, the Iberê Camargo Foundation in Brazil, the Museum of Fine Arts of Castellón in Spain, and the Pinacoteca of São Paulo.

Keywords: natural lighting, roof lighting systems, natural lighting in museums, comfort lighting

Procedia PDF Downloads 194
2421 Investigation of the Properties of Biochar Obtained by Dry and Wet Torrefaction in a Fixed and in a Fluidized Bed

Authors: Natalia Muratova, Dmitry Klimov, Rafail Isemin, Sergey Kuzmin, Aleksandr Mikhalev, Oleg Milovanov

Abstract:

We investigated the processing of poultry litter into biochar using dry torrefaction (DT) in a fixed and in a fluidized bed of quartz sand blown with nitrogen, as well as wet torrefaction (WT) in a fluidized bed in a water steam medium, at a temperature of 300 °C. The torrefaction technology affects the duration of the heat treatment process and the characteristics of the biochar: the release of CO₂, CO, H₂ and CH₄ from a portion of fresh poultry litter during torrefaction is completed after 2400 seconds in a fixed bed, but after 480 seconds in a fluidized bed. During WT in a fluidized bed of quartz sand, this process ends 840 seconds after loading a portion of fresh litter, but in a fluidized bed of litter particles previously subjected to torrefaction, the process ends in 350 to 450 seconds. In terms of the (H/C) and (O/C) ratios, the litter obtained after DT and WT treatment corresponds to lignite. WT in a fluidized bed yields a biochar whose specific pore area is twice as large as that of the biochar obtained after DT in a fluidized bed. Biochar obtained by treating poultry litter in a fluidized bed using the DT or WT method is recommended for use not only as a biofuel but also as an adsorbent or a soil fertilizer.
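The comparison with lignite via (H/C) and (O/C) is made on an atomic basis, as on a van Krevelen diagram; a small sketch of the conversion from elemental weight fractions to atomic ratios (illustrative assay values, not the study's measurements):

```python
# Standard atomic masses (g/mol) used for the van Krevelen ratios.
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "O": 15.999}

def van_krevelen(wt_c, wt_h, wt_o):
    """Convert elemental weight percentages (or fractions) to the
    atomic H/C and O/C ratios used to classify a char against coals
    such as lignite."""
    mol_c = wt_c / ATOMIC_MASS["C"]
    mol_h = wt_h / ATOMIC_MASS["H"]
    mol_o = wt_o / ATOMIC_MASS["O"]
    return mol_h / mol_c, mol_o / mol_c
```

Plotting each torrefied sample at its (O/C, H/C) point shows how far carbonization has progressed relative to the raw litter.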

Keywords: biochar, poultry litter, dry and wet torrefaction, fixed bed, fluidized bed

Procedia PDF Downloads 139
2420 Enhancing the Bionic Eye: A Real-time Image Optimization Framework to Encode Color and Spatial Information Into Retinal Prostheses

Authors: William Huang

Abstract:

Retinal prostheses are currently limited to low-resolution grayscale images that lack color and spatial information. This study develops a novel real-time image optimization framework and tools to encode maximum information for prostheses that are constrained by their number of electrodes. One key idea is to localize the main objects in images while reducing unnecessary background noise through region-contrast saliency maps. A novel color depth mapping technique was developed through MiniBatchKMeans clustering and color space selection. The resulting image was downsampled using bicubic interpolation to reduce image size while preserving color quality. In comparison to current schemes, the proposed framework demonstrated better visual quality on the tested images, and the use of the region-contrast saliency map improved efficacy by up to 30%. Finally, the computational time of this algorithm is less than 380 ms on the tested cases, making real-time retinal prostheses feasible.
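The color-depth mapping step rests on clustering pixel colors down to a small palette. The study uses scikit-learn's MiniBatchKMeans; the following plain k-means sketch is a simplified stand-in for the same idea, not the author's code:

```python
import random

def kmeans_palette(pixels, k, iters=10, seed=0):
    """Reduce an image's color depth: cluster RGB pixel tuples with
    k-means and map each pixel to its nearest palette color."""
    rng = random.Random(seed)
    centers = rng.sample(pixels, k)
    for _ in range(iters):
        # Assign each pixel to its nearest center.
        buckets = [[] for _ in range(k)]
        for p in pixels:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            buckets[j].append(p)
        # Recompute each center as the mean of its bucket (keep old
        # center if the bucket is empty).
        centers = [tuple(sum(c) / len(b) for c in zip(*b)) if b else centers[j]
                   for j, b in enumerate(buckets)]
    quantized = [min(centers, key=lambda c: sum((a - b) ** 2 for a, b in zip(p, c)))
                 for p in pixels]
    return centers, quantized
```

With k set to the number of distinguishable stimulation levels, each electrode can then be driven by one palette color.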

Keywords: retinal implants, virtual processing unit, computer vision, saliency maps, color quantization

Procedia PDF Downloads 132
2419 Use of Artificial Intelligence and Two Object-Oriented Approaches (k-NN and SVM) for the Detection and Characterization of Wetlands in the Centre-Val de Loire Region, France

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

Nowadays, wetlands are the subject of contradictory debates opposing scientific, political and administrative perspectives. Indeed, given their multiple services (drinking water, irrigation, hydrological regulation, mineral, plant and animal resources...), wetlands concentrate many socio-economic and biodiversity issues. In some regions, they can cover vast areas (>100 thousand ha) of the landscape, such as the Camargue area in the south of France, inside the Rhone delta. The high biological productivity of wetlands, the strong natural selection pressures and the diversity of aquatic environments have produced many species of plants and animals that are found nowhere else. These environments are tremendous carbon sinks and biodiversity reserves; depending on their age, composition and surrounding environmental conditions, wetlands play an important role in global climate projections. Covering more than 3% of the earth's surface, wetlands have experienced, since the beginning of the 1990s, a tremendous revival of interest, which has resulted in a multiplication of inventories, scientific studies and management experiments. The geographical and physical characteristics of the wetlands of the central region conceal a large number of natural habitats that harbour great biological diversity. These wetlands are still influenced by human activities, especially agriculture, which affects their layout and functioning. In this perspective, decision-makers need to delimit spatial objects (natural habitats) in a definite way in order to be able to take action. Wetlands are no exception to this rule, even if delimiting a type of environment whose main characteristic is often to occupy the transition between aquatic and terrestrial environments seems a difficult exercise.
However, it is possible to map wetlands with databases derived from the interpretation of photos and satellite images, such as the European Corine Land Cover database, which allows the characteristic wetland types of each place to be quantified and characterized. Scientific studies have shown the limitations of high-spatial-resolution images (SPOT, Landsat, ASTER) for the identification and characterization of small wetlands (around 1 hectare), which generally represent spatially complex features. The use of very high spatial resolution images (pixel size below 3 m) is therefore necessary to map both small and large areas. Moreover, with the recent evolution of artificial intelligence (AI), deep learning methods for satellite image processing have shown much better performance than traditional processing based only on pixel structures. Our research work is based on spectral and textural analysis of very high resolution images (SPOT and IRC orthoimagery) using two object-oriented approaches: the nearest neighbour approach (k-NN) and the Support Vector Machine approach (SVM). The k-NN approach gave good results for the delineation of wetlands (wet marshes and moors, ponds, artificial wetlands, water body edges, mountain wetlands, river edges and brackish marshes), with a kappa index higher than 85%.
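The kappa index reported above corrects raw classification agreement for the agreement expected by chance; a compact sketch of its computation (illustrative labels, not the study's wetland classes):

```python
def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement between two labelings,
    corrected for the agreement expected by chance from the two
    labelings' marginal class frequencies."""
    n = len(y_true)
    labels = set(y_true) | set(y_pred)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    expected = sum((y_true.count(l) / n) * (y_pred.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)
```

A kappa above 0.85, as reported for the k-NN delineation, indicates agreement well beyond what chance alone would produce.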

Keywords: land development, GIS, sand dunes, segmentation, remote sensing

Procedia PDF Downloads 43
2418 Feature Weighting Comparison Based on Clustering Centers in the Detection of Diabetic Retinopathy

Authors: Kemal Polat

Abstract:

In this paper, three feature weighting methods have been used to improve the classification performance for diabetic retinopathy (DR). To classify diabetic retinopathy, features extracted from the output of several retinal image processing algorithms, covering image-level, lesion-specific and anatomical components, were used and fed into the classifier algorithms. The dataset used in this study was taken from the University of California, Irvine (UCI) machine learning repository. Feature weighting methods, including fuzzy c-means clustering based feature weighting, subtractive clustering based feature weighting, and Gaussian mixture clustering based feature weighting, were used and compared with each other in the classification of DR. After feature weighting, five different classifier algorithms were used: multi-layer perceptron (MLP), k-nearest neighbor (k-NN), decision tree, support vector machine (SVM), and Naïve Bayes. The hybrid method combining subtractive clustering based feature weighting with the decision tree classifier obtained a classification accuracy of 100% in the screening of DR. These results demonstrate that the proposed hybrid scheme is very promising for the classification of medical datasets.
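One simple way to derive feature weights from clustering centers can be sketched as follows. This is a generic illustration only: the exact formulas behind the fuzzy c-means, subtractive and Gaussian-mixture variants in the paper differ, and the per-feature weight used here (grand mean divided by the mean of the cluster centers) is an assumed, simplified scheme.

```python
def cluster_center_weights(data, centers):
    """Per-feature weighting from cluster centers (assumed simple
    variant): divide each feature's grand mean over the data by the
    mean of the cluster centers for that feature, then rescale every
    sample by the resulting weight."""
    n_feat = len(data[0])
    weights = []
    for j in range(n_feat):
        grand = sum(row[j] for row in data) / len(data)
        center = sum(c[j] for c in centers) / len(centers)
        weights.append(grand / center)
    weighted = [[row[j] * weights[j] for j in range(n_feat)] for row in data]
    return weights, weighted
```

The weighted data, rather than the raw features, is then passed to the downstream classifiers (MLP, k-NN, decision tree, SVM, Naïve Bayes).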

Keywords: machine learning, data weighting, classification, data mining

Procedia PDF Downloads 314
2417 Potential of Mineral Composition Reconstruction for Monitoring the Performance of an Iron Ore Concentration Plant

Authors: Maryam Sadeghi, Claude Bazin, Daniel Hodouin, Laura Perez Barnuevo

Abstract:

The performance of a separation process is usually evaluated using performance indices calculated from the elemental assays readily available from the chemical analysis laboratory. However, separation process performance is essentially related to the properties of the minerals that carry the elements, not to those of the elements themselves. Since elements or metals can be carried by both valuable and gangue minerals in the ore, and since each mineral responds differently to a mineral processing method, the use of elemental assays alone could lead to erroneous or uncertain conclusions about process performance. This paper discusses the advantages of using performance indices calculated from mineral content, such as mineral recovery, for process performance assessment. A method is presented that uses elemental assays to estimate the mineral content of the solids in the various process streams. The method combines the stoichiometric composition of the minerals with mass conservation constraints for the minerals through the concentration process to estimate the mineral content from elemental assays. The advantage of assessing a concentration process using mineral-based performance indices is illustrated for an iron ore concentration circuit.
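When each element is carried by a known set of minerals, the mineral content follows from the stoichiometric linear system A·x = b, where A holds the weight fraction of each element in each mineral and b is the elemental assay. The following is a bare-bones sketch with a hypothetical two-mineral example (hematite carrying Fe, quartz carrying Si); the paper's full method additionally imposes mass-conservation constraints across streams rather than solving each assay in isolation.

```python
def solve_minerals(comp, assays):
    """Gauss-Jordan elimination on A x = b, where comp[e][m] is the
    weight fraction of element e in mineral m (from stoichiometry)
    and assays[e] is the elemental assay of the stream; the solution
    x is the estimated mineral content."""
    n = len(assays)
    A = [row[:] + [assays[i]] for i, row in enumerate(comp)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))  # partial pivoting
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(n):
            if r != col and A[col][col]:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][n] / A[i][i] for i in range(n)]
```

For example, with hematite (Fe₂O₃, about 69.94 wt% Fe) and quartz (SiO₂, about 46.74 wt% Si), assays of 34.97% Fe and 23.37% Si imply roughly 50% of each mineral.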

Keywords: data reconciliation, iron ore concentration, mineral composition, process performance assessment

Procedia PDF Downloads 198
2416 Effect of Temperature and Time on the Yield of Silica from Rice Husk Ash

Authors: Mohammed Adamu Musa, Shehu Saminu Babba

Abstract:

The technological trend towards waste utilization and cost reduction in industrial processing has made rice husk an attractive value-added material. Both rice husk (RH) and rice husk ash (RHA) have been found suitable for a wide range of domestic as well as industrial applications. The purpose of this research is therefore to produce high-grade sodium silicate from rice husk ash, taking the temperature and time of heating as the process variables. The experiment was performed by heating the rice husk at temperatures of 500 °C, 600 °C, 700 °C and 800 °C for 60, 90, 120 and 150 min to obtain the ash. A 1.0 M aqueous sodium hydroxide solution was used to dissolve the silicate from the ash, which contained crude sodium silicate. The ash was then neutralized by adding 5 M HCl until the pH reached 3.5, giving silica gel. At 600 °C and 120 min, a silica yield of 94.23% was obtained from the RHA. At higher temperatures (700 °C and 800 °C) the yield of silica was reduced due to surface melting and carbon fixation in the lattice caused by the presence of potassium. For this research, 600 °C is considered the optimum temperature for silica production from RHA. Silica produced from RHA can generate aggregate value and can be used in areas such as pulp and paper, plastics and rubber reinforcement.

Keywords: burning, rice husk, rice husk ash, silica, silica gel, temperature

Procedia PDF Downloads 221
2415 Development of Electronic Governance as an Element of Reforming State Governance: The Adjarian Example

Authors: Irakli Manvelidze, Genadi Iashvili, Giga Phartenadze, Giorgi Katamadze

Abstract:

Establishment of electronic governance in the region faces serious problems. Organizational, technical, social and methodological problems were identified by the research; these currently create serious barriers and prevent the development of effective e-governance. A lack of human resources, differences between the programme targets of the centre and the region, and low citizen awareness of the electronic governance project are further issues that should be mentioned. In spite of positive changes, the overall situation concerning the development of modern information and communication technologies in Adjara is not satisfactory. The information systems in the region can be described as undergoing a democratic transformation that needs serious reform. The current situation reflects unsystematic, uncoordinated actions, a chaotic rather than a coordinated, systematic process. Therefore, a strategic document, 'Adjarian Electronic Government', should be created to ensure the systematic development of electronic governance in the region. The implementation of the 'Adjarian Electronic Government' strategy should rest not only on conceptual and instrumental foundations but also on legal ones. A legal normative basis should be created which covers both the formation of the electronic government's instrumental basis and the creation of a united regional system of electronic document management. Meanwhile, the types of documents to be used in inter-institutional relations should be defined by legal norm. Creation of a united regional e-filing system will regulate regional public institutions and the relations between local self-government and public organizations, and will ensure the coordinated work of all regional public institutions.

Keywords: e-government, information society, public administration, reforming state governance, public institutions

Procedia PDF Downloads 271
2414 KSVD-SVM Approach for Spontaneous Facial Expression Recognition

Authors: Dawood Al Chanti, Alice Caplier

Abstract:

Sparse representations of signals have received a great deal of attention in recent years. In this paper, the interest of using sparse representation as a means of performing sparse discriminative analysis between spontaneous facial expressions is demonstrated. An automatic facial expression recognition system is presented. It uses a KSVD-SVM approach made of three main stages: a pre-processing and feature extraction stage, which solves the problem of shared subspace distribution using random projection theory to obtain low-dimensional discriminative and reconstructive features; a dictionary learning and sparse coding stage, which uses the KSVD model to learn discriminative under- or over-complete dictionaries for sparse coding; and finally a classification stage, which uses an SVM classifier for facial expression recognition. Our main concern is to be able to recognize non-basic affective states and non-acted expressions. Extensive experiments on the JAFFE static acted facial expression database and on the DynEmo dynamic spontaneous facial expression database exhibit very good recognition rates.
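The sparse coding stage can be illustrated with plain matching pursuit over a fixed dictionary, a simplified stand-in for the orthogonal matching pursuit usually paired with KSVD. The dictionary, the signal and the sparsity level below are toy assumptions; the KSVD dictionary-update step and the SVM classifier are omitted.

```python
import math

# Matching pursuit: greedily express a signal as a sparse combination of
# unit-norm dictionary atoms. This illustrates only the sparse-coding half
# of a KSVD pipeline.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_nonzero):
    """Return sparse coefficients and the final residual."""
    residual = list(signal)
    coeffs = [0.0] * len(atoms)
    for _ in range(n_nonzero):
        # pick the atom most correlated with the current residual
        scores = [dot(residual, a) for a in atoms]
        k = max(range(len(atoms)), key=lambda i: abs(scores[i]))
        coeffs[k] += scores[k]
        residual = [r - scores[k] * a for r, a in zip(residual, atoms[k])]
    return coeffs, residual

# Toy unit-norm dictionary in R^2 and a signal lying on the first atom.
atoms = [[1.0, 0.0], [0.0, 1.0], [math.sqrt(0.5), math.sqrt(0.5)]]
coeffs, residual = matching_pursuit([3.0, 0.0], atoms, n_nonzero=1)
print(coeffs)  # only the first atom is active
```

In the full system, such sparse coefficient vectors (computed over a learned dictionary) would serve as the features fed to the SVM.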

Keywords: dictionary learning, random projection, pose and spontaneous facial expression, sparse representation

Procedia PDF Downloads 284
2413 Influence of Ca, Sr and Ba Substitution on LaFeO₃ Performance During Chemical Looping Processes

Authors: Rong Sun, Laihong Shen

Abstract:

La-based perovskite oxygen carriers, especially doped La(M)FeO₃, show excellent performance during chemical looping processes. However, the mechanisms of undoped and doped La(M)FeO₃ are not yet clear, and clarifying them may help the development of chemical looping technologies. In this paper, a method based on density functional theory (DFT) was used to analyze the influence of Ca, Sr, and Ba doping at the La site on the electronic structure, and the CO oxidation mechanisms on the surfaces of LaFeO₃ and Ca-doped LaFeO₃ oxygen carriers were also analyzed. The results showed that the band gap was decreased by low-valence doping. Doping of the low-valence elements Ca, Sr, and Ba at the La site simultaneously shifted the valence band toward higher energy and made the valence band cross the Fermi level. This resulted from the holes generated by divalent ion substitution. These holes can change the total magnetization from antiferromagnetic to weakly ferromagnetic. The calculated oxygen vacancy formation energies showed that substitution of Ca, Sr, and Ba causes a large drop in the formation energy, indicating that bulk oxygen transport is improved. Based on the optimized undoped and Ca-doped LaFeO₃ (010) surfaces, CO adsorption was analyzed. The results indicated that the adsorption energy increased with divalent ion substitution, meaning that the adsorption stability decreased. These results can provide a theoretical basis for the development of perovskite oxides in chemical looping technologies.
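The oxygen vacancy formation energy referred to above is conventionally computed as E_f = E(defective) + ½E(O₂) − E(pristine). The sketch below applies that formula with placeholder total energies, not the paper's DFT values, merely to show the bookkeeping and the reported trend (a lower E_f for the Ca-doped cell).

```python
# Oxygen vacancy formation energy from total energies (eV):
#   E_f = E(defective cell) + 1/2 * E(O2 molecule) - E(pristine cell)
# A lower E_f means vacancies form more easily, i.e. better bulk oxygen
# transport. All energy values below are hypothetical placeholders.

def vacancy_formation_energy(e_defective, e_o2, e_pristine):
    return e_defective + 0.5 * e_o2 - e_pristine

E_O2 = -9.86                      # hypothetical O2 total energy (eV)
cells = {                         # hypothetical (pristine, defective) totals
    "LaFeO3":   (-310.20, -301.50),
    "Ca-doped": (-308.40, -300.90),
}

for name, (e_pristine, e_defective) in cells.items():
    ef = vacancy_formation_energy(e_defective, E_O2, e_pristine)
    print(name, round(ef, 2))
```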

Keywords: chemical looping technologies, lanthanum ferrate (LaFeO₃), divalent ion substitution, CO oxidation

Procedia PDF Downloads 92
2412 Ubiquitous Collaborative Learning Activities with Virtual Teams Using CPS Processes to Develop Creative Thinking and Collaboration Skills

Authors: Sitthichai Laisema, Panita Wannapiroon

Abstract:

This study is research and development intended to: 1) design ubiquitous collaborative learning activities with virtual teams using creative problem solving (CPS) processes to develop creative thinking and collaboration skills, and 2) assess the suitability of those learning activities. Its method is divided into two phases. Phase 1 is the design of the ubiquitous collaborative learning activities with virtual teams using CPS processes; phase 2 is the assessment of their suitability. The samples used in this study are five professionals in the fields of learning activity design, ubiquitous learning, information technology, creative thinking, and collaboration skills. The results showed that the learning activities consist of three main steps: 1) preparation before learning, 2) learning activity processing, and 3) performance appraisal. The professionals' assessment rated the suitability of the learning activities at the highest level.

Keywords: ubiquitous learning, collaborative learning, virtual team, creative problem solving

Procedia PDF Downloads 497
2411 Risk Assessment of Contamination by Heavy Metals in Sarcheshmeh Copper Complex of Iran Using TOPSIS Method

Authors: Hossein Hassani, Ali Rezaei

Abstract:

In recent years, the study of soil contamination around mines and smelting plants has attracted serious attention from environmental experts. Owing to their resistance to chemical degradation, these elements are counted among the stable and persistent environmental contaminants. The variability of these contaminants in the soil, together with time and financial limitations on environmental remediation, means that a sound ranking of the contaminants is needed for the further success of risk management processes and for reducing the risk of irreparable negative consequences for the environment. In this study, we use contamination factor indices, average concentration, enrichment factor and geoaccumulation indices to evaluate the metal contaminants Pb, Ni, Se, Mo and Zn in the soil of the Sarcheshmeh copper mine area. For this purpose, 120 surface soil samples down to a depth of 30 cm were collected from the study area, and the metals were analyzed using the ICP-MS method. Comparison of the heavy and potentially toxic element concentrations in the soil samples with the world average values for uncontaminated soil and the shale average indicates that Zn, Pb, Ni, Se and Mo are higher than the world average values, and only Ni is lower than the shale average. Expert opinions on the relative importance of each indicator were used to assign final weights to the metals, and the heavy metals were ranked using the TOPSIS approach. This allows efficient environmental proceedings, leading to the reduction of environmental risks from the contaminants. According to the results, Ni, Pb, Mo, Zn, and Se have the highest contamination risk in the soil samples of the study area.
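The TOPSIS ranking step can be sketched in a few lines: vector-normalize the decision matrix, apply the expert weights, find the ideal and anti-ideal solutions, and score each alternative by its relative closeness to the ideal. The decision matrix and weights below are hypothetical placeholders, not the study's actual indices.

```python
import math

# Minimal TOPSIS ranking sketch. Rows = alternatives (metals), columns =
# criteria (risk indices). Matrix values and weights are hypothetical.

def topsis(matrix, weights, benefit):
    """Score alternatives by closeness to the ideal solution.

    benefit[j] is True if a larger value is 'better' (here: riskier).
    """
    n_crit = len(matrix[0])
    # vector-normalize each column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best = math.dist(row, ideal)
        d_worst = math.dist(row, anti)
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Hypothetical risk indices for three metals over two criteria,
# both treated as "larger = higher risk".
metals = ["Ni", "Pb", "Zn"]
matrix = [[4.0, 3.5], [3.0, 3.0], [1.0, 1.5]]
scores = topsis(matrix, weights=[0.6, 0.4], benefit=[True, True])
ranking = sorted(zip(metals, scores), key=lambda t: -t[1])
print(ranking)
```

In the study, the five metals would be the alternatives and the expert-weighted indices the criteria; the closeness scores then give the contamination ranking.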

Keywords: contamination coefficient, geoaccumulation factor, TOPSIS techniques, Sarcheshmeh copper complex

Procedia PDF Downloads 260
2410 Hip Resurfacing Makes for Easier Surgery with Better Functional Outcomes at Time of Revision: A Case Controlled Study

Authors: O. O. Onafowokan, K. Anderson, M. R. Norton, R. G. Middleton

Abstract:

Revision total hip arthroplasty (THA) is known to be a challenging procedure with potential for poor outcomes. Due to its lack of metaphyseal encroachment, hip resurfacing arthroplasty (HRA) is classified as a bone-conserving procedure. Although the literature postulates that this is an advantage at the time of revision surgery, there is no evidence to either support or refute this claim. We identified 129 hips that had undergone HRA and 129 controls undergoing first revision THA. We recorded the clinical assessment and survivorship of implants in a multi-surgeon, single-centre, retrospective case control series for both arms, matched for age and sex. Data collected included demographics, indications for surgery, Oxford Hip Score (OHS), length of surgery, length of hospital stay, blood transfusion, implant complexity and further surgical procedures. Significance was taken as p < 0.05. Mean follow-up was 7.5 years (1 to 15). There was a significant 6-point difference in postoperative OHS in favour of the revision resurfacing group (p=0.0001). The revision HRA group recorded a 48-minute shorter length of surgery (p<0.0001), a 2-day shorter hospital stay (p=0.018), a reduced need for blood transfusion (p=0.0001), a need for less complex revision implants (p=0.001) and a reduced probability of further surgery being required (p=0.003). Whilst we acknowledge the limitations of this study, our results suggest that, in contrast to THA, the bone conservation element of HRA may make for a less traumatic revision procedure with better functional outcomes. Use of HRA has declined dramatically as a result of concerns regarding metallosis. However, this information remains relevant when counselling young active patients about their arthroplasty options and may become pertinent in the future if the promise of ceramic hip resurfacing is ever realized.

Keywords: hip resurfacing, metallosis, revision surgery, total hip arthroplasty

Procedia PDF Downloads 74
2409 Secure Text Steganography for Microsoft Word Document

Authors: Khan Farhan Rafat, M. Junaid Hussain

Abstract:

Steganography is the seamless modification of an entity for the purpose of hiding a message of significance inside its substance, in a manner that keeps the embedding oblivious to an observer. Together with today's pervasive computing frameworks, steganography has developed into a science that offers an assortment of strategies for covert communication across the globe, strategies which nonetheless need critical appraisal from a security-breach standpoint. Microsoft Word is among the most widely used word processing applications, distributed as part of the Microsoft Office suite. With its user-friendly graphical interface and rich text editing and formatting features, the documents produced with this software are also well suited to covert communication. This research aims not only to epitomize the fundamental concepts of steganography but also to expound on the use of Microsoft Word documents as carriers for covert message exchange. The effort is to examine contemporary message hiding schemes from a security aspect, so as to present the exploratory findings and suggest enhancements which may serve as a wellspring of information to encourage such future research endeavors.
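As a generic illustration of text steganography (not one of the specific Word-document schemes the paper surveys), a message can be hidden in zero-width Unicode characters appended to a visible cover text; such characters survive in word-processor documents while remaining invisible on screen. This is deliberately simple and easily detected.

```python
# Toy text steganography: hide bits as zero-width characters appended to a
# cover text. The visible text is unchanged, but the secret is trivially
# recoverable by anyone who looks at the raw characters.

ZW0, ZW1 = "\u200b", "\u200c"  # zero-width space / non-joiner encode 0 / 1

def embed(cover, secret):
    bits = "".join(f"{b:08b}" for b in secret.encode("utf-8"))
    return cover + "".join(ZW1 if bit == "1" else ZW0 for bit in bits)

def extract(stego):
    bits = "".join("1" if ch == ZW1 else "0"
                   for ch in stego if ch in (ZW0, ZW1))
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

stego = embed("An innocuous sentence.", "meet at 9")
print(extract(stego))  # recovers the hidden message
```

A security appraisal of the kind the paper calls for would note that the mere presence of zero-width characters is itself a detectable signature.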

Keywords: hiding information in plain sight, stealth communication, oblivious information exchange, conceal, steganography

Procedia PDF Downloads 226
2408 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 through sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to reduce the inaccuracies, weaknesses, and biases of any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall 92% model accuracy. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
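The framework's ensembling step, averaging the outputs of the top-performing models, can be sketched as follows. The three "models" below are stand-in callables with invented decision rules, not the study's trained logistic regression, random forest, and neural network.

```python
# Sketch of probability-averaging ensembling: average P(unhealthy air)
# across several models, then threshold the mean. The per-model probability
# functions and the input features are hypothetical placeholders.

def ensemble_predict(models, features, threshold=0.5):
    """Average each model's probability estimate, then threshold."""
    p = sum(m(features) for m in models) / len(models)
    return p, ("unhealthy" if p >= threshold else "good")

# Hypothetical per-model probability functions of (temp_C, wind_kmh).
model_a = lambda x: 0.9 if x["wind_kmh"] < 5 else 0.2
model_b = lambda x: 0.7 if x["temp_C"] > 30 else 0.3
model_c = lambda x: 0.8 if x["wind_kmh"] < 5 and x["temp_C"] > 30 else 0.4

obs = {"temp_C": 34, "wind_kmh": 3}  # a hot, stagnant day
p, label = ensemble_predict([model_a, model_b, model_c], obs)
print(round(p, 2), label)
```

Averaging dampens any single model's idiosyncratic errors, which is the stated motivation for combining the three top performers rather than trusting one.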

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 103
2407 On Lie-Central Derivations and Almost Inner Lie-Derivations of Leibniz Algebras

Authors: Natalia Pacheco Rego

Abstract:

The Liezation functor is a map from the category of Leibniz algebras to the category of Lie algebras, which assigns to a Leibniz algebra the Lie algebra given by the quotient of the Leibniz algebra by the ideal spanned by its square elements. This functor is left adjoint to the inclusion functor that regards a Lie algebra as a Leibniz algebra. This environment fits into the framework of central extensions and commutators in semi-abelian categories with respect to a Birkhoff subcategory, where classical or absolute notions are relative to the abelianization functor. Classical properties of Leibniz algebras (properties relative to the abelianization functor) have been adapted to the relative setting (with respect to the Liezation functor); in general, absolute properties have corresponding relative ones, but not all absolute properties immediately hold in the relative case, so new requirements are needed. Following this line of research, an analysis of central derivations of Leibniz algebras relative to the Liezation functor, called Lie-derivations, was conducted, and a characterization of Lie-stem Leibniz algebras by their Lie-central derivations was obtained. In this paper, we present an overview of these results, and we analyze some new properties concerning Lie-central derivations and almost inner Lie-derivations. Namely, a Leibniz algebra is a vector space equipped with a bilinear bracket operation satisfying the Leibniz identity. We define the Lie-bracket by [x, y]Lie = [x, y] + [y, x], for all x, y. The Lie-center of a Leibniz algebra is the two-sided ideal of elements that annihilate all elements of the Leibniz algebra through the Lie-bracket. A Lie-derivation is a linear map which acts as a derivation with respect to the Lie-bracket. Obviously, usual derivations are Lie-derivations, but the converse is not true in general. A Lie-derivation is called a Lie-central derivation if its image is contained in the Lie-center.
A Lie-derivation is called an almost inner Lie-derivation if the image of each element x is contained in the Lie-commutator of x and the Leibniz algebra. The main results we present in this talk refer to the conditions under which Lie-central derivations and almost inner Lie-derivations coincide.
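Restating only what the abstract itself defines, the central notions can be written compactly as:

```latex
% Lie-bracket on a Leibniz algebra q:
[x, y]_{\mathrm{Lie}} = [x, y] + [y, x], \qquad x, y \in q.

% Lie-center:
Z_{\mathrm{Lie}}(q) = \{\, z \in q \;:\; [z, x]_{\mathrm{Lie}} = 0
  \ \text{for all } x \in q \,\}.

% A linear map d : q \to q is a Lie-derivation if
d([x, y]_{\mathrm{Lie}}) = [d(x), y]_{\mathrm{Lie}} + [x, d(y)]_{\mathrm{Lie}};

% it is Lie-central if d(q) \subseteq Z_{\mathrm{Lie}}(q),
% and almost inner if d(x) \in [x, q]_{\mathrm{Lie}} for all x \in q.
```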

Keywords: almost inner Lie-derivation, Lie-center, Lie-central derivation, Lie-derivation

Procedia PDF Downloads 122
2406 Academic Achievement in Argentinean College Students: Major Findings in Psychological Assessment

Authors: F. Uriel, M. M. Fernandez Liporace

Abstract:

In the last decade, academic achievement in higher education has become a topic on the agenda in Argentina, given the high rates of adjustment problems, academic failure and dropout, and the low graduation rates in a context of massive classes and traditional teaching methods. Psychological variables, such as perceived social support, academic motivation and learning styles and strategies, have much to offer, since their measurement by tests allows a proper diagnosis of their influence on academic achievement. Framed in a major research project, several studies analysed multiple samples totaling 5,135 students attending Argentinean public universities. The first goal was the identification of statistically significant differences in the psychological variables (perceived social support, learning styles, learning strategies, and academic motivation) by age, gender, and degree of academic advance (freshmen versus sophomores). Thus, an inferential group-differences study for each psychological dependent variable was conducted by means of Student's t-tests, given the features of the data distribution. The second goal, examining associations between the four psychological variables on the one hand and academic achievement on the other, was addressed by correlational studies, calculating Pearson's coefficients and employing grades as the quantitative indicator of academic achievement. The positive and significant results obtained led to the formulation of different predictive models of academic achievement, which had to be tested in terms of fit and predictive power. These models took the four psychological variables mentioned above as predictors, using regression equations, examining predictors individually, in pairs, and together, analysing indirect effects as well, and adding the degree of academic advance and gender, whose importance had been shown in the findings for the first goal.
The most relevant results were as follows. First, gender showed no influence on any dependent variable. Second, only good achievers perceived high social support from teachers, and male students were prone to perceive less social support. Third, freshmen exhibited a pragmatic learning style, preferring unstructured environments, the use of examples and simultaneous-visual processing in learning, whereas sophomores manifested an assimilative learning style, choosing sequential and analytic processing modes. Fourth, despite these preferences, freshmen have to deal with abstract contents and sophomores with practical learning situations, owing to the study programs in force. Fifth, no differences in academic motivation were found between freshmen and sophomores; however, the latter employ a larger number of more efficient learning strategies. Sixth, low-achieving freshmen lack intrinsic motivation. Seventh, model testing showed that social support, learning styles and academic motivation influence learning strategies, which affect academic achievement in freshmen, particularly males; only learning styles influence achievement in sophomores of both genders, with direct effects. These findings lead to the conclusion that educational psychologists, education specialists, teachers, and universities must plan urgent and major changes. These must be applied in renewed and better study programs, syllabi and classes, as well as tutoring and training systems. Such developments should target the support and empowerment of students along their academic pathways, and therefore the upgrade of learning quality, especially in the case of freshmen, male freshmen, and low achievers.

Keywords: academic achievement, academic motivation, coping, learning strategies, learning styles, perceived social support

Procedia PDF Downloads 107
2405 Towards a Distributed Computation Platform Tailored for Educational Process Discovery and Analysis

Authors: Awatef Hicheur Cairns, Billel Gueni, Hind Hafdi, Christian Joubert, Nasser Khelifa

Abstract:

Given the ever-changing needs of the job market, education and training centers are increasingly held accountable for student success. They therefore have to focus on ways to streamline their offerings and educational processes in order to achieve the highest quality in curriculum contents and managerial decisions. Educational process mining is an emerging field within the educational data mining (EDM) discipline, concerned with developing methods to discover, analyze and visually represent complete educational processes. In this paper, we present our distributed computation platform, which allows different education centers and institutions to load their data and access advanced data mining and process mining services. To this end, we also present a comparative study of the clustering techniques developed in the context of process mining for efficiently partitioning educational traces. Our goal is to find the best strategy for distributing heavy analysis computations over the many processing nodes of our platform.
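Trace clustering of the kind compared in the paper can be sketched by representing each student trace as an activity-frequency vector and assigning it to the nearest seed profile, i.e., one assignment step of k-means. The activity names, traces and seed choice are invented for illustration and are not the platform's actual technique.

```python
# Sketch of trace clustering for educational process mining: encode each
# trace as an activity-frequency vector, then assign it to the nearest of
# k seed profiles (a single k-means assignment step). All data is invented.

ACTIVITIES = ["enroll", "lecture", "quiz", "retake", "exam"]

def to_vector(trace):
    """Activity-frequency encoding of a trace."""
    return [trace.count(a) for a in ACTIVITIES]

def sq_dist(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def assign(traces, seeds):
    """Map each trace to the index of its nearest seed profile."""
    return [min(range(len(seeds)),
                key=lambda k: sq_dist(to_vector(t), seeds[k]))
            for t in traces]

traces = [
    ["enroll", "lecture", "quiz", "exam"],                      # direct path
    ["enroll", "lecture", "quiz", "retake", "retake", "exam"],  # struggling
    ["enroll", "lecture", "lecture", "quiz", "exam"],
]
seeds = [to_vector(traces[0]), to_vector(traces[1])]  # two seed profiles
print(assign(traces, seeds))
```

Partitioning traces this way is what allows the heavy per-cluster analysis to be distributed across processing nodes.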

Keywords: educational process mining, distributed process mining, clustering, distributed platform, educational data mining, ProM

Procedia PDF Downloads 440