Search results for: and document distances
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 413

83 Evaluation of the Acoustic Performance of Classrooms in Algerian Teaching Schools

Authors: Bouttout Abdelouahab, Amara Mohamed, Djakabe Saad, Remram Youcef

Abstract:

This paper presents the results of an evaluation of acoustic comfort, namely background noise and reverberation time, in teaching rooms of the Higher National School of Civil Engineering, Algeria. Four teaching rooms were evaluated: a conference room, two classrooms and an amphitheatre. The acoustic quality of the classrooms was analyzed on the basis of measurements of sound pressure level inside each room and of reverberation time. The measurement results show that the impulse decay depends on the position of the microphone inside the room and that the background noise is in agreement with the National Official Journal of Algeria published in July 1993. However, in some classrooms there is a discrepancy between the measured reverberation time and the recommended value. Three methods are proposed to reduce the reverberation time in such rooms. We developed a program in FORTRAN 6.0 based on the acoustic absorption values of the Technical Document Regulation (DTR C3.1.1). The results of this paper can be used to regulate construction and to carry out acoustic rehabilitation of teaching rooms in Algeria, especially classrooms for pupils in primary and secondary schools.
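
The authors' FORTRAN program is not reproduced here, but the relation it builds on can be illustrated with Sabine's formula, T60 = 0.161·V/A. The Python sketch below uses made-up absorption coefficients, not the DTR C3.1.1 values:

```python
# Minimal sketch: estimating reverberation time with Sabine's formula,
# T60 = 0.161 * V / A, where A is the total equivalent absorption area.
# The absorption coefficients below are illustrative placeholders, not the
# DTR C3.1.1 values used by the authors' FORTRAN program.

def sabine_rt60(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) tuples."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

# Example classroom: 200 m3 volume, plastered walls/ceiling, concrete floor, glazing.
surfaces = [
    (150.0, 0.03),  # walls + ceiling, painted plaster (illustrative alpha)
    (50.0, 0.02),   # concrete floor
    (12.0, 0.18),   # glazing
]
print(f"Estimated RT60: {sabine_rt60(200.0, surfaces):.2f} s")
```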

Keywords: Room acoustic, reverberation time, background noise, absorptions materials.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2658
82 RUPSec: An Extension on RUP for Developing Secure Systems - Requirements Discipline

Authors: Mohammad Reza Ayatollahzadeh Shirazi, Pooya Jaferian, Golnaz Elahi, Hamid Baghi, Babak Sadeghian

Abstract:

The world is moving rapidly toward the deployment of information and communication systems. Computing systems are now found everywhere, and one of the main challenges for these systems is the increasing number of attacks and security threats against them. Thus, capturing, analyzing and verifying security requirements becomes a very important activity in the development process of computing systems, especially for systems such as banking, military and e-business systems. For developing any system, a process model, which includes a process, methods and tools, is chosen. The Rational Unified Process (RUP) is one of the most popular and complete process models and has been widely used by developers in recent years. This process model should be extended to be usable for developing secure software systems. In this paper, the Requirements Discipline of RUP is extended to improve RUP for developing secure software systems. The proposed extensions add and integrate a number of Activities, Roles, and Artifacts into RUP in order to capture, document and model the threats and security requirements of a system. These extensions give developers a group of clear and stepwise activities. By following these activities, developers ensure that security requirements are captured and modeled. These models are then used in the design, implementation and test activities.

Keywords:

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2760
81 Model Canvas and Process for Educational Game Design in Outcome-Based Education

Authors: Ratima Damkham, Natasha Dejdumrong, Priyakorn Pusawiro

Abstract:

This paper explores a solution to help game designers design educational games using the digital educational game model canvas (DEGMC) and the digital educational game form (DEGF), based on an Outcome-Based Education program. DEGMC and DEGF help designers develop an overview of the game while designing and planning it; these tools allow designers to clearly assess players' abilities against learning outcomes and support the design of game-based learning. Designers can balance educational content and entertainment by drawing on the strategies of the Business Model Canvas, and can design the gameplay and the assessment of players' abilities from the learning outcomes they need by referring to Constructive Alignment. Furthermore, they can use the resulting design plan to write their Game Design Document (GDD). The success of the research was evaluated from the perspectives of four experts in the education and computing fields. In the experiments, the canvas and form helped game designers model their games according to the learning outcomes and analyze their own game elements. This method can serve as a path for future research on educational game design.

Keywords: Constructive alignment, constructivist theory, educational game, outcome-based education.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 769
80 Methodology Issues and Design Approach of VLE on Mathematical Concepts Acquisition within Secondary Education in England

Authors: Aaron A. R. Nwabude

Abstract:

This study used a positivist quantitative approach to examine the acquisition of mathematical concepts by KS4 (14-16) Special Education Needs (SEN) students within the school sector in England. The research is based on a pilot study, and the design is holistic in its approach, mixing methodologies: the study combines qualitative and quantitative methods in gathering formative data for the design process. Although the approach could best be described as mixed methods, it rests fundamentally on a strong positivist paradigm; hence the earlier attention to the differentiation of the students, the student-teacher body and the various indicators being measured, which requires an attenuated description of individual research subjects. The design process involves four phases with five key stages: literature review and document analysis, survey, interview, and observation, followed by analysis of the data set. The research identified the need for triangulation, with Reid's phases of data management providing the scaffold for the study. The study clearly identified the ideological and philosophical aspects of educational research design for the study of mathematics by special education needs (SEN) students in England using a virtual learning environment (VLE) platform.

Keywords: VLE, Special Education Needs, Key Stage 4, School, Mathematics, Concepts Acquisition.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1936
79 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect changes in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracy and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors contain uncertainties and are sometimes conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict, in order to achieve fast change detection and deal effectively with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values across all sensors. Furthermore, we apply the method to estimate the minimum number of sensors that need to be combined, so that computational efficiency can be improved. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
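
As a rough illustration of two of the building blocks named above, the sketch below computes a Gaussian Kullback-Leibler divergence and runs a CUSUM test on per-sample log-likelihood ratios; the evidence-theory stage (mass functions, combination rules, pignistic probabilities) is omitted, and all parameters are synthetic:

```python
import numpy as np

def kl_gaussian(mu0, var0, mu1, var1):
    """KL( N(mu0,var0) || N(mu1,var1) ) for univariate Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def cusum(scores, threshold):
    """Page's CUSUM on per-sample scores; returns first alarm index or None."""
    s = 0.0
    for t, x in enumerate(scores):
        s = max(0.0, s + x)
        if s > threshold:
            return t
    return None

rng = np.random.default_rng(0)
mu0, mu1, var = 0.0, 1.0, 1.0          # pre-/post-change means, shared variance
data = np.concatenate([rng.normal(mu0, np.sqrt(var), 500),
                       rng.normal(mu1, np.sqrt(var), 500)])

# Per-sample log-likelihood ratio for the two Gaussian hypotheses.
scores = (mu1 - mu0) / var * (data - (mu0 + mu1) / 2.0)

print("KL(pre || post):", kl_gaussian(mu0, var, mu1, var))
print("alarm raised at sample:", cusum(scores, threshold=10.0))
```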

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 952
78 Proposals for the Thermal Regulation of Buildings in Algeria: An Energy Label for Social Housing

Authors: Marco Morini, Nicolandrea Calabrese, Dario Chello

Abstract:

Despite the international commitment of Algeria towards the development of energy efficiency and renewable energy in the country, the internal energy demand has been continuously growing during the last decade due to the substantial increase of population and of living conditions, which in turn has led to an unprecedented expansion of the residential building sector. The RTB (Thermal Building Regulation) is the technical document that establishes the calculation framework for the thermal performance of buildings in Algeria, setting up minimum obligatory targets for the thermal performance of new buildings. An update of this regulation is due in the coming years and this paper discusses some proposals in this regard, with the aim to improve the energy efficiency of the building sector, particularly with regard to social housing. In particular, it proposes a methodology for drafting an energy performance label of new Algerian residential buildings, moving from the results of the thermal compliance verification and sizing of technical systems as defined in the RTB. Such an energy performance label – whose calculation method is briefly described in the paper – aims to raise citizens' awareness of the benefits of energy efficiency. It can represent the first step in a process of integrating technical installations into the calculation of the energy performance of buildings in Algeria.

Keywords: Building, energy certification, energy efficiency, social housing, international cooperation, Mediterranean Region.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 528
77 Properties of Bacterial Nanocellulose for Scenic Arts

Authors: B. Suárez, G. Forman

Abstract:

Kombucha (a symbiotic culture of bacteria and yeast) produces a material capable of acquiring multiple shapes and textures that change significantly under different environmental or temperature conditions (e.g., when it is exposed to wet conditions), properties that may be explored in the scenic industry. This paper presents an analysis of its specific characteristics, exploring it as a non-conventional material for art and performance. Costume design uses surfaces as a powerful means of expression to represent concepts and stories, and it may apply the unique features of nano bacterial cellulose (NBC) as assets in this artistic context. A mix of qualitative and quantitative (interventionist) methodological approaches was used, such as a review of relevant literature to deepen knowledge of the research topic (crossing bibliographies from different fields of study: biology, art, costume design, etc.), as well as descriptive methods: laboratory experiments, documentation of quantities, and observation to identify material properties and the possibilities of using them to express multiple narrative ideas, concepts and feelings. The results confirmed that NBC is an interactive and versatile material viable for use in an alternative scenic context; its unique aesthetic and performative qualities, which change in contact with moisture, are resources that can be used to create a visual and poetic impact on stage.

Keywords: Biotechnological materials, contemporary dance, costume design, nano bacterial cellulose, performing arts.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 428
76 A Novel Approach for Protein Classification Using Fourier Transform

Authors: A. F. Ali, D. M. Shawky

Abstract:

Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a new approach for protein classification. Proteins that are evolutionarily, and thereby functionally, related are said to belong to the same classification. Identifying protein classifications is of fundamental importance for documenting the diversity of the known protein universe. It also provides a means to determine the functional roles of newly discovered protein sequences. Our goal is to predict the functional classification of novel protein sequences based on a set of features extracted from each protein sequence. The proposed technique uses datasets extracted from the Structural Classification of Proteins (SCOP) database. A set of spectral domain features based on the Fast Fourier Transform (FFT) is used. The proposed classifier uses a multilayer back-propagation (MLBP) neural network for protein classification. The maximum classification accuracy is about 91% when applying the classifier to the full four levels of the SCOP database; however, it reaches a maximum of 96% when the classification is limited to the family level. The classification results reveal that the spectral domain contains information that can be used for classification with high accuracy. In addition, the results emphasize that sequence similarity measures are of great importance, especially at the family level.
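
A minimal sketch of the spectral feature extraction step, assuming an illustrative numeric encoding of residues (not the paper's encoding) and a fixed-length FFT magnitude vector handed to a downstream classifier:

```python
import numpy as np

# Sketch: map residues to numbers, take the FFT, and keep the magnitude
# spectrum as a fixed-length feature vector. The numeric encoding below is an
# illustrative placeholder; the paper's exact encoding and the MLBP classifier
# are not reproduced here.

AA = "ACDEFGHIKLMNPQRSTVWY"
ENCODING = {aa: i / len(AA) for i, aa in enumerate(AA)}  # placeholder values

def fft_features(sequence, n_features=64):
    signal = np.array([ENCODING[aa] for aa in sequence if aa in ENCODING])
    spectrum = np.abs(np.fft.rfft(signal, n=2 * n_features))
    norm = np.linalg.norm(spectrum[:n_features]) or 1.0
    return spectrum[:n_features] / norm

features = fft_features("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
print(features.shape)  # (64,) feature vector, ready for a neural-network classifier
```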

Keywords: Bioinformatics, Artificial Neural Networks, Protein Sequence Analysis, Feature Extraction.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2305
75 Providing a Secure, Reliable and Decentralized Document Management Solution Using Blockchain by a Virtual Identity Card

Authors: Meet Shah, Ankita Aditya, Dhruv Bindra, V. S. Omkar, Aashruti Seervi

Abstract:

In today's world, we need documents everywhere for a smooth workflow in identification processes and other security contexts. The current systems and techniques used for identification all require one thing, 'proof of existence', which involves valid documents, for example, educational or financial records. The main issue with the current identity and access management systems and digital identification processes is that they are centralized, which makes them inefficient. This paper presents a system that resolves these issues. It is based on blockchain technology, a decentralized system that allows transactions in a decentralized and immutable manner. The primary notion of the model is to 'have everything with nothing'. It involves inter-linking a person's required documents with a single identity card, so that a person can go anywhere without carrying the required documents. The person just needs to be physically present at the place where documents are necessary, and the rest of the verification proceeds using a fingerprint impression and an iris scan. Furthermore, some technical overheads and advancements are discussed. This paper also aims to lay out a far-sighted scenario of blockchain and its impact on future trends.
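
A toy sketch of the core linking idea, storing only SHA-256 fingerprints of documents in hash-chained records tied to a virtual identity; the identifiers and payloads are hypothetical and this is not the authors' implementation:

```python
import hashlib
import json
import time

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def make_block(identity_id, document_bytes, prev_hash):
    """Chain a document fingerprint to an identity; the document itself is not stored."""
    record = {
        "identity": identity_id,
        "doc_hash": sha256(document_bytes),   # only the fingerprint goes on-chain
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    record["block_hash"] = sha256(json.dumps(record, sort_keys=True).encode())
    return record

genesis = make_block("ID-001", b"degree certificate scan", prev_hash="0" * 64)
nxt = make_block("ID-001", b"bank statement scan", prev_hash=genesis["block_hash"])
print(nxt["doc_hash"], nxt["prev_hash"] == genesis["block_hash"])
```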

Keywords: Blockchain, decentralized system, fingerprint impression, identity management, iris scan.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1225
74 Lamb Wave Wireless Communication in Healthy Plates Using Coherent Demodulation

Authors: Rudy Bahouth, Farouk Benmeddour, Emmanuel Moulin, Jamal Assaad

Abstract:

Guided ultrasonic waves are used in Non-Destructive Testing and Structural Health Monitoring for inspection and damage detection. Recently, wireless data transmission using ultrasonic waves in solid metallic channels has gained popularity in industrial applications such as nuclear, aerospace and smart vehicles. The idea is to find a good substitute for electromagnetic waves, since they are highly attenuated near metallic components due to Faraday shielding. The proposed solution is to use ultrasonic guided waves, such as Lamb waves, as information carriers, due to their capability of propagating over long distances. In addition, valuable information about the health of the structure can be extracted simultaneously. In this work, the frequency bandwidth reliable for communication is first extracted experimentally from dispersion curves. An experimental platform for wireless communication using Lamb waves is then described and built. Next, a coherent demodulation algorithm used in telecommunications is tested for the Amplitude Shift Keying, On-Off Keying and Binary Phase Shift Keying modulation techniques. Signal processing parameters such as the threshold choice, the number of cycles per bit and the bit rate are optimized. Experimental results are compared based on the average bit error percentage. The results show a high sensitivity to threshold selection for the Amplitude Shift Keying and On-Off Keying techniques, resulting in a bit rate decrease. The Binary Phase Shift Keying technique shows the highest stability and data rate among all tested modulation techniques.
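
For readers unfamiliar with coherent demodulation, the sketch below simulates BPSK detection by correlating the received signal with a synchronized carrier and thresholding per bit; the carrier frequency, bit rate and noise level are illustrative, not the Lamb-wave experimental settings:

```python
import numpy as np

# Minimal coherent BPSK demodulation: multiply the received waveform by a
# synchronized carrier, integrate over each bit interval, and threshold the sign.
fs, fc, bit_rate = 1_000_000, 50_000, 5_000      # sample rate, carrier, bits/s (Hz)
samples_per_bit = fs // bit_rate
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])

t = np.arange(len(bits) * samples_per_bit) / fs
carrier = np.cos(2 * np.pi * fc * t)
symbols = np.repeat(2 * bits - 1, samples_per_bit)          # BPSK: 0 -> -1, 1 -> +1
rx = symbols * carrier + 0.3 * np.random.default_rng(1).standard_normal(t.size)

# Correlate with the carrier, sum over each bit interval, decide by sign.
correlated = (rx * carrier).reshape(len(bits), samples_per_bit).sum(axis=1)
decoded = (correlated > 0).astype(int)
print("bit errors:", int(np.sum(decoded != bits)))
```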

Keywords: Lamb Wave Communication, wireless communication, coherent demodulation, bit error percentage.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 503
73 Ordinal Regression with Fenton-Wilkinson Order Statistics: A Case Study of an Orienteering Race

Authors: Joonas Pääkkönen

Abstract:

In sports, individuals and teams are typically interested in final rankings. Final results, such as times or distances, dictate these rankings, also known as places. Places can be further associated with ordered random variables, commonly referred to as order statistics. In this work, we introduce a simple, yet accurate order statistical ordinal regression function that predicts relay race places with changeover-times. We call this function the Fenton-Wilkinson Order Statistics model. This model is built on the following educated assumption: individual leg-times follow log-normal distributions. Moreover, our key idea is to utilize Fenton-Wilkinson approximations of changeover-times alongside an estimator for the total number of teams as in the notorious German tank problem. This original place regression function is sigmoidal and thus correctly predicts the existence of a small number of elite teams that significantly outperform the rest of the teams. Our model also describes how place increases linearly with changeover-time at the inflection point of the log-normal distribution function. With real-world data from Jukola 2019, a massive orienteering relay race, the model is shown to be highly accurate even when the size of the training set is only 5% of the whole data set. Numerical results also show that our model exhibits smaller place prediction root-mean-square-errors than linear regression, mord regression and Gaussian process regression.
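
A simplified sketch of the place-regression idea under the stated log-normal assumption: fit a log-normal CDF F to cumulative changeover times and predict place as N·F(t). The synthetic data below stand in for the Jukola 2019 set:

```python
import numpy as np
from scipy.stats import lognorm

# If cumulative changeover-times are approximately log-normal, the expected
# place at time t is roughly N * F(t), where F is the fitted log-normal CDF and
# N is the (estimated) number of teams. Parameter values are made up.
rng = np.random.default_rng(42)
n_teams = 300
changeover_times = rng.lognormal(mean=8.0, sigma=0.15, size=n_teams)  # seconds

shape, loc, scale = lognorm.fit(changeover_times, floc=0.0)            # fit F

def predict_place(t, n_estimate=n_teams):
    """Sigmoidal place regression: place ~ N * F(t)."""
    return n_estimate * lognorm.cdf(t, shape, loc=loc, scale=scale)

for t in np.quantile(changeover_times, [0.05, 0.5, 0.95]):
    print(f"time {t:7.1f} s -> predicted place {predict_place(t):6.1f}")
```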

Keywords: Fenton-Wilkinson approximation, German tank problem, log-normal distribution, order statistics, ordinal regression, orienteering, sports analytics, sports modeling.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 766
72 An Analysis of the Five Most Used Numerals and a Proposal for the Adoption of a Universally Acceptable Numeral (UAN)

Authors: Mufutau Ayinla Abdul-Yakeen

Abstract:

This analysis of the five most used numerals and proposal for the adoption of a Universally Acceptable Numeral (UAN) arose from the researcher's inquisitiveness about the need for a set of numerals that is universally accepted. The researcher sought the meaning of the first letter, "Nun", "ن", of the first verse of Suratul-Kalam (Chapter of the Pen), the sixty-eighth chapter of the Holy Qur'an. It was observed that, up to the moment of making this enquiry, there was no universally accepted, economical, explainable, linkable and consistent set of numerals used by all scientists. As a theoretical paper, an explanatory method is used to review five of the most used numerals (tally marks, Roman figures, Hindu-Arabic, Arabic, and Chinese), from which the urgent need for a universally accepted, economical, explainable, linkable and consistent set of numerals arises. The study puts forward ., I, \, _, L, U, =, C, O, 9, and 1. to be used as the numerals 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 and 10, respectively, as a set of universally acceptable, economical, explainable, linkable, sustainable, convertible and consistent numerals that originates from Islam. They can be called Islameconumerals or UAN. With the UAN, everything dropped, written, drawn and/or scribbled has meaning(s), as postulated by the first verse of Qur'an 68, and everyone can easily document all figures within the shortest period. It is suggested that there should be a discipline called Numeralnomics (the study of the optimum utilization of numerals) and that everybody should start using the UAN now, in order to know its strengths and weaknesses, so as to suggest a better and acceptable set of numerals to interested readers. A similar study can be conducted for alphabets.

Keywords: Islameconumerals, economical, Universally Acceptable Numerals (UAN), numeralnomics.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 746
71 End-to-End Spanish-English Sequence Learning Translation Model

Authors: Vidhu Mitha Goutham, Ruma Mukherjee

Abstract:

The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free, open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path to higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source language detection. We obtain competitive results using a duo-lingo-corpus-trained model, providing a prospective, ready-made plug-in for compound sentences and document translations. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it is nevertheless a well-optimized deep neural network model and solution.

Keywords: Attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 419
70 Travel Time Evaluation of an Innovative U-Turn Facility on Urban Arterial Roadways

Authors: Ali Pirdavani, Tom Brijs, Tom Bellemans, Geert Wets, Koen Vanhoof

Abstract:

Signalized intersections on high-volume arterials are often congested during peak hours, causing a decrease in through-movement efficiency on the arterial. Much of the vehicle delay incurred at conventional intersections is caused by high left-turn demand. Unconventional intersection designs attempt to reduce intersection delay and travel time by rerouting left turns away from the main intersection and replacing them with a right turn followed by a U-turn. The proposed new type of U-turn intersection is geometrically designed with a raised island that provides a protected U-turn movement. In this study, several scenarios based on different distances between the U-turn and the main intersection, the traffic volume of the major/minor approaches and the percentage of left-turn volumes were simulated with AIMSUN, a traffic microsimulation software package. Subsequently, models are proposed to compute the travel time of each movement. By correlating these equations with field data collected at implemented U-turn facilities, the reliability of the proposed models is confirmed. With these models it is possible to calculate the travel time of each movement under any geometric and traffic condition. By comparing the travel time of a conventional signalized intersection with the U-turn intersection travel time, it would be possible to decide whether or not to convert signalized intersections into this new kind of U-turn facility; however, such a comparison of travel times is not within the scope of this research, and in this paper only the travel time of the innovative U-turn facility is predicted. According to before-and-after studies of the traffic performance of some executed U-turn facilities, this new type of U-turn facility commonly produces lower travel times. Thus, the use of this type of unconventional intersection should be seriously considered.

Keywords: Innovative U-turn facility, Microsimulation, Travel time, Unconventional intersection design.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1305
69 Analysis of Linguistic Disfluencies in Bilingual Children’s Discourse

Authors: Sheena Christabel Pravin, M. Palanivelan

Abstract:

Speech disfluencies are common in spontaneous speech. The primary purpose of this study was to distinguish linguistic disfluencies from stuttering disfluencies in bilingual Tamil–English (TE) speaking children. The secondary purpose was to determine whether their disfluencies are mediated by native language dominance and/or by an early onset of developmental stuttering in childhood. A detailed study was carried out to identify the prosodic and acoustic features that uniquely represent the disfluent regions of speech. This paper focuses on the statistical modeling of repetitions, prolongations, pauses and interjections in a speech corpus encompassing bilingual spontaneous utterances from school-going children in English and Tamil. Two classifiers, Hidden Markov Models (HMM) and the Multilayer Perceptron (MLP), a class of feed-forward artificial neural network, were compared in the classification of disfluencies. The results of the classifiers document the patterns of disfluency in spontaneous speech samples of school-aged children, distinguishing between Children Who Stutter (CWS) and Children with Language Impairment (CLI). The ability of the models to classify the disfluencies was measured in terms of F-measure, recall, and precision.
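
As an illustration of the MLP side of the comparison, the sketch below trains scikit-learn's MLPClassifier on synthetic stand-in feature vectors and reports precision, recall and F-measure; the features and class structure are invented for demonstration, not taken from the bilingual speech corpus:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_recall_fscore_support

# Synthetic stand-ins for per-region prosodic/acoustic feature vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 12)),      # e.g., CWS-like patterns
               rng.normal(0.8, 1.0, (100, 12))])     # e.g., CLI-like patterns
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)

prec, rec, f1, _ = precision_recall_fscore_support(y_te, clf.predict(X_te),
                                                   average="binary")
print(f"precision={prec:.2f} recall={rec:.2f} F-measure={f1:.2f}")
```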

Keywords: Bilingual, children who stutter, children with language impairment, Hidden Markov Models, multi-layer perceptron, linguistic disfluencies, stuttering disfluencies.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 983
68 Development of Software Complex for Digitalization of Enterprise Activities

Authors: G. T. Balakayeva, K. K. Nurlybayeva, M. B. Zhanuzakov

Abstract:

In this work, we developed software and designed a software architecture for the implementation of enterprise business processes. The proposed software has a multi-level architecture using a domain-specific tool. The developed architecture guarantees the availability, reliability and security of the system and the implementation of business processes, which are the basis for effective enterprise management. Automating business processes and the algorithmic stages of an enterprise, developing optimal algorithms for managing activities, controlling and monitoring, reducing risks and improving results help organizations achieve strategic goals quickly and efficiently. The software described in this article can connect to the corporate information system via two methods: a desktop client and a web client. The desktop client program, running on the company's work PCs, connects to the information system through the application server over a local network. Outside the organization, the user can interact with the information system via a web browser, which acts as a web client and connects to a web server. The developed software consists of several integrated modules that share resources and interact with each other through an API. The following technology stack was used during development: Node.js, React, MongoDB, Nginx, cloud technologies, and Python.

Keywords: Algorithms, document processing, automation, integrated modules, software architecture, software design, information system.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 119
67 Effects of Roughness on Forward Facing Step in an Open Channel

Authors: S. M. Rifat, André L. Marchildon, Mark F. Tachie

Abstract:

Experiments were performed to investigate the effects of roughness on the reattachment and redevelopment regions over a 12 mm forward facing step (FFS) in an open channel flow. The experiments were performed over an upstream smooth wall and a smooth FFS, an upstream wall coated with 36-grit sandpaper and a smooth FFS, and an upstream rough wall produced from 36-grit sandpaper and an FFS coated with 36-grit sandpaper. To isolate the wall roughness effects, the Reynolds number, Froude number, aspect ratio and blockage ratio were kept constant. Upstream profiles showed reduced streamwise mean velocities close to the rough wall compared to the smooth wall, but the turbulence level was increased by upstream wall roughness. The reattachment length for the smooth-smooth wall experiment was 1.78h; however, when it was replaced with the rough-smooth wall configuration, the reattachment length decreased to 1.53h. It was observed that the upstream roughness increased the physical size of the contours of maximum turbulence level, whereas the downstream roughness decreased both the size and magnitude of the contours in the vicinity of the leading edge of the step. Quadrant analysis was performed to investigate the dominant Reynolds shear stress contributions in the recirculation region. The Reynolds shear stress and turbulent kinetic energy profiles after the reattachment showed slower recovery compared to the streamwise mean velocity; however, all the profiles fairly collapse onto their corresponding upstream profiles at x/h = 60. It was concluded that several more streamwise distances would be required to obtain a complete collapse.

Keywords: Forward facing step, open channel, separated and reattached turbulent flows, wall roughness.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1615
66 Assessment of Soil Contamination on the Content of Macro and Microelements and the Quality of Grass Pea Seeds (Lathyrus sativus L.)

Authors: Violina R. Angelova

Abstract:

Comparative research was conducted to determine the content of macro and microelements in the vegetative and reproductive organs of grass pea and the quality of grass pea seeds, as well as to identify the possibility of growing grass pea on soils contaminated by heavy metals. The experiment was conducted on an agricultural field subjected to contamination from the Non-Ferrous-Metal Works (MFMW) near Plovdiv, Bulgaria. The experimental plots were situated at distances of 0.5 km and 8 km, respectively, from the source of pollution. On reaching commercial ripeness, the grass pea plants were gathered. The composition of the macro and microelements in the plant materials (roots, stems, leaves, seeds), and the dry matter, sugar, protein, fat and ash contents of the grass pea seeds were determined. Translocation factors (TF) and bioaccumulation factors (BCF) were also determined. The quantitative measurements were carried out through inductively coupled plasma (ICP). The grass pea plant can successfully be grown on soils contaminated by heavy metals. Soil pollution with heavy metals does not affect the quality of the grass pea seeds. The seeds of the grass pea contain significant amounts of nutrients (K, P, Cu, Fe, Mn, Zn) and protein (23.18-29.54%). The distribution of heavy metals in the organs of the grass pea has a selective character, decreasing in the order leaves > roots > stems > seeds. BCF and TF values were greater than one, suggesting efficient accumulation in the above-ground parts of the grass pea plant. Grass pea is a plant that is tolerant to heavy metals and can be classified as an accumulator plant. The results provide valuable information about the chemical and nutritional composition of the seeds of grass pea grown on contaminated soils in Bulgaria. The high content of macro and microelements and the low concentrations of toxic elements in the grass pea grown on contaminated soil make it possible to use the seeds of the grass pea as animal feed.
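
The two accumulation indices are straightforward ratios; a small sketch with placeholder concentrations (not the measured values) is given below, using the usual definitions BCF = C_plant / C_soil and TF = C_aboveground / C_root:

```python
# Illustrative calculation of the bioaccumulation factor (BCF) and the
# translocation factor (TF); concentrations are placeholders, not the paper's data.

def bioaccumulation_factor(c_plant, c_soil):
    return c_plant / c_soil

def translocation_factor(c_aboveground, c_root):
    return c_aboveground / c_root

c_soil, c_root, c_leaves = 45.0, 60.0, 75.0   # e.g., Zn in mg/kg (illustrative)
bcf = bioaccumulation_factor(c_leaves, c_soil)
tf = translocation_factor(c_leaves, c_root)
print(f"BCF={bcf:.2f}, TF={tf:.2f}  (values > 1 indicate efficient accumulation)")
```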

Keywords: Grass pea, heavy metals, micro and macroelements, polluted soils, quality.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 600
65 TOSOM: A Topic-Oriented Self-Organizing Map for Text Organization

Authors: Hsin-Chang Yang, Chung-Hong Lee, Kuo-Lung Ke

Abstract:

The self-organizing map (SOM) is a well-known neural network model with a wide range of applications. The main characteristics of the SOM are two-fold, namely dimension reduction and topology preservation: using a SOM, a high-dimensional data space is mapped to a low-dimensional space while the topological relations among the data are preserved. With such characteristics, the SOM has usually been applied to data clustering and visualization tasks. However, the SOM has the main disadvantage that the number and structure of the neurons must be known prior to training, which is difficult to determine. Several schemes have been proposed to tackle this deficiency, for example the growing/expandable SOM, the hierarchical SOM, and the growing hierarchical SOM. These schemes can dynamically expand the map, and even generate hierarchical maps, during training, and encouraging results have been reported. Basically, these schemes adapt the size and structure of the map according to the distribution of the training data; that is, they are data-driven or data-oriented SOM schemes. In this work, a topic-oriented SOM scheme suitable for document clustering and organization is developed. The proposed SOM automatically adapts the number as well as the structure of the map according to the identified topics. Unlike other data-oriented SOMs, our approach expands the map and generates the hierarchies according to both the topics and the characteristics of the neurons. Preliminary experiments give promising results and demonstrate the plausibility of the method.

Keywords: Self-organizing map, topic identification, learning algorithm, text clustering.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1986
64 Timescape-Based Panoramic View for Historic Landmarks

Authors: H. Ali, A. Whitehead

Abstract:

Providing a panoramic view of famous landmarks around the world offers artistic and historic value for historians, tourists, and researchers. Exploring the history of famous landmarks by presenting a comprehensive view of a temporal panorama merged with geographical and historical information presents the unique challenge of dealing with images that span a long period, from the 1800s up to the present. This work presents the concept of a temporal panorama through a timeline display of aligned historic and modern images for many famous landmarks. Building such a panorama requires a collection of hundreds of thousands of landmark images from the Internet, comprising historic images and modern images of the digital age. These images have to be classified for subset selection in order to keep the more suitable images that chronologically document a landmark's history. Processing historic images, captured using older analog technology under various capturing conditions, represents a big challenge when they have to be used together with modern digital images. Successful processing of historic images to prepare them for the subsequent steps of temporal panorama creation represents an active contribution to cultural heritage preservation, fulfilling one of UNESCO's goals of preserving and displaying famous landmarks worldwide.

Keywords: Cultural heritage, image registration, image subset selection, registered image similarity, temporal panorama, timescapes.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 997
63 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute a Likelihood Ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, for which a set of assumptions and methods for a given data set is adopted. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented, so as not to obtain an incorrect estimate that could be used to deliver a wrong judgment in a court of law. The estimation of the LR is fundamentally a Bayesian concept, and we used two LR estimators, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Though the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
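
A compact sketch of the two LR estimators on synthetic one-dimensional scores, with logistic regression converting posterior odds to an LR and a kernel density ratio giving the KDE-based LR; the data, and the omitted bootstrap step for confidence intervals, are placeholders:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
same = rng.normal(2.0, 1.0, 300)       # scores under the prosecution hypothesis
diff = rng.normal(0.0, 1.0, 300)       # scores under the defense hypothesis

# 1) Logistic-regression LR: posterior odds divided by prior odds.
X = np.concatenate([same, diff]).reshape(-1, 1)
y = np.array([1] * len(same) + [0] * len(diff))
lor = LogisticRegression().fit(X, y)

def lr_logistic(score):
    p = lor.predict_proba([[score]])[0, 1]
    prior_odds = len(same) / len(diff)
    return (p / (1 - p)) / prior_odds

# 2) KDE LR: ratio of the two estimated densities at the observed score.
kde_same, kde_diff = gaussian_kde(same), gaussian_kde(diff)

def lr_kde(score):
    return (kde_same(score) / kde_diff(score))[0]

for s in (0.5, 1.5, 2.5):
    print(f"score={s}: LoR LR={lr_logistic(s):.2f}, KDE LR={lr_kde(s):.2f}")
```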

Keywords: Logistic Regression (LoR), Kernel Density Estimator (KDE), handwriting, confidence interval, repeatability, reproducibility.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 417
62 Heavy Metal Contamination of a Dumpsite Environment as Assessed with Pollution Indices

Authors: Olubunmi S. Shittu, Olufemi J. Ayodele, Augustus O. A. Ilori, Abidemi O. Filani, Adetola T. Afuye

Abstract:

Indiscriminate refuse dumping in and around Ado-Ekiti, combined with improper management of the few available dumpsites, such as the Ilokun dumpsite, poses a threat of heavy metal pollution to the surrounding soils and underground water that needs to be assessed using pollution indices. Surface soils (0-15 cm) were taken from the centre of the Ilokun dumpsite (0 m) and its environs at different directions and distances during the dry and wet seasons, as well as a background sample 1000 m away, adjacent to the dumpsite at Ilokun, Ado-Ekiti, Nigeria. The concentrations of heavy metals used to calculate the pollution indices for the soils were determined using an Atomic Absorption Spectrophotometer. The soils recorded high concentrations of all the heavy metals, above the background concentrations irrespective of the season, with the highest concentrations at 0 m, except for Ni and Fe at 50 m during the dry and wet seasons, respectively. The heavy metal concentrations were in the order Ni > Mn > Pb > Cr > Cu > Cd > Fe during the dry season, and Fe > Cr > Cu > Pb > Ni > Cd > Mn during the wet season. Using the Contamination Factor (CF), the soils were classified as moderately contaminated with Cd and Fe and very highly contaminated with the other metals during the dry season, and as having low Cd contamination (0.87), moderate contamination with Fe, Pb, Mn and Ni and very high contamination with Cr and Cu during the wet season. In both seasons, the Pollution Load Index (PLI) indicates that the soils are generally polluted with heavy metals, and the calculated Geoaccumulation Index (Igeo) shows the soils to be unpolluted to moderately polluted. The Enrichment Factor (EF) implied the soils to be deficiently enriched with all the heavy metals except Cr (7.90) and Cu (6.42), which were at significant enrichment levels during the wet season. The Modified Degree of Contamination (mCd) indicated the soils to be of very high to extremely high degree of contamination during the dry season and of moderate degree of contamination during the wet season, except at 0 m, which showed a high degree of contamination. The concentrations of heavy metals in the soils, combined with the pollution indices, indicate that the soils in and around the Ilokun dumpsite are being polluted with heavy metals from anthropogenic sources constituted by indiscriminate refuse dumping.
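
The indices named above follow standard formulas; the sketch below computes CF, PLI and Igeo for illustrative concentrations (not the Ilokun measurements):

```python
import numpy as np

# Standard formulas: CF = C_sample / C_background, PLI = (CF1 * ... * CFn)^(1/n),
# Igeo = log2(C_sample / (1.5 * C_background)). Concentrations are placeholders.

def contamination_factor(c_sample, c_background):
    return c_sample / c_background

def pollution_load_index(cfs):
    return float(np.prod(cfs) ** (1.0 / len(cfs)))

def geoaccumulation_index(c_sample, c_background):
    return float(np.log2(c_sample / (1.5 * c_background)))

samples = {"Pb": 85.0, "Cr": 120.0, "Cu": 95.0}        # mg/kg, illustrative
background = {"Pb": 20.0, "Cr": 35.0, "Cu": 25.0}

cfs = [contamination_factor(samples[m], background[m]) for m in samples]
print("CF:", {m: round(cf, 2) for m, cf in zip(samples, cfs)})
print("PLI:", round(pollution_load_index(cfs), 2))
print("Igeo(Pb):", round(geoaccumulation_index(samples["Pb"], background["Pb"]), 2))
```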

Keywords: Contamination factor, enrichment factor, geoaccumulation index, modified degree of contamination, pollution load index.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1432
61 Semantic Mobility Channel (SMC): Ubiquitous and Mobile Computing Meets the Semantic Web

Authors: José M. Cantera, Miguel Jiménez, Genoveva López, Javier Soriano

Abstract:

With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation is exploited for either an individual element or a set of consecutive elements in a Web document and results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced components platform, called SMC, enabling the development of mobility applications and services according to a channel model based on the principles of Service-Oriented Architecture (SOA). It then goes on to describe the potential for integration with the Semantic Web through a novel framework of external semantic annotation that prescribes a scheme for representing semantic markup files and a way of associating Web documents with these external annotations. The role of semantic annotation in this framework is to describe the contents of individual documents themselves, assuring the preservation of the semantics during the process of adapting content rendering. Semantic Web content adaptation is a way of adding value to Web contents and facilitates the repurposing of Web contents (enhanced browsing, Web Services location and access, etc.).

Keywords: Semantic Web, ubiquitous and mobile computing, Web content transcoding, semantic mark-up, mobile computing, middleware and services.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1764
60 The Design and Applied of Learning Management System via Social Media on Internet: Case Study of Operating System for Business Subject

Authors: Pimploi Tirastittam, Sawanath Treesathon, Amornrath Ongkawat

Abstract:

A Learning Management System (LMS) is a system used to manage learning by grouping the content and learning activities between lecturer and learner, including online examination and evaluation. Nowadays, in the era of borderless learning, learning activities can be accessed from anywhere in the world and at any time via information technology and media. The learner can easily access knowledge, so differences in time and distance are no longer a constraint for learning. The learning pattern used in this research is the integration of in-class learning and online learning via the Internet, with progress monitored by the Learning Management System, which creates a fast-response and accessible learning process via social media. In order to increase the capability and freedom of the learner, the system can show current and past learning documents and video conferences, and also has a chat room for the learner and lecturer to interact with each other. The objectives of "The Design and Applied of Learning Management System via Social Media on Internet: Case Study of Operating System for Business Subject" are to expand the opportunity for learning, to increase the efficiency of learning, and to increase the communication channel between lecturer and student. The data for this research were collected from 30 users of the system, all students enrolled in the subject. The result of the research is rated "Very Good", which conforms to the hypothesis.

Keywords: Learning Management System, Social Media.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1840
59 Analysis Model for the Relationship of Users, Products, and Stores on Online Marketplace Based on Distributed Representation

Authors: Ke He, Wumaier Parezhati, Haruka Yamashita

Abstract:

Recently, online marketplaces in the e-commerce industry, such as Rakuten and Alibaba, have become some of the most popular shopping sites in Asia. In these shopping websites, consumers can select and purchase products from a large number of stores. Additionally, consumers of an e-commerce site have to register their name, age, gender, and other information in advance to access their registered account. Therefore, a method for analyzing consumer preferences from both the store side and the product side is required. This study uses the Doc2Vec method, which has been studied in the field of natural language processing. Doc2Vec has been used in many cases to analyze and extract semantic relationships between documents (represented here as consumers) and words (represented here as products) in the field of document classification. This concept is applicable to representing the relationship between users and items; however, the problem is that one more factor, namely shops, needs to be considered in Doc2Vec. More precisely, a method for analyzing the relationship between consumers, stores, and products is required. The purpose of our study is to combine the Doc2Vec analysis of users and shops with that of users and items in the same feature space. This method enables the calculation of similar shops and items for each user. In this study, we analyze real data accumulated in an online marketplace and demonstrate the efficiency of the proposal.
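
A minimal sketch of the embedding idea, assuming the gensim (>= 4) Doc2Vec API: each purchase history is treated as a tagged document whose words are product IDs and whose tags are the user and the shop, so all three entity types share one vector space. The purchase logs below are invented:

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Toy purchase logs: (user, shop, purchased product IDs).
purchases = [
    ("user_1", "shop_A", ["tea", "cups", "kettle"]),
    ("user_2", "shop_A", ["tea", "cookies"]),
    ("user_3", "shop_B", ["sneakers", "socks"]),
    ("user_1", "shop_B", ["socks", "kettle"]),
]
docs = [TaggedDocument(words=items, tags=[user, shop])
        for user, shop, items in purchases]

model = Doc2Vec(docs, vector_size=16, min_count=1, epochs=200, seed=1)

# Users/shops close to user_1, and products close to user_1's vector.
print(model.dv.most_similar("user_1", topn=2))
print(model.wv.similar_by_vector(model.dv["user_1"], topn=2))
```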

Keywords: Doc2Vec, marketing, online marketplace, recommendation system.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 419
58 A Bibliometric Assessment on Sustainability and Clustering

Authors: Fernanda M. Assef, Maria Teresinha A. Steiner, David Gabriel F. de Barros

Abstract:

Review studies are useful for analyzing research problems. Among the types of review documents, we commonly find bibliometric studies. This type of study often helps with the global visualization of a research problem and helps academics worldwide to better understand the context of a research area. In this document, a bibliometric view of clustering techniques and sustainability problems is presented. The authors aimed to identify which issues most often use clustering techniques and which sustainability issue is currently the most impactful in research. During the bibliometric analysis, we found 10 different groups of research in clustering applications for sustainability issues: Energy; Environmental; Non-urban Planning; Sustainable Development; Sustainable Supply Chain; Transport; Urban Planning; Water; Waste Disposal; and Others. Moreover, by analyzing the citations of each group, it was discovered that the Environmental group could be classified as the most impactful research cluster in the area mentioned. After the content analysis of each paper classified in the Environmental group, it was found that the k-means technique is preferred for solving sustainability problems with clustering methods, since it appeared most often among the documents. The authors finally conclude that a bibliometric assessment can help indicate a gap in research on waste disposal, which was the group with the least amount of publications, and the most impactful research on environmental problems.
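
As an illustration of the clustering step such a bibliometric grouping could rely on, the sketch below clusters a few invented abstract snippets with k-means over TF-IDF vectors (scikit-learn assumed):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Invented abstract snippets standing in for a bibliometric corpus.
abstracts = [
    "solar energy efficiency in buildings",
    "wind energy production forecasting",
    "municipal waste disposal site selection",
    "landfill waste management clustering",
    "urban water distribution network planning",
    "water quality monitoring in rivers",
]
X = TfidfVectorizer().fit_transform(abstracts)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for text, label in zip(abstracts, labels):
    print(label, text)
```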

Keywords: Bibliometric assessment, clustering, sustainability, territorial partitioning.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 322
57 Geostatistical Analysis and Mapping of Ground-Level Ozone in a Medium-Sized Urban Area

Authors: F. J. Moral García, P. Valiente González, F. López Rodríguez

Abstract:

Ground-level tropospheric ozone is one of the air pollutants of most concern. It is mainly produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower parts of the atmosphere. Ozone levels become particularly high in regions close to high ozone precursor emissions and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. In this work, some results of a study of urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain), are shown. Fourteen sampling campaigns, at least one per month, were carried out to measure ambient air ozone concentrations, during periods selected according to conditions favourable to ozone production, using an automatic portable analyzer. To evaluate the ozone distribution across the city, the measured ozone data were then analyzed using geostatistical techniques. First, the exploratory analysis revealed that the data were normally distributed, which is a desirable property for the subsequent stages of the geostatistical study. Secondly, during the structural analysis of the data, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) revealed that the maximum distance of spatial dependence is between 302 and 790 m and that the variable, air ozone concentration, is not evenly distributed over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area by means of geostatistical algorithms (kriging). High prediction accuracy was obtained in all cases, as cross-validation showed. Useful information for hazard assessment was also provided by probability maps based on kriging interpolation and the kriging standard deviation.
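
For reference, the spherical variogram model mentioned above can be written as gamma(h) = c0 + c*(1.5*h/a - 0.5*(h/a)^3) for h <= a and c0 + c otherwise; the sketch below evaluates it with placeholder nugget, sill and range values:

```python
import numpy as np

def spherical_variogram(h, nugget, sill, range_a):
    """Spherical model: nugget c0, partial sill c, and range a."""
    h = np.asarray(h, dtype=float)
    gamma = nugget + sill * (1.5 * h / range_a - 0.5 * (h / range_a) ** 3)
    return np.where(h <= range_a, gamma, nugget + sill)

# Placeholder parameters, with a range inside the reported 302-790 m interval.
lags = np.array([0.0, 100.0, 300.0, 500.0, 800.0])   # metres
print(spherical_variogram(lags, nugget=5.0, sill=40.0, range_a=500.0))
```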

Keywords: Kriging, map, tropospheric ozone, variogram.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1817
56 Schools of Thought in the Field of Social Entrepreneurship

Authors: Cris Bravo

Abstract:

Social entrepreneurship is a new and exciting topic that holds great promise for helping to alleviate the social problems of the world. As a new subject, the meaning of the term is very broad, and this is counterproductive when trying to build understanding around the concept. The purpose of this study is to identify and compare the elements of social entrepreneurship as defined by seven international organizations: four leading social entrepreneurship projects (the Ashoka Foundation, the Skoll Foundation, the Schwab Foundation and the Yunus Center) and three other institutions fostering social entrepreneurship (the Global Social Benefit Institute, BRAC University, and Socialab). The study used document analysis from the Skoll Foundation, the Schwab Foundation, the Yunus Center and the Ashoka Foundation, and open-ended interviews with experts from the Global Social Benefit Institute at Santa Clara University in the United States, BRAC University in Bangladesh, and Socialab in Argentina. The study identified three clearly differentiated schools of thought, based on their views on revenue, scalability, replicability and geographic location. While this study is by no means exhaustive, it provides an indication of the patterns of ideas fostered by important players in the field. By clearly identifying the similarities and differences in the concept of social entrepreneurship, researchers and practitioners are better equipped to build on the subject and to promote more adequate and accurate social policies to foster the development of social entrepreneurship.

Keywords: Replicability, revenue, scalability, schools of thought, social entrepreneurship.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 4256
55 An Investigation on the Sandwich Panels with Flexible and Toughened Adhesives under Flexural Loading

Authors: Emre Kara, Şura Karakuzu, Ahmet F. Geylan, Metehan Demir, Kadir Koç, Halil Aykul

Abstract:

Material selection in the design of sandwich structures is a crucial aspect because of the positive or negative influence of the base materials on the mechanical properties of the entire panel. It has been shown in the literature that the selection of the skin and core materials plays a very important role in the behavior of the sandwich. Besides this, the use of the correct adhesive can make the whole structure show better mechanical behavior and results. In the present work, static three-point bending tests were performed on sandwiches having an aluminum alloy foam core, skins made of three different types of fabrics, and two different commercial adhesives (flexible polyurethane and toughened epoxy based) at different support span distances, with the aim of analyzing their flexural performance in terms of absorbed energy, peak force values and collapse mechanisms. The main results of the flexural loading are the force-displacement curves obtained after the bending tests, the peak force and absorbed energy values, the collapse mechanisms and the adhesion quality. The experimental results showed that the sandwiches with the epoxy-based toughened adhesive and skins made of S-Glass Woven fabrics exhibited the best adhesion quality and mechanical properties. The sandwiches with the toughened adhesive exhibited higher peak force and energy absorption values compared to the sandwiches with the flexible adhesive. The use of these sandwich structures can lead to a weight reduction of transport vehicles while providing adequate structural strength under operating conditions.

Keywords: Adhesive and adhesion, Aluminum foam, Bending, Collapse mechanisms.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2152
54 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data

Authors: P. Kaladevi, N. Giridharan

Abstract:

The main purpose of the system for analyzing and eliciting public grievances is to receive and process all sorts of complaints from the public and respond to users. Due to the large number of complaints, the data become big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of caching was applied in the system to provide immediate response and timely action using big data analytics; the cache-enabled approach improves the response time of the system. The unstructured data provided by the users are efficiently handled through the MapReduce algorithm. The processing of complaints takes place in the order of the hierarchy of authority. The drawbacks of the traditional database system used in the existing setup are addressed by our system through a cache-enabled Hadoop Distributed File System. MapReduce framework code can leak sensitive data through the computation process; we therefore propose a system that adds noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If a complaint is not processed within the allotted time, it is automatically forwarded to a higher authority, which ensures that processing is assured. A copy of the filed complaint is sent as a digitally signed PDF document to the user's email address, which serves as proof. The system report serves as essential data when making important decisions based on legislation.
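
A toy, single-process sketch of the described pipeline, with a map step, a noise-adding reduce step and a small query cache; it stands in for the actual Hadoop/HDFS deployment and the complaint records are invented:

```python
import random
from collections import defaultdict
from functools import lru_cache

# Invented complaint records; the first word stands in for the responsible department.
complaints = [
    "water supply broken pipe", "road pothole main street",
    "water billing error", "road streetlight outage", "water supply low pressure",
]

def map_phase(records):
    """Emit (department, 1) pairs, mimicking the map step."""
    for text in records:
        yield text.split()[0], 1

def reduce_phase(pairs, noise_scale=1.0):
    """Aggregate counts per department and perturb them so exact counts of
    sensitive categories are not signalled in the output."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return {k: v + random.uniform(-noise_scale, noise_scale)
            for k, v in counts.items()}

@lru_cache(maxsize=128)
def cached_report():
    """Repeated queries for the same report hit the cache instead of recomputing."""
    return tuple(sorted(reduce_phase(map_phase(complaints)).items()))

print(cached_report())
```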

Keywords: Big Data, Hadoop, HDFS, Caching, MapReduce, web personalization, e-governance.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1553