Search results for: fully spatial signal processing
6353 Sepiolite as a Processing Aid in Fibre Reinforced Cement Produced in Hatschek Machine
Authors: R. Pérez Castells, J. M. Carbajo
Abstract:
Sepiolite has been used as a processing aid in the manufacture of fibre cement since the replacement of asbestos began in the 1980s. Sepiolite increases the inter-laminar bond between cement layers and improves the homogeneity of the slurries. A new sepiolite-based processed product, Wollatrop TF/C, has been evaluated as a retention agent for fine particles in the production of fibre cement on a Hatschek machine. The effect of Wollatrop TF/C on filtering and fine particle losses was studied, as well as its interaction with anionic polyacrylamide (APAM) and microsilica. The experiments followed a factorial design, and the VDT equipment used for measuring retention and drainage was a modified Rapid Köthen laboratory sheet former. Wollatrop TF/C increased fine particle retention, improving the economy of the process and reducing the accumulation of solids in recycled process water. At the same time, drainage time increased sharply at high concentrations; however, drainage time can be improved by adjusting the APAM concentration. Wollatrop TF/C and microsilica show very little interaction with each other. Microsilica does not control fine particle losses, whereas Wollatrop TF/C does so efficiently. Further research on APAM type (molecular weight and anionic character) is advisable to improve drainage.
Keywords: drainage, fibre-reinforced cement, fine particle losses, flocculation, microsilica, sepiolite
Procedia PDF Downloads 326
6352 The Impact of Human Intervention on Net Primary Productivity for the South-Central Zone of Chile
Authors: Yannay Casas-Ledon, Cinthya A. Andrade, Camila E. Salazar, Mauricio Aguayo
Abstract:
The sustainable management of available natural resources is a crucial question for policy-makers, economists, and the research community. Land is one of the most critical of these resources, and it is being intensively appropriated by human activities, producing ecological stress and reducing ecosystem services. In this context, net primary production (NPP) has been considered a feasible proxy indicator for estimating the impact of human intervention on land-use intensity. Accordingly, the human appropriation of NPP (HANPP) was calculated for the south-central regions of Chile between 2007 and 2014. HANPP was defined as the difference between the potential NPP of the natural vegetation (NPP0, i.e., the vegetation that would exist without any human interference) and the NPP remaining in the field after harvest (NPPeco), expressed in gC/m² yr. Other NPP flows taken into account in the HANPP estimation were the harvested NPP (NPPh) and the losses of NPP through land conversion (NPPluc). ArcGIS 10.4 software was used to assess the spatial and temporal HANPP changes. HANPP as a percentage of NPP0 was estimated for each landcover type, taking 2007 and 2014 as the reference years. The spatial results depicted a negative impact on land-use efficiency between 2007 and 2014, showing negative HANPP changes for the whole region. The harvest and the biomass losses through land conversion are the leading causes of the loss of land-use efficiency. Furthermore, the study found higher HANPP in 2014 than in 2007, representing 50% of NPP0 across all landcover classes relative to 2007. This performance was mainly related to the higher volume of biomass harvested for agriculture. Consequently, cropland showed the highest HANPP, followed by plantation. This highlights the strong positive correlation between HANPP and the economic activities developed in the region.
This finding constitutes the basis for a better understanding of the main driving forces influencing biomass productivity and a powerful metric for supporting the sustainable management of land use.
Keywords: human appropriation, land-use changes, land-use impact, net primary productivity
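The HANPP bookkeeping described above reduces to per-pixel raster arithmetic. A minimal sketch, with illustrative values rather than the study's Chilean rasters:

```python
import numpy as np

# Hypothetical per-pixel NPP rasters in gC/m^2 yr (values are illustrative only)
npp0 = np.array([[900.0, 850.0],
                 [780.0, 820.0]])    # potential NPP of the natural vegetation
nppeco = np.array([[400.0, 600.0],
                   [700.0, 300.0]])  # NPP remaining in the field after harvest

# HANPP as defined in the abstract: potential minus remaining NPP
hanpp = npp0 - nppeco

# HANPP expressed as a percentage of NPP0, as used to compare landcover classes
hanpp_pct = 100.0 * hanpp / npp0

print(hanpp)      # gC/m^2 yr appropriated per pixel
print(hanpp_pct)  # share of potential productivity appropriated
```

In the study this arithmetic runs over full Landsat-scale rasters in a GIS rather than toy arrays, but the definition is the same.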
Procedia PDF Downloads 136
6351 Psychometric Examination of Atma Jaya's Multiple Intelligence Batteries for University Students
Authors: Angela Oktavia Suryani, Bernadeth Gloria, Edwin Sutamto, Jessica Kristianty, Ni Made Rai Sapitri, Patricia Catherine Agla, Sitti Arlinda Rochiadi
Abstract:
It was found that some blogs and personal websites in Indonesia sell standardized intelligence tests (for example, the Progressive Matrices (PM), the Intelligence Structure Test (IST), and the Culture Fair Intelligence Test (CFIT)) and other psychological tests, together with the manuals and answer keys, to the public. Individuals can buy these and prepare themselves for selection or recruitment with the real test. This practice drives people to lie to the institution (school or company) and also to themselves. It was also found that these tests are old; some items are no longer relevant to the current context, for example, a question about the diameter of a certain coin that no longer exists. These problems motivated us to develop a new intelligence battery, namely the Multiple Aptitude Battery (MAB). The battery was built using Thurstone's Primary Mental Abilities theory and is intended for high school students, university students, and job applicants. The battery consists of nine subtests. In the current study, we examine six subtests for university students, namely Reading Comprehension, Verbal Analogies, Numerical Inductive Reasoning, Numerical Deductive Reasoning, Mechanical Ability, and Two-Dimensional Spatial Reasoning. The study included data from 1,424 students recruited by convenience sampling from eight faculties at Atma Jaya Catholic University of Indonesia. Classical and modern test approaches (Item Response Theory) were carried out to identify item difficulties, and confirmatory factor analysis was applied to examine internal validity. The validity of each subtest was inspected using the convergent-discriminant method, whereas reliability was examined using the Kuder-Richardson formula. The results showed that the majority of the subtests were of medium difficulty; only one subtest, Verbal Analogies, was categorized as easy.
The items were found to be homogeneous and valid in measuring their constructs; however, at the subtest level, the construct validity examined by the convergent-discriminant method indicated that the subtests were not unidimensional, meaning they measured not only their own constructs but also other constructs. Three of the subtests were able to predict academic performance with a small effect size, namely Reading Comprehension, Numerical Inductive Reasoning, and Two-Dimensional Spatial Reasoning. GPAs at the intermediate level (third semester and above) were considered a factor in predictive invalidity. The Kuder-Richardson formula showed that the reliability coefficients for both numerical reasoning subtests and spatial reasoning were superior, in the range of 0.84-0.87, whereas the reliability coefficients for the other three subtests were relatively below the standard for ability tests, in the range of 0.65-0.71. It can be concluded that some of the subtests are ready to be used, whereas others still need revision. This study also demonstrated that the convergent-discriminant method is useful for identifying general intelligence.
Keywords: intelligence, psychometric examination, multiple aptitude battery, university students
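The Kuder-Richardson reliability referenced above (KR-20, for dichotomous items) can be sketched as follows; the 4-examinee, 4-item response matrix is invented for illustration, not taken from the study:

```python
def kr20(responses):
    """KR-20 reliability for dichotomous (0/1) item responses.

    responses: one row per examinee, one 0/1 column per item.
    """
    n = len(responses)
    k = len(responses[0])
    # sum of p*q over items, where p is the proportion answering correctly
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in responses) / n
        pq += p * (1 - p)
    totals = [sum(row) for row in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n  # population variance of totals
    return (k / (k - 1)) * (1 - pq / var)

scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
]
print(round(kr20(scores), 3))  # 0.667 for this toy matrix
```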
Procedia PDF Downloads 436
6350 Water Temperature on Early Age Concrete Property
Authors: Tesfaye Sisay Dessalegn
Abstract:
The long-term performance of concrete structures is affected by the properties and behavior of concrete at an early age. However, the fundamental mechanisms governing early-age behavior have not yet been fully studied. The effect of water temperature on concrete is insufficiently studied, and the majority of existing studies focus on the effect of mixing water temperature on the workability and mechanical properties of concrete. To the best of the authors' knowledge, the effect of mixing water temperature on plastic shrinkage cracking of concrete has not yet been studied.
Keywords: water temperature, early age concrete strength, mechanical properties of concrete, strength
Procedia PDF Downloads 57
6349 CdS Quantum Dots as Fluorescent Probes for Detection of Naphthalene
Authors: Zhengyu Yan, Yan Yu, Jianqiu Chen
Abstract:
A novel sensing system has been designed for naphthalene detection based on the quenched fluorescence signal of CdS quantum dots. The fluorescence intensity of the system decreased significantly after the CdS quantum dots were added to the water pollution model, owing to a static fluorescence quenching mechanism. We have demonstrated that this facile methodology offers convenient, low-cost analysis, with recovery rates of 97.43%-103.2%, and has promising application prospects.
Keywords: CdS quantum dots, modification, detection, naphthalene
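Static quenching data of this kind are commonly analyzed with a Stern-Volmer plot, F0/F = 1 + Ks[Q]. The abstract does not give its calibration data, so the intensities and concentrations below are illustrative assumptions only:

```python
# Stern-Volmer analysis of static quenching: F0/F = 1 + Ks*[Q].
f0 = 1000.0                      # QD fluorescence without naphthalene (assumed)
conc = [0.0, 1e-6, 2e-6, 4e-6]   # naphthalene concentration, mol/L (assumed)
f = [1000.0, 800.0, 666.7, 500.0]  # quenched intensities (assumed)

# Least-squares slope of (F0/F - 1) versus [Q] through the origin gives Ks
num = sum(c * (f0 / fi - 1.0) for c, fi in zip(conc, f))
den = sum(c * c for c in conc)
ks = num / den
print(f"Stern-Volmer constant Ks = {ks:.3g} L/mol")
```

A linear Stern-Volmer plot with an unchanged fluorescence lifetime is the usual evidence for the static (ground-state complex) mechanism the abstract invokes.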
Procedia PDF Downloads 493
6348 Nature of Forest Fragmentation Owing to Human Population along Elevation Gradient in Different Countries in Hindu Kush Himalaya Mountains
Authors: Pulakesh Das, Mukunda Dev Behera, Manchiraju Sri Ramachandra Murthy
Abstract:
Large numbers of people living in and around the Hindu Kush Himalaya (HKH) region depend on this diverse mountainous region for ecosystem services. Following the global trend, the region is also experiencing rapid population growth and rising demand for timber and agricultural land. The eight countries sharing the HKH region have different forest resource utilization and conservation policies that exert varying pressures on the forest ecosystem. This has created variable spatial and altitudinal gradients in the rate of deforestation and the corresponding forest patch fragmentation. The quantitative relationship between fragmentation and demography along the elevation gradient has not previously been established for the HKH. The current study was carried out to attribute the overall and country-specific patterns of landscape fragmentation along the altitudinal gradient to the demography of each country. We used tree canopy cover data derived from Landsat data to analyze the deforestation and afforestation rates and the corresponding landscape fragmentation observed during 2000-2010. The area-weighted mean radius of gyration (AMN radius of gyration) was computed owing to its advantage as a spatial indicator of fragmentation over non-spatial fragmentation indices. Using the subtraction method, the change in fragmentation during 2000-2010 was computed. Using tree canopy cover as a surrogate for forest cover, the highest forest loss was observed in Myanmar, followed by China, India, Bangladesh, Nepal, Pakistan, Bhutan, and Afghanistan. The sequence for fragmentation was different, however: the maximum fragmentation was observed in Myanmar, followed by India, China, Bangladesh, and Bhutan, whereas increases in fragmentation followed the sequence Nepal, Pakistan, and Afghanistan. Using the SRTM-derived DEM, we observed a higher rate of fragmentation up to 2400 m, which corroborated the high human population there in 2000 and 2010.
To derive the nature of fragmentation along the altitudinal gradient, Statistica software was used, applying a user-defined regression function with the Gauss-Newton estimation method and 50 iterations. We observed an overall logarithmic decrease in fragmentation change (area-weighted mean radius of gyration), forest cover loss, and population growth during 2000-2010 along the elevation gradient, with very high R² values (0.889, 0.895, and 0.944, respectively). The observed negative logarithmic function, with the major contribution in the initial elevation range, suggests gap-filling afforestation in the lower altitudes to enhance forest patch connectivity. Our findings on the pattern of forest fragmentation and human population across the elevation gradient in the HKH region will have policy-level implications for the different nations and will help in characterizing hotspots of change. The availability of free satellite-derived data products on forest cover and DEMs, gridded demographic data, and geospatial tools enabled a quick evaluation of forest fragmentation vis-a-vis the human impact pattern along the elevation gradient in the HKH.
Keywords: area-weighted mean radius of gyration, fragmentation, human impact, tree canopy cover
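The logarithmic decrease reported above can be reproduced in miniature. The paper used Statistica's Gauss-Newton routine; for a model of the form y = a + b·ln(x), ordinary least squares on ln(x) yields the same fit, so that is used here. The elevation bands and response values below are invented for illustration:

```python
import math

# Invented elevation band midpoints (m) and an invented response that
# decreases logarithmically, standing in for fragmentation change
elev = [200, 600, 1000, 1400, 1800, 2200, 2600, 3000]
y = [12.0, 8.7, 7.2, 6.1, 5.4, 4.8, 4.3, 3.9]

# OLS fit of y = a + b*ln(elevation)
lx = [math.log(e) for e in elev]
n = len(y)
mx, my = sum(lx) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(lx, y)) / \
    sum((xi - mx) ** 2 for xi in lx)
a = my - b * mx

# Coefficient of determination, as the R^2 values quoted in the abstract
pred = [a + b * xi for xi in lx]
ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
ss_tot = sum((yi - my) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot
print(f"y = {a:.2f} + {b:.2f} ln(x), R^2 = {r2:.3f}")
```

A negative b with high R², as here, is the "negative logarithmic function" pattern the study reports along the elevation gradient.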
Procedia PDF Downloads 215
6347 Assessment of Spectral Indices for Soil Salinity Estimation in Irrigated Land
Authors: R. Lhissou, A. El Harti, K. Chokmani, E. Bachaoui, A. El Ghmari
Abstract:
Soil salinity is a serious environmental hazard in many countries around the world, especially arid and semi-arid countries like Morocco. Salinization has negative effects on the land: it affects agricultural production, infrastructure, water resources, and biodiversity. Remote sensing can provide soil salinity information for large areas in a relatively short time; in addition, it is not limited by extreme terrain or hazardous conditions. By contrast, experimental methods for monitoring soil salinity by direct in situ measurement are very demanding of time and resources, and very limited in spatial coverage. In the irrigated perimeter of the Tadla plain in central Morocco, the increased use of saline groundwater and surface water, coupled with agricultural intensification, leads to the deterioration of soil quality, especially through salinization. In this study, we assessed several spectral indices of soil salinity cited in the literature using Landsat TM satellite images and field measurements of electrical conductivity (EC). Three Landsat TM images were acquired over three months in the dry season (September, October, and November 2011). Based on the EC measurements collected in three field campaigns simultaneous with the acquisition dates of the Landsat TM images, two assessment techniques were used to validate the soil salinity spectral indices. First, the spectral indices were validated locally, pixel by pixel; the second validation technique used a window of 3x3 pixels. The results indicated that the second technique provides a more accurate validation, and the per-pixel assessment showed its limits. In addition, the EC values measured in the field correlate well with several spectral indices derived from the Landsat TM data; the best results show r² values of 0.88, 0.79, and 0.65 for the Salinity Index (SI) on the three dates, respectively.
The results demonstrate the usefulness of spectral indices as an auxiliary variable in the spatial estimation and mapping of salinity in irrigated land.
Keywords: remote sensing, spectral indices, soil salinity, irrigated land
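As an illustration of the index-versus-EC validation, one salinity index often cited for Landsat TM is SI = sqrt(B1 × B3) (blue × red reflectance). Whether this matches the paper's exact SI formulation is an assumption, and all reflectance and EC values below are invented:

```python
import numpy as np

# Invented TM band reflectances for a 2x2 pixel window
blue = np.array([[0.10, 0.12], [0.18, 0.25]])  # TM band 1 (illustrative)
red = np.array([[0.14, 0.16], [0.22, 0.30]])   # TM band 3 (illustrative)
ec = np.array([0.9, 1.1, 1.9, 2.8])            # field EC, dS/m, one per pixel

# One common salinity index formulation for Landsat TM
si = np.sqrt(blue * red)

# Per-pixel validation: correlation between the index and measured EC
r = np.corrcoef(si.ravel(), ec)[0, 1]
print(f"SI range {si.min():.3f}-{si.max():.3f}, r^2 = {r**2:.2f}")
```

The study's window-based variant would average the index over 3x3 pixels around each sampling point before correlating with EC.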
Procedia PDF Downloads 391
6346 Building User Behavioral Models by Processing Web Logs and Clustering Mechanisms
Authors: Madhuka G. P. D. Udantha, Gihan V. Dias, Surangika Ranathunga
Abstract:
Today's websites contain very sophisticated applications, but there are only a few methodologies for analyzing user navigation through a website and determining whether the website is being put to correct use. Web logs are typically consulted only when a major attack or malfunction occurs, yet they contain a wealth of information about user activity in the system. Analyzing web logs has become a challenge due to the huge log volume; finding interesting patterns is not easy because of the size and distribution of the logs and the importance of minor details in each entry. Web logs thus contain very important data about users and the site that is not being put to good use. Retrieving interesting information from the logs gives an idea of what users need, allows users to be grouped according to their various needs, and helps improve the site to make it more effective and efficient. The model we built is able to detect attacks or malfunctioning of the system through anomaly detection. Logs become more complex as the volume of traffic and the size and complexity of the website grow, so unsupervised techniques are used in this solution, which is fully automated; expert knowledge is used only in validation. In our approach, we first clean and purify the logs to bring them to a common platform with a standard format and structure. After the cleaning module, the web session builder is executed. It outputs two files: a Web Sessions file and an Indexed URLs file. The Indexed URLs file contains the list of URLs accessed and their indices, and the Web Sessions file lists the indices of each web session. Then the DBSCAN and EM algorithms are used iteratively and recursively to obtain the best clustering of the web sessions. Using homogeneity, completeness, V-measure, intra- and inter-cluster distance, and the silhouette coefficient as parameters, the algorithms self-evaluate in order to feed better parameter values into subsequent runs. If a cluster is found to be too large, micro-clustering is used.
Using the Cluster Signature Module, the clusters are annotated with a unique signature called a fingerprint. In this module, each cluster is fed to an Associative Rule Learning Module; if it outputs confidence and support values of 1 for an access sequence, that sequence is a potential signature for the cluster. The occurrences of the access sequence are then checked in the other clusters, and if it is found to be unique to the cluster under consideration, the cluster is annotated with that signature. These signatures are used in anomaly detection, preventing cyber attacks, real-time dashboards that visualize users accessing web pages, predicting user actions, and various other applications for finance, university, and news and media websites, etc.
Keywords: anomaly detection, clustering, pattern recognition, web sessions
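The density-based step of the pipeline can be sketched with a minimal pure-Python DBSCAN over session feature vectors. The real system also runs EM and tunes parameters with the validity indices listed above; the 2-D session coordinates below are invented placeholders for session features:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN; returns cluster labels, -1 marking noise (anomaly candidates)."""
    def neighbors(i):
        # epsilon-neighbourhood; includes the point itself
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # noise: a sparse, possibly anomalous session
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:                # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point, absorbed but not expanded
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:   # j is itself a core point
                seeds.extend(jn)
    return labels

# Two dense groups of sessions plus one isolated (anomalous) session
sessions = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8), (25, 0)]
print(dbscan(sessions, eps=2.0, min_pts=2))
```

Sessions labeled -1 are exactly the ones the described model would flag for anomaly review.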
Procedia PDF Downloads 288
6345 Application of Medium High Hydrostatic Pressure in Preserving Textural Quality and Safety of Pineapple Compote
Authors: Nazim Uddin, Yohiko Nakaura, Kazutaka Yamamoto
Abstract:
Compote (fruit in syrup) of pineapple (Ananas comosus L. Merrill) is expected to have high market potential as a convenient ready-to-eat (RTE) food worldwide. High hydrostatic pressure (HHP) in combination with low temperature (LT) was applied to the processing of pineapple compote, as was medium HHP (MHHP) in combination with medium-high temperature (MHT), since both processes can enhance liquid impregnation and inactivate microbes. The MHHP+MHT (55 or 65 °C) process, as well as the HHP+LT process, successfully inactivated the microbes in the compote to a non-detectable level. Although the compotes processed by MHHP+MHT or HHP+LT lost their fresh texture in a similar manner to those processed solely by heat, the results indicated that heat-induced texture degradation was suppressed under MHHP. The degassing process reduced hardness, while calcium (Ca) contributed to retained hardness in the MHT and MHHP+MHT processes. Electrical impedance measurements supported the damage attributed to degassing and heat. The color, Brix, and appearance were not significantly affected by the processing methods. The MHHP+MHT and HHP+LT processes may be applicable to producing high-quality, safe RTE pineapple compotes. Further studies on the optimization of packaging and storage conditions will be indispensable for commercialization.
Keywords: compote of pineapple, RTE, medium high hydrostatic pressure, postharvest loss, texture
Procedia PDF Downloads 137
6344 Ibrutinib and the Potential Risk of Cardiac Failure: A Review of Pharmacovigilance Data
Authors: Abdulaziz Alakeel, Roaa Alamri, Abdulrahman Alomair, Mohammed Fouda
Abstract:
Introduction: Ibrutinib is a selective, potent, and irreversible small-molecule inhibitor of Bruton's tyrosine kinase (BTK). It forms a covalent bond with a cysteine residue (CYS-481) at the active site of BTK, leading to inhibition of BTK enzymatic activity. The drug is indicated to treat certain types of cancer, such as mantle cell lymphoma (MCL), chronic lymphocytic leukaemia, and Waldenström's macroglobulinaemia (WM). Cardiac failure refers to the inability of the heart muscle to pump adequate blood to the body's organs; it has multiple types, including left- and right-sided heart failure and systolic and diastolic heart failure. The aim of this review is to evaluate the risk of cardiac failure associated with the use of ibrutinib and to suggest regulatory recommendations if required. Methodology: The Signal Detection team at the National Pharmacovigilance Center (NPC) of the Saudi Food and Drug Authority (SFDA) performed a comprehensive signal review using its national database as well as the World Health Organization (WHO) database (VigiBase) to retrieve information for assessing the causality between cardiac failure and ibrutinib. We used the WHO-Uppsala Monitoring Centre (UMC) criteria as the standard for assessing the causality of the reported cases. Results: Case review: The search for the drug/adverse drug reaction combination returned 212 global ICSRs as of July 2020. The reviewers selected and assessed causality for the well-documented ICSRs with completeness scores of 0.9 and above (35 ICSRs); the value 1.0 represents the highest completeness score. Among the reviewed cases, more than half provide a supportive association (four probable and 15 possible cases). Data mining: The disproportionality between the observed and expected reporting rates for the drug/adverse drug reaction pair was estimated using the information component (IC), a tool developed by WHO-UMC to measure the reporting ratio.
A positive IC reflects a higher statistical association, while negative values indicate a weaker statistical association, with the null value equal to zero. The result (IC = 1.5) revealed a positive statistical association for the drug/ADR combination, meaning that "ibrutinib" with "cardiac failure" has been observed more often than expected compared with other medications in the WHO database. Conclusion: Health regulators and health care professionals must be aware of the potential risk of cardiac failure associated with ibrutinib, and monitoring of any signs or symptoms in treated patients is essential. The weighted cumulative evidence identified from the causality assessment of the reported cases and from data mining is sufficient to support a causal association between ibrutinib and cardiac failure.
Keywords: cardiac failure, drug safety, ibrutinib, pharmacovigilance, signal detection
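The information component is commonly computed as IC = log2((O + 0.5)/(E + 0.5)), with E the expected count under independence; whether this matches the exact UMC shrinkage form used here is an assumption. Only the 212 ICSR count comes from the review above; the marginal totals below are invented to illustrate a positive IC near the reported value:

```python
import math

def information_component(n_drug_adr, n_drug, n_adr, n_total):
    """Disproportionality of observed vs expected reporting for a drug/ADR pair."""
    expected = n_drug * n_adr / n_total   # count expected under independence
    return math.log2((n_drug_adr + 0.5) / (expected + 0.5))

# 212 observed reports (from the review); marginals and database size assumed
ic = information_component(n_drug_adr=212, n_drug=20_000,
                           n_adr=75_000, n_total=20_000_000)
print(round(ic, 2))  # positive: the pair is reported more often than expected
```

In practice the UMC bases signal decisions on the lower bound of the IC credibility interval (IC025), not the point estimate alone.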
Procedia PDF Downloads 129
6343 Design and Realization of Double-Delay Line Canceller (DDLC) Using FPGA
Authors: A. E. El-Henawey, A. A. El-Kouny, M. M. Abd-El-Halim
Abstract:
Moving target indication (MTI) is an anti-clutter technique that limits the display of clutter echoes, using the received radar information primarily to display moving targets only. The purpose of MTI is to discriminate moving targets from a background of clutter or slowly moving chaff particles, as shown in this paper. The processing system in these radars is massive and complex, since it must perform a great amount of processing in a very short time. In most radar applications, the response of a single canceller is not acceptable, since it does not have a wide notch in the stop-band. A double-delay canceller is an MTI delay-line canceller employing a two-delay-line configuration to improve performance by widening the clutter-rejection notches compared with single-delay cancellers; it is also called a double canceller, dual-delay canceller, or three-pulse canceller. In this paper, a double delay line canceller is chosen for study due to its simplicity in both concept and implementation. We discuss the implementation of a simple digital moving target indicator (DMTI) using an FPGA, which has distinct advantages over an application-specific integrated circuit (ASIC) for the purposes of this work: the FPGA provides flexibility and stability, which are important factors in radar applications.
Keywords: FPGA, MTI, double delay line canceller, Doppler shift
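The double-delay-line (three-pulse) canceller described above computes y[n] = x[n] - 2x[n-1] + x[n-2], i.e. the cascade (1 - z⁻¹)² of two single cancellers applied to successive pulse returns in one range cell. A minimal behavioral sketch (the PRF and Doppler values are assumed for illustration; the paper's realization is in FPGA hardware):

```python
import math

def ddlc(x):
    """Three-pulse canceller y[n] = x[n] - 2*x[n-1] + x[n-2]."""
    return [x[n] - 2 * x[n - 1] + x[n - 2] for n in range(2, len(x))]

# Stationary clutter gives identical echoes every pulse and is cancelled exactly
clutter = [5.0] * 8
print(ddlc(clutter))  # [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

# A moving target's Doppler shift rotates the echo phase pulse to pulse,
# so it passes the canceller (fd and PRF values assumed)
fd, prf = 1000.0, 4000.0
target = [math.cos(2 * math.pi * fd * n / prf) for n in range(8)]
residue = ddlc(target)
print(max(abs(v) for v in residue))  # ~2, matching |H(f)| = 4*sin^2(pi*fd/prf)
```

The squared-sine magnitude response is what gives the double canceller its wider stop-band notch compared with the single-delay canceller.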
Procedia PDF Downloads 644
6342 Survival Data with Incomplete Missing Categorical Covariates
Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar
Abstract:
Censored survival data with incomplete covariate information are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome is modeled within the class of generalized linear models, and the method requires estimation of the parameters of the covariate distribution. In this paper, we apply the approach to clinical trial data with five covariates, four of which have missing values, within fully censored data.
Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull distribution
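The E-step of the method of weights can be illustrated for a single subject with a missing binary covariate: each possible covariate value is weighted by its posterior probability given the observed survival outcome. An exponential survival model is assumed here for simplicity (the paper works with the Weibull distribution), and all parameter values are invented:

```python
import math

def exp_lik(t, delta, lam):
    """Exponential-model likelihood contribution for one subject:
    density lam*exp(-lam*t) if an event (delta=1), survival exp(-lam*t) if censored."""
    return (lam ** delta) * math.exp(-lam * t)

def e_step_weight(t, delta, b0, b1, p1):
    """Posterior weight w = P(x=1 | t, delta) for a missing binary covariate x,
    with hazard lam = exp(b0 + b1*x) and prior P(x=1) = p1."""
    lam0, lam1 = math.exp(b0), math.exp(b0 + b1)
    num = p1 * exp_lik(t, delta, lam1)
    den = num + (1 - p1) * exp_lik(t, delta, lam0)
    return num / den

# An early event pulls the weight toward the high-hazard category (b1 > 0)
w = e_step_weight(t=0.2, delta=1, b0=-1.0, b1=1.5, p1=0.5)
print(round(w, 3))
# In the M-step this subject contributes w times the x=1 log-likelihood plus
# (1 - w) times the x=0 log-likelihood; E- and M-steps iterate to convergence.
```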
Procedia PDF Downloads 405
6341 Antimicrobial Properties of SEBS Compounds with Copper Microparticles
Authors: Vanda Ferreira Ribeiro, Daiane Tomacheski, Douglas Naue Simões, Michele Pitto, Ruth Marlene Campomanes Santana
Abstract:
Indoor environments such as car cabins and public transportation vehicles are places where users are exposed to the ambient air quality. Microorganisms (bacteria, fungi, yeasts) enter these environments through windows and ventilation systems and may use the organic particles present as a growth substrate; in addition, atmospheric pollutants can act as potential carbon and nitrogen sources for some microorganisms. Compounds based on SEBS copolymers, poly(styrene-b-(ethylene-co-butylene)-b-styrene), are a class of thermoplastic elastomers (TPEs), fully recyclable and widely used in automotive parts. Metals such as copper and silver have biocidal activity, and producing SEBS compounds by melt blending with these agents can be a good option for plastic parts of ventilation systems and automotive air conditioning, in order to minimize the problems caused by the growth of pathogenic microorganisms. In this sense, the aim of this work was to evaluate the effect of copper microparticles as an antimicrobial agent in compositions based on SEBS/PP/oil/calcite. Copper microparticles were used in weight proportions of 0%, 1%, 2%, and 4%. The compounds were prepared using a co-rotating twin-screw extruder (L/D ratio of 40/1 and 16 mm screw diameter). The processing parameters were a screw rotation rate of 300 rpm with a temperature profile of 150 to 190 °C. The SEBS-based TPE compounds were injection molded. The compound emissions were characterized by the gravimetric fogging test. The compounds were characterized by physical (density and staining by contact), mechanical (hardness and tensile properties), and rheological (melt volume rate, MVR) properties. Antibacterial properties were evaluated against Staphylococcus aureus (S. aureus) and Escherichia coli (E. coli) strains. To evaluate antifungal ability, Aspergillus niger (A. niger), Candida albicans (C. albicans), Cladosporium cladosporioides (C. cladosporioides), and Penicillium chrysogenum (P. chrysogenum) were chosen. The results of the biological tests showed a reduction in bacteria of up to 88% for E. coli and up to 93% for S. aureus. The tests with fungi were inconclusive because the sample without copper also inhibited the development of these microorganisms. The copper addition did not cause significant variations in the mechanical properties, the MVR, or the emission behavior of the compounds. The density increased with the copper content of the compounds.
Keywords: air conditioner, antimicrobial, copper, SEBS
Procedia PDF Downloads 282
6340 Data Management System for Environmental Remediation
Authors: Elizaveta Petelina, Anton Sizo
Abstract:
Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Appropriate data management is therefore required as a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and data interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decision making in relation to required mitigation measures and assessment of remediation success. The EDMS is a combination of enterprise- and desktop-level data management and Geographic Information System (GIS) tools assembled to assist environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. The EDMS consists of seven main components: a Geodatabase containing a spatial database for storing and querying spatially distributed data; a GIS and Web GIS component combining desktop and server-based GIS solutions; a Field Data Collection component containing tools for field work; a Quality Assurance (QA)/Quality Control (QC) component combining operational procedures for QA and measures for QC; a Data Import and Export component including tools and templates to support project data flow; a Lab Data component providing a connection between the EDMS and laboratory information management systems; and a Reporting component including server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines), a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada.
The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.
Keywords: data management, environmental remediation, geographic information system, GIS, decision making
Procedia PDF Downloads 161
6339 Offline Signature Verification Using Minutiae and Curvature Orientation
Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee
Abstract:
A signature is a behavioral biometric used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers; therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system to indicate whether a signature is genuine or not. The system comprises four phases: (1) the pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting of ridge breaks are applied; (2) the feature extraction phase, in which global and local features are extracted, the local features being minutiae points, curvature orientation, and curve plateau, and the global features being signature area, signature aspect ratio, and Hu moments; (3) the post-processing phase, in which false minutiae are removed; and (4) the classification phase, in which the features are enhanced before being fed into the classifier. k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state of the art. The accuracy of the proposed system is 92.3%.
Keywords: signature, ridge breaks, minutiae, orientation
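The classification phase can be sketched with a minimal k-nearest-neighbours rule over extracted feature vectors. The three features and all values below are invented placeholders, not the paper's benchmark data or its actual feature set:

```python
import math
from collections import Counter

# Invented (feature_vector, label) training pairs; features could stand in for
# e.g. a curvature-orientation summary, a minutiae-density ratio, aspect ratio
train = [
    ([0.90, 0.10, 1.8], "genuine"),
    ([0.88, 0.12, 1.7], "genuine"),
    ([0.86, 0.15, 1.9], "genuine"),
    ([0.40, 0.60, 2.6], "forged"),
    ([0.45, 0.55, 2.4], "forged"),
]

def knn_predict(features, k=3):
    """Label a query signature by majority vote among its k nearest neighbours."""
    dists = sorted((math.dist(features, f), label) for f, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

print(knn_predict([0.87, 0.13, 1.8]))   # query near the genuine cluster
print(knn_predict([0.42, 0.58, 2.5]))   # query near the forged cluster
```

The paper's SVM alternative would replace the vote with a learned decision boundary over the same feature vectors.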
Procedia PDF Downloads 146
6338 Correlation Analysis to Quantify Learning Outcomes for Different Teaching Pedagogies
Authors: Kanika Sood, Sijie Shang
Abstract:
A fundamental goal of education includes preparing students to become part of the global workforce by making beneficial contributions to society. In this paper, we analyze student performance for multiple courses that involve different teaching pedagogies: a cooperative learning technique and an inquiry-based learning strategy. Student performance includes student engagement, grades, and attendance records. We perform this study in the Computer Science department for online and in-person courses for 450 students. We perform correlation analysis to study the relationship between student scores and other parameters, such as gender and mode of learning. We use natural language processing and machine learning to analyze student feedback data and performance data. We assess the learning outcomes of two teaching pedagogies for undergraduate and graduate courses to showcase the impact of pedagogical adoption and learning outcome as determinants of academic achievement. Early findings suggest that when using the specified pedagogies, students become experts on their topics and illustrate enhanced engagement with peers.Keywords: bag-of-words, cooperative learning, education, inquiry-based learning, in-person learning, natural language processing, online learning, sentiment analysis, teaching pedagogy
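The correlation analysis described can be illustrated with a hand-rolled Pearson coefficient. The attendance and grade figures below are hypothetical, not the study's data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical records: attendance rate vs. final grade
attendance = [0.95, 0.80, 0.60, 0.90, 0.70]
grades     = [88,   75,   60,   85,   68]
r = pearson_r(attendance, grades)
print(round(r, 3))  # strongly positive for this toy sample
```

For categorical parameters such as gender or mode of learning, the same pipeline would first encode the category numerically (or use a rank/point-biserial variant).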
Procedia PDF Downloads 776337 C2N2 Adsorption on the Surface of a BN Nanosheet: A DFT Study
Authors: Maziar Noei
Abstract:
Calculations showed that when the nanosheet is doped with Si, the adsorption energy is about -85.62 to -87.43 kcal/mol, and the HOMO/LUMO energy gap (Eg) is reduced significantly. The boron nitride nanosheet is a suitable adsorbent for cyanogen and can be used in cyanogen separation processes. It appears that the boron nitride nanosheet (BNNS) becomes a suitable semiconductor after doping. In the presence of cyanogen (C2N2), the doped BNNS directly generates an electrical signal and, therefore, can potentially be used in cyanogen sensors.Keywords: nanosheet, DFT, cyanogen, sensors
Procedia PDF Downloads 2826336 The Impact of External Technology Acquisition and Exploitation on Firms' Process Innovation Performance
Authors: Thammanoon Charmjuree, Yuosre F. Badir, Umar Safdar
Abstract:
There is a consensus among innovation scholars that knowledge is a vital antecedent for firm’s innovation; e.g., process innovation. Recently, there has been an increasing amount of attention to more open approaches to innovation. This open model emphasizes the use of purposive flows of knowledge across the organization boundaries. Firms adopt open innovation strategy to improve their innovation performance by bringing knowledge into the organization (inbound open innovation) to accelerate internal innovation or transferring knowledge outside (outbound open innovation) to expand the markets for external use of innovation. Reviewing open innovation research reveals the following. First, the majority of existing studies have focused on inbound open innovation and less on outbound open innovation. Second, limited research has considered the possible interaction between both and how this interaction may impact the firm’s innovation performance. Third, scholars have focused mainly on the impact of open innovation strategy on product innovation and less on process innovation. Therefore, our knowledge of the relationship between firms’ inbound and outbound open innovation and how these two impact process innovation is still limited. This study focuses on the firm’s external technology acquisition (ETA) and external technology exploitation (ETE) and the firm’s process innovation performance. The ETA represents inbound openness in which firms rely on the acquisition and absorption of external technologies to complement their technology portfolios. The ETE, on the other hand, refers to commercializing technology assets exclusively or in addition to their internal application. 
This study hypothesized that both ETA and ETE have a positive relationship with process innovation performance and that ETE fully mediates the relationship between ETA and process innovation performance, i.e., ETA has a positive impact on ETE and, in turn, ETE has a positive impact on process innovation performance. This study empirically explored these hypotheses in software development firms in Thailand. These firms were randomly selected from a list of software firms registered with the Department of Business Development, Ministry of Commerce of Thailand. The questionnaires were sent to 1689 firms. After follow-ups and periodic reminders, we obtained 329 (19.48%) completed usable questionnaires. Structural equation modeling (SEM) was used to analyze the data. The analysis of the 329 firms provides support for our three hypotheses: First, a firm’s ETA has a positive impact on its process innovation performance. Second, a firm’s ETA has a positive impact on its ETE. Third, a firm’s ETE fully mediates the relationship between its ETA and its process innovation performance. This study fills a gap in the open innovation literature by examining the relationship between inbound (ETA) and outbound (ETE) open innovation and suggests that, in order to benefit from the promises of openness, firms must engage in both. The study goes one step further by explaining the mechanism through which ETA influences process innovation performance.Keywords: process innovation performance, external technology acquisition, external technology exploitation, open innovation
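The full-mediation hypothesis can be illustrated with a simple regression-based mediation check on simulated data. The coefficients, noise levels, and variables below are invented for the sketch; the study itself fitted SEM to survey responses:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 329  # same sample size as the study, purely for flavor

# Simulate full mediation: ETA affects performance only through ETE
eta = rng.normal(size=n)
ete = 0.6 * eta + rng.normal(scale=0.5, size=n)
perf = 0.7 * ete + rng.normal(scale=0.5, size=n)

def ols_coefs(y, *cols):
    """OLS slope coefficients of y on the given predictors (with intercept)."""
    X = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

c_total = ols_coefs(perf, eta)[0]        # total effect of ETA on performance
c_direct = ols_coefs(perf, eta, ete)[0]  # direct effect controlling for ETE

# Under full mediation, the direct effect collapses toward zero
print(round(c_total, 2), round(c_direct, 2))
```

SEM estimates all paths jointly with latent variables; this two-regression version only conveys the logic of why a vanishing direct effect indicates full mediation.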
Procedia PDF Downloads 2036335 Taguchi Robust Design for Optimal Setting of Process Wastes Parameters in an Automotive Parts Manufacturing Company
Authors: Charles Chikwendu Okpala, Christopher Chukwutoo Ihueze
Abstract:
As a technique that reduces variation in a product by lessening the sensitivity of the design to sources of variation, rather than by controlling those sources, Taguchi Robust Design entails designing ideal goods by developing a product that has minimal variance in its characteristics while still meeting the desired exact performance. This paper examined the concept of this manufacturing approach and its application to a brake pad product of an automotive parts manufacturing company. Although the firm claimed that defects, excess inventory, and over-production were the only wastes that grossly affect its productivity and profitability, a careful study and analysis of its manufacturing processes with the application of the Single Minute Exchange of Dies (SMED) tool showed that the waste of waiting is a fourth waste that bedevils the firm. The selection of the Taguchi L9 orthogonal array, based on the four parameters and three levels of variation for each parameter, revealed with a range of 2.17 that waiting is the major waste that the company must reduce in order to remain viable. Also, to enhance the company’s throughput and profitability, the wastes of over-production, excess inventory, and defects, with ranges of 2.01, 1.46, and 0.82, ranking second, third, and fourth respectively, must also be reduced to the barest minimum. After proposing -33.84 as the optimum signal-to-noise ratio to be maintained for the waste of waiting, the paper advocated the adoption of all the tools and techniques of the Lean Production System (LPS) and Continuous Improvement (CI), and concluded by recommending SMED in order to drastically reduce set-up time, which leads to unnecessary waiting.Keywords: lean production system, single minute exchange of dies, signal to noise ratio, Taguchi robust design, waste
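The signal-to-noise figure quoted above is of the Taguchi smaller-the-better type, which is computed directly from the response measurements. The waiting-time values below are hypothetical, chosen only to show the arithmetic:

```python
import math

def sn_smaller_the_better(ys):
    """Taguchi smaller-the-better signal-to-noise ratio:
    S/N = -10 * log10(mean(y^2)). Larger (less negative) is better."""
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical waiting-time measurements (minutes) for one L9 trial setting
waiting = [48.0, 50.5, 49.2]
print(round(sn_smaller_the_better(waiting), 2))  # S/N in dB for this trial
```

Computing this ratio for each of the nine L9 trials and averaging by factor level gives the ranges (e.g., 2.17 for waiting) used to rank the wastes.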
Procedia PDF Downloads 1266334 Analysis of Translational Ship Oscillations in a Realistic Environment
Authors: Chen Zhang, Bernhard Schwarz-Röhr, Alexander Härting
Abstract:
To acquire accurate ship motions at the center of gravity, a single low-cost inertial sensor is utilized on board to measure ship oscillating motions. As observations, the three-axis accelerations and three-axis rotational rates provided by the sensor are used. The mathematical model for processing the observation data includes determination of the distance vector between the sensor and the center of gravity in the x, y, and z directions. After setting up the transfer matrix from the sensor’s own coordinate system to the ship’s body frame, an extended Kalman filter is applied to deal with the nonlinearities between the ship motion in the body frame and the observation information in the sensor’s frame. As a side effect, the method eliminates sensor noise and other unwanted errors. The results include not only roll and pitch but also linear motions, in particular heave and surge, at the center of gravity. For testing, we resort to measurements recorded on a small vessel in a well-defined sea state. With response amplitude operators computed numerically by commercial software (Seaway), motion characteristics are estimated. These agree well with the measurements after processing with the suggested method.Keywords: extended Kalman filter, nonlinear estimation, sea trial, ship motion estimation
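The filter's predict/update cycle can be sketched in a few lines. The sketch below is a plain linear Kalman filter on a toy constant-velocity state (the EKF in the paper additionally linearizes a nonlinear measurement model at each step); all matrices, noise levels, and the trajectory are invented for illustration:

```python
import numpy as np

def kf_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a (linearized) Kalman filter."""
    x = F @ x                       # predict state
    P = F @ P @ F.T + Q             # predict covariance
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ (z - H @ x)         # update with measurement residual
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy model: state = [position, velocity]; we observe noisy position only
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[0.05]])

x, P = np.zeros(2), np.eye(2)
rng = np.random.default_rng(1)
for k in range(200):
    true_pos = 0.1 * k * dt                        # steady 0.1 m/s drift
    z = np.array([true_pos + rng.normal(scale=0.05)])
    x, P = kf_step(x, P, z, F, Q, H, R)

print(x)  # position estimate near the true 1.99 m, velocity near 0.1 m/s
```

In the paper's setting, the state would carry the ship's oscillatory motions and the measurement model would map body-frame motion through the sensor's lever arm and attitude, which is where the extended (linearizing) step comes in.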
Procedia PDF Downloads 5236333 Image Segmentation Techniques: Review
Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo
Abstract:
Image segmentation is the process of dividing an image into several sections, such as an object's foreground and its background. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research has been devoted to algorithms for color images. Most image segmentation algorithms or techniques vary based on the input data and the application. Nearly all of the techniques are unsuitable for noisy environments. Much of the existing work uses the Markov Random Field (MRF), which is computationally intensive but is said to be robust to noise. In recent years, image segmentation has been applied to problems such as efficient image processing, interpretation of image content, and image analysis. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed over the past years. The techniques include convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, and thresholding techniques. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications and potential future developments around image segmentation. This review concludes that no single technique is perfectly suitable for segmenting all different types of images, but that the use of hybrid techniques yields more accurate and efficient results.Keywords: clustering-based, convolution-network, edge-based, region-growing
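Of the families reviewed, thresholding is the simplest to demonstrate. A minimal sketch of Otsu's method (a classic histogram-based thresholding technique) on a synthetic bimodal image follows; the image itself is generated, not from any dataset in the review:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the gray level that maximizes the
    between-class variance of foreground vs. background pixels."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Synthetic bimodal image: dark background (~40), bright square (~200)
rng = np.random.default_rng(2)
img = rng.normal(40, 10, (64, 64))
img[20:40, 20:40] = rng.normal(200, 10, (20, 20))
img = np.clip(img, 0, 255).astype(np.uint8)

t = otsu_threshold(img)
print(t)  # the threshold falls between the two intensity modes
```

Region-growing and clustering approaches replace the global histogram decision with local or feature-space groupings, which is why hybrids of these families tend to perform best, as the review concludes.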
Procedia PDF Downloads 966332 Computational Linguistic Implications of Gender Bias: Machines Reflect Misogyny in Society
Authors: Irene Yi
Abstract:
Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Computational linguistics is a growing field dealing with such issues of data collection for technological development. Machines have been trained on millions of human books, only to find that throughout human history, derogatory and sexist adjectives are used significantly more frequently when describing females in history and literature than when describing males. This is extremely problematic, both as training data and as the outcome of natural language processing. As machines start to handle more responsibilities, it is crucial to ensure that they do not carry forward historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Computational analysis of such linguistic data is used to find patterns of misogyny. Results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text to be more mindful and reflect gender equality. Further, this paper deals with the idea of non-binary gender pronouns and how machines can process these pronouns correctly, given their semantic and syntactic context. This paper also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules but also historically patriarchal societies. 
The progression of society goes hand in hand not only with its language but also with how machines process those natural languages. These ideas are all vital to the development of natural language models in technology, and they must be taken into account immediately.Keywords: computational analysis, gendered grammar, misogynistic language, neural networks
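The kind of adjective-frequency analysis described can be illustrated with a crude collocation count on a toy corpus. The sentences, target nouns, and helper below are invented for the sketch; the study worked over millions of books with full NLP pipelines:

```python
from collections import Counter

# Tiny illustrative corpus (not the study's data)
sentences = [
    "the hysterical woman cried",
    "the brilliant man spoke",
    "the shrill woman argued",
    "the logical man decided",
    "the brilliant woman taught",
]

FEMALE, MALE = {"woman"}, {"man"}

def adjectives_before(nouns, corpus):
    """Count the word immediately preceding each target noun,
    a crude proxy for adjective-noun collocation."""
    counts = Counter()
    for s in corpus:
        toks = s.split()
        for i, w in enumerate(toks[1:], 1):
            if w in nouns:
                counts[toks[i - 1]] += 1
    return counts

print(adjectives_before(FEMALE, sentences))
print(adjectives_before(MALE, sentences))
```

A real analysis would use part-of-speech tagging and dependency parses rather than adjacency, and would test whether derogatory adjectives collocate disproportionately with female referents.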
Procedia PDF Downloads 1196331 Classifying Turbomachinery Blade Mode Shapes Using Artificial Neural Networks
Authors: Ismail Abubakar, Hamid Mehrabi, Reg Morton
Abstract:
Currently, extensive signal analysis is performed in order to evaluate the structural health of turbomachinery blades. This approach is constrained by time and by the availability of qualified personnel. Thus, new approaches to blade dynamics identification that provide faster and more accurate results are sought after. Generally, modal analysis is employed to acquire the dynamic properties of a vibrating turbomachinery blade and is widely adopted in condition monitoring of blades. The analysis provides useful information on the different modes of vibration and natural frequencies by exploring the different shapes that can be taken up during vibration, since every mode shape has a corresponding natural frequency. Experimental modal testing and finite element analysis are the traditional methods used to evaluate mode shapes, but they have limited application to real-life scenarios that would facilitate a robust condition monitoring scheme. Real-time mode shape evaluation requires rapid evaluation and low computational cost, for which the traditional techniques are unsuitable. In this study, an artificial neural network is developed to evaluate the mode shape of a lab-scale rotating blade assembly, using results from finite element modal analysis as training data. The network performance evaluation shows that the artificial neural network (ANN) is capable of mapping the correlation between natural frequencies and mode shapes. This is achieved without the need for extensive signal analysis. The approach offers the advantages that the network can classify mode shapes in real time, is simple to implement, and predicts accurately. The work paves the way for further development of a robust condition monitoring system that incorporates real-time mode shape evaluation.Keywords: modal analysis, artificial neural network, mode shape, natural frequencies, pattern recognition
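The frequency-to-mode-shape mapping can be sketched with a minimal single-layer softmax classifier in plain NumPy, a stand-in for the multi-layer network used in the study. The frequency values, class labels, and cluster spread below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy training set: first two natural frequencies (Hz) -> mode shape class
# 0 = first bending, 1 = second bending, 2 = first torsion (invented values)
centers = np.array([[120.0, 310.0], [240.0, 610.0], [320.0, 420.0]])
X = np.vstack([c + rng.normal(scale=3.0, size=(30, 2)) for c in centers])
y = np.repeat([0, 1, 2], 30)

# Single softmax layer trained by gradient descent on cross-entropy
Xn = (X - X.mean(axis=0)) / X.std(axis=0)   # normalize features
W, b = np.zeros((2, 3)), np.zeros(3)
onehot = np.eye(3)[y]
for _ in range(1000):
    logits = Xn @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / len(y)
    W -= 0.5 * Xn.T @ grad
    b -= 0.5 * grad.sum(axis=0)

acc = float((np.argmax(Xn @ W + b, axis=1) == y).mean())
print(acc)  # the well-separated toy clusters are classified almost perfectly
```

In the study, the training pairs come from finite element modal analysis rather than synthetic clusters, and the network has hidden layers, but the supervised mapping from frequencies to mode-shape classes is the same idea.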
Procedia PDF Downloads 1566330 Extracellular Enzymes from Halophilic Bacteria with Potential in Agricultural Secondary Flow Recovery Products
Authors: Madalin Enache, Simona Neagu, Roxana Cojoc, Ioana Gomoiu, Delia Ionela Dobre, Ancuta Roxana Trifoi
Abstract:
Various types of halophilic and halotolerant microorganisms that can be cultivated in the laboratory on culture media with a wide range of sodium chloride content have been isolated from several salted environments. The extracellular enzymes of these microorganisms show enzymatic activity across this spectrum of salinity, which makes them attractive for several biotechnological processes carried out at high ionic strength. In the present work, amylase, protease, esterase, lipase, cellulase, pectinase, xylanase, and inulinase activities were screened for more than 50 bacterial strains isolated from water samples and sapropelic mud from four saline and hypersaline lakes located in the Romanian Plain. In addition, cellulase and pectinase activity was also detected in some halotolerant microorganisms isolated from a secondary agricultural flow of grapes processing. The preliminary data revealed that, of all tested strains, seven harbor protease activity, eight amylase, four esterase, another four lipase, and three pectinase, while for one strain either cellulase or pectinase activity was identified. No enzymes able to hydrolyse the inulin added to the culture media were identified. Several strains isolated from sapropelic mud showed multiple extracellular enzymatic activities, namely three strains harbor three activities and another seven harbor two activities. The data revealed that amylase and protease activities were frequently detected compared with the other tested enzymes. In the case of pectinases, their ability to increase resveratrol recovery from the material resulting from grapes processing was investigated. To this end, the material resulting from grapes processing was treated with microbial supernatant for several durations (two, four, and 24 hours), and the resveratrol content was detected by high-performance liquid chromatography (HPLC). 
The preliminary data revealed some positive results of this treatment.Keywords: halophilic microorganisms, enzymes, pectinase, salinity
Procedia PDF Downloads 1946329 Adapting Grain Crop Cleaning Equipment for Sesame and Other Emerging Spice Crops
Authors: Ramadas Narayanan, Surya Bhattrai, Vu Hoan
Abstract:
Threshing and cleaning are crucial post-harvest procedures carried out to separate the grain or seed from the harvested plant and eliminate any potential contaminants or foreign debris. After harvesting, threshing and cleaning are necessary to obtain clean seeds of high quality that are acceptable for consumption or further processing. For mechanised production, threshing can be conducted in a thresher. Afterwards, the seeds are cleaned in dedicated seed-cleaning facilities. This research investigates the effectiveness of the Kimseed MK3 cleaning equipment, designed for grain crops, in processing new crops such as sesame, fennel, and kalonji. Systematic trials were conducted to adapt the equipment to applications in sesame and spice crops, in order to develop methods for mechanising harvest and post-harvest operations. For sesame, a two-step process in the cleaning machine is recommended: the first step removes the large contaminants, and the second removes the smaller ones. The optimal parameters for cleaning fennel are a shaker frequency of 6.0 to 6.5 Hz and an airflow of 1.0 to 1.5 m/s. The optimal parameters for cleaning kalonji are a shaker frequency of 5.5 to 6.0 Hz and an airflow of 1.0 to just under 1.5 m/s.Keywords: sustainable mechanisation, seed cleaning process, optimal setting, shaker frequency
Procedia PDF Downloads 736328 The Spatial Pattern of Economic Rents of an Airport Development Area: Lessons Learned from the Suvarnabhumi International Airport, Thailand
Authors: C. Bejrananda, Y. Lee, T. Khamkaew
Abstract:
With the rising importance of air transportation in the 21st century, the role of economics in airport planning and decision-making has become more important to the urban structure and the land values around it. Therefore, this research aims to examine the relationship between an airport and its impacts on the distribution of urban land uses and land values by applying Alonso’s bid-rent model. The New Bangkok International Airport (Suvarnabhumi International Airport) was taken as a case study. The analysis was made over three different time periods of airport development (after the airport site was proposed, during airport construction, and after the opening of the airport). The statistical results confirm that Alonso’s model can explain the impacts of the new airport only for the northeast quadrant of the airport, while proximity to the airport showed an inverse relationship with land value for all six types of land use activities through the three periods. This indicates that the land value for commercial land use is the most sensitive to the location of the airport, i.e., has the strongest requirement for accessibility to the airport, compared to residential and manufacturing land use. Also, the bid-rent gradients of the six types of land use activities declined dramatically through the three time periods because of the Asian Financial Crisis in 1997. A lesson learned from this research concerns the reliability of the data used. The major issue involves the use of different areal units for assessing land value in different time periods, zone blocks (1995) versus grid blocks (2002, 2009). As a result, overall trends in land value assessment are not readily apparent. A further concern is the availability of historical data. 
Because the government did not collect historical land value assessment data, some land value data and aerial photos are not available for the entire study area. Finally, the different formats of the aerial photos, hard copy (1995) versus digital (2002, 2009), made it difficult to measure distances. These problems affect the accuracy of the results of the statistical analyses.Keywords: airport development area, economic rents, spatial pattern, suvarnabhumi international airport
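The bid-rent gradient underlying Alonso's model can be estimated by regressing log land value on distance to the airport. The parcel data below are simulated around an assumed true gradient of 0.12 per km; they are not the Suvarnabhumi figures:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical parcels: distance to the airport (km) and a land value index
dist = np.array([1, 2, 4, 6, 8, 10, 15, 20], dtype=float)
value = 100 * np.exp(-0.12 * dist + rng.normal(scale=0.05, size=dist.size))

# Alonso-style bid rent: value = V0 * exp(-g * dist); fit g in log space
slope, intercept = np.polyfit(dist, np.log(value), 1)
gradient, v0 = -slope, np.exp(intercept)
print(round(gradient, 3), round(v0, 1))  # recovers roughly g = 0.12, V0 = 100
```

Fitting this separately per land use type and per period is what yields the per-activity gradients whose decline over the three periods the study reports.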
Procedia PDF Downloads 2746327 Obtaining Nutritive Powder from Peel of Mangifera Indica L. (Mango) as a Food Additive
Authors: Chajira Garrote, Laura Arango, Lourdes Merino
Abstract:
This research explains how to obtain a nutritious powder from the peels of the ripe mango variety Hilacha (Mangifera indica L.) for use as a food additive. This study also intends to make efficient use of the by-products resulting from the mango pulp manufacturing process, with the aim of giving them added value. The physical and chemical characteristics of the mango peels, and the benefits they may offer humans, were studied. The unit operations for the processing of mango peels and the production of the nutritive powder as a food additive are explained. Emphasis is placed on the preliminary operations applied to the raw material and on the drying method, which is very important in this project for obtaining the desired characteristics of the nutritive powder. Once the powder was obtained, it was subjected to laboratory tests to determine its functional properties: water retention capacity (WRC) and oil retention capacity (ORC); a sensory analysis was also performed to determine the product profile. The nutritive powder from the ripe mango peels showed excellent WRC and ORC values: 7.236 g of water/g (dry basis) and 1.796 g of oil/g (dry basis), respectively, and the sensory analysis defined a complete profile of color, odor, and texture of the nutritive powder, which makes it suitable for use in the food industry.Keywords: mango, peel, powder, nutritive, functional properties, sensory analysis
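The retention capacities reported are simple mass ratios; a small helper shows the arithmetic. The raw masses below are illustrative values chosen only to reproduce the reported ratios, not measurements from the paper:

```python
def retention_capacity(bound_mass_g, dry_sample_mass_g):
    """Water (or oil) retention capacity: grams retained per gram of dry sample."""
    return bound_mass_g / dry_sample_mass_g

# Hypothetical lab weights for a 1.5 g dry-basis powder sample
wrc = retention_capacity(10.854, 1.5)  # g of water held after centrifugation
orc = retention_capacity(2.694, 1.5)   # g of oil held

print(round(wrc, 3), round(orc, 3))  # → 7.236 1.796
```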
Procedia PDF Downloads 3566326 Effective Solvents for Proteins Recovery from Microalgae
Authors: Win Nee Phong, Tau Chuan Ling, Pau Loke Show
Abstract:
From an industrial perspective, the exploitation of microalgae as a protein source is of great economic and commercial interest due to numerous attractive characteristics. Nonetheless, the release of protein from microalgae is limited by the multiple layers of the rigid, thick cell wall, which generally contains a large proportion of cellulose. Thus, an efficient cell disruption process is required to rupture the cell wall. The conventional downstream processing methods, which typically involve several unit operation steps such as disruption, isolation, extraction, concentration, and purification, are energy-intensive and costly. To reduce the overall cost and establish a feasible technology for successful large-scale production, the microalgal industry today demands a more cost-effective and eco-friendly technique in downstream processing. One of the main challenges in extracting proteins from microalgae is the presence of the rigid cell wall. This study aims to provide some guidance on the selection of an efficient solvent to facilitate protein release during the cell disruption process. The effects of different solvents, methanol, ethanol, 1-propanol, and water, on rupturing the microalgae cell wall were studied. It is interesting to note that water is the most effective solvent for recovering proteins from microalgae, and its cost is the lowest among all the solvents tested.Keywords: green, microalgae, protein, solvents
Procedia PDF Downloads 2586325 Unsupervised Part-of-Speech Tagging for Amharic Using K-Means Clustering
Authors: Zelalem Fantahun
Abstract:
Part-of-speech tagging is the process of assigning a part-of-speech or other lexical class marker to each word in naturally occurring text. It is one of the most fundamental and basic tasks in almost all natural language processing. In natural language processing, the need for large amounts of manually annotated data is a knowledge acquisition bottleneck. Since Amharic is an under-resourced language, the lack of a tagged corpus is the bottleneck problem for natural language processing, and especially for POS tagging. A promising direction for tackling this problem is to provide a system that does not require manually tagged data. In unsupervised learning, the learner is not provided with classifications. Unsupervised algorithms seek out similarity between pieces of data in order to determine whether they can be characterized as forming a group. This paper explicates the development of an unsupervised part-of-speech tagger using K-means clustering for the Amharic language, since large amounts of unlabeled data are produced in day-to-day activities. In the development of the tagger, the following procedures are followed. First, the unlabeled data (raw text) is divided into 10 folds and the tokenization phase takes place; at this level, the raw text is chunked at the sentence level and then into words. The second phase is feature extraction, which includes word frequency and the syntactic and morphological features of a word. The third phase is clustering. Among the different clustering algorithms, K-means is selected and implemented in this study, as it groups similar words together. The fourth phase is mapping, which deals with looking at each cluster carefully; the most common tag is then assigned to the group. This study identifies two features capable of distinguishing one part-of-speech from others, namely morphological features and positional information, and shows that it is possible to use unsupervised learning for Amharic POS tagging. 
In order to increase the performance of the unsupervised part-of-speech tagger, other features not included in this study, such as semantically related information, need to be incorporated. Finally, based on the experimental results, the system achieves a maximum of 81% accuracy.Keywords: POS tagging, Amharic, unsupervised learning, k-means
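The clustering and mapping phases can be sketched end to end on toy data. The two-dimensional word features, gold tags, and deterministic initialization below are invented for the sketch (they loosely mirror the positional and morphological features the study found discriminative):

```python
import numpy as np
from collections import Counter

def kmeans(X, init_idx, iters=20):
    """Plain k-means: assign each point to the nearest centroid,
    then recompute centroids, for a fixed number of iterations."""
    centroids = X[init_idx].astype(float).copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
        for j in range(len(centroids)):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Toy word vectors: [relative position in sentence, morphology score]
X = np.array([[0.10, 0.20], [0.15, 0.25], [0.12, 0.22],   # noun-like words
              [0.80, 0.90], [0.85, 0.95], [0.90, 0.85]])  # verb-like words
labels = kmeans(X, init_idx=[0, 5])  # deterministic init for the sketch

# Mapping phase: each cluster receives the majority gold tag of its members
gold = ["N", "N", "N", "V", "V", "V"]
tag_of = {c: Counter(g for g, l in zip(gold, labels) if l == c).most_common(1)[0][0]
          for c in set(labels)}
print([tag_of[l] for l in labels])  # → ['N', 'N', 'N', 'V', 'V', 'V']
```

In the actual tagger, the feature vectors come from the word frequency, syntactic, and morphological features extracted in phase two, and the mapping is evaluated against a held-out annotated sample.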
Procedia PDF Downloads 4516324 Selection of Optimal Reduced Feature Sets of Brain Signal Analysis Using Heuristically Optimized Deep Autoencoder
Authors: Souvik Phadikar, Nidul Sinha, Rajdeep Ghosh
Abstract:
In brainwave research using electroencephalogram (EEG) signals, finding the most relevant and effective feature set for the identification of activities in the human brain remains a big challenge because of the random nature of the signals. The feature extraction method is a key issue in solving this problem. Finding features that give distinctive pictures for different activities and similar pictures for the same activity is very difficult, especially as the number of activities grows. Classifier accuracy depends on the quality of the feature set. Further, more features result in higher computational complexity, while fewer features compromise performance. In this paper, a novel idea for the selection of an optimal feature set using a heuristically optimized deep autoencoder is presented. Using various feature extraction methods, a vast number of features are extracted from the EEG signals and fed to the autoencoder deep neural network. The autoencoder encodes the input features into a small set of codes. To avoid the vanishing gradient problem and to normalize the dataset, a meta-heuristic search algorithm is used to minimize the mean square error (MSE) between the encoder input and the decoder output. To reduce the feature set to a smaller one, 4 hidden layers are considered in the autoencoder network; hence it is called the Heuristically Optimized Deep Autoencoder (HO-DAE). In this method, no features are rejected; all the features are combined into the responses of the hidden layer. The results reveal that higher accuracy can be achieved using the optimal reduced features. The proposed HO-DAE is also compared with a regular autoencoder to test the performance of both. 
The performance of the proposed method is validated and compared with two other methods recently reported in the literature, which reveals that the proposed method is far better in terms of classification accuracy.Keywords: autoencoder, brainwave signal analysis, electroencephalogram, feature extraction, feature selection, optimization
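The encoding step, compressing many input features into a small set of codes by minimizing reconstruction MSE, can be sketched with a linear autoencoder trained by plain gradient descent. The study's HO-DAE instead uses 4 hidden layers with meta-heuristic weight tuning; the data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic feature vectors: 6 raw features driven by 2 latent factors,
# a stand-in for the many EEG-derived features fed to the autoencoder
Z = rng.normal(size=(200, 2))
mix = np.array([[1.0, 0.5, -0.5, 0.3, 0.8, -0.2],
                [0.2, -0.7, 0.6, 1.0, -0.4, 0.5]])
X = Z @ mix + 0.01 * rng.normal(size=(200, 6))

# Linear autoencoder: encode 6 features -> 2 codes -> decode back to 6
W_enc = 0.1 * rng.normal(size=(6, 2))
W_dec = 0.1 * rng.normal(size=(2, 6))

def mse(A, B):
    return float(((A - B) ** 2).mean())

mse_before = mse(X @ W_enc @ W_dec, X)
lr = 0.02
for _ in range(5000):
    H = X @ W_enc                         # codes: the reduced feature set
    err = H @ W_dec - X                   # reconstruction error
    grad_dec = H.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
mse_after = mse(X @ W_enc @ W_dec, X)

print(round(mse_before, 3), round(mse_after, 4))  # reconstruction MSE drops sharply
```

Because the latent structure here is 2-dimensional, the 2-code bottleneck reconstructs the 6 features almost losslessly; in the HO-DAE the MSE between encoder input and decoder output is minimized by a meta-heuristic search rather than gradient descent.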
Procedia PDF Downloads 114