Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12479

10739 ARGO: An Open Designed Unmanned Surface Vehicle Mapping Autonomous Platform

Authors: Papakonstantinou Apostolos, Argyrios Moustakas, Panagiotis Zervos, Dimitrios Stefanakis, Manolis Tsapakis, Nektarios Spyridakis, Mary Paspaliari, Christos Kontos, Antonis Legakis, Sarantis Houzouris, Konstantinos Topouzelis

Abstract:

For years, unmanned and remotely operated robots have been used as tools in industry, research, and education. The rapid development and miniaturization of sensors that can be attached to remotely operated vehicles has allowed industry leaders and researchers to use them as an affordable means of data acquisition in air, on land, and at sea. Despite recent developments in ground and airborne unmanned vehicles, only a small number of Unmanned Surface Vehicle (USV) platforms target the mapping and monitoring of environmental parameters for research and industry purposes. The ARGO project develops an open-design USV equipped with a multi-level control hardware architecture and state-of-the-art sensors and payloads for the autonomous monitoring of environmental parameters over large sea areas. The proposed USV is a catamaran-type vehicle controlled over a wireless radio link (5G) for long-range mapping, operated from a ground-based control station. The ARGO USV uses two fully redundant electric trolling motors with active vector thrust for omnidirectional movement, an open-source autopilot system with a high-accuracy GNSS device for navigation, and a 2.4 GHz digital link providing 20 km of Line-of-Sight (LoS) range for communication. The 3-meter dual-hull design and composite structure offer well above 80 kg of usable payload capacity. Furthermore, solar and friction energy harvesting methods provide clean energy to the propulsion system. The design is highly modular: each component or payload can be replaced or modified according to the desired task (industrial or research). The system can be equipped with a multiparameter sonde measuring up to 20 water parameters simultaneously, such as conductivity, salinity, turbidity, and dissolved oxygen. Furthermore, a high-end multibeam echo sounder can be installed at a specific boat datum for high-resolution shallow-water seabed mapping. The system is designed to operate in the Aegean Sea. The developed USV is planned to be used as a system for autonomous data acquisition, mapping, and monitoring of bathymetry and various environmental parameters. The ARGO USV can operate in small or large ports, with the maneuverability and endurance to map large geographical extents at sea. The system presents state-of-the-art solutions in the following areas: i) on-board/real-time data processing and analysis capabilities; ii) an energy-independent and environmentally friendly platform made entirely from the latest aeronautical and marine materials; iii) the integration of advanced-technology sensors in one system (photogrammetric and radiometric footprint, as well as its connection with various environmental and inertial sensors); and iv) the information management application. The ARGO web-based application enables the system to depict the results of the data acquisition process in near real-time. All recorded environmental variables and indices are presented, allowing users to remotely access all raw and processed information through the implemented web-based GIS application.

Keywords: monitoring marine environment, unmanned surface vehicle, mapping bathymetry, sea environmental monitoring

Procedia PDF Downloads 113
10738 The Effect of Using Computer-Assisted Translation Tools on the Translation of Collocations

Authors: Hassan Mahdi

Abstract:

The integration of computer-assisted translation (CAT) tools into translation creates several opportunities for translators. However, this integration is not equally useful for all types of English structures. This study examines the impact of using CAT tools on translating collocations. Seventy students of English as a foreign language participated in the study. The participants were divided into three groups (a CAT tools group, a machine translation group, and a control group). Comparison of the translation output of the three groups demonstrated the improvement in translation achieved using CAT tools. The results indicated that the participants who used CAT tools outscored the participants who used MT, and, in turn, both groups outscored the control group, who did not use any type of technology in translation. In addition, there was a significant difference in the effectiveness of CAT tools for translating different types of collocations: CAT tools were more effective in translating fixed and medium-strength collocations than weak collocations. Finally, the results showed that CAT tools were effective in translating collocations in both directions (i.e., into the target language and into the source language). The study suggests some guidelines for translators who use CAT tools.

Keywords: machine translation, computer-assisted translation, collocations, technology

Procedia PDF Downloads 180
10737 An AI-generated Semantic Communication Platform in HCI Course

Authors: Yi Yang, Jiasong Sun

Abstract:

Almost every aspect of our daily lives is now intertwined with some degree of human-computer interaction (HCI). HCI courses draw on knowledge from disciplines as diverse as computer science, psychology, design principles, anthropology, and more. Our HCI course, named Media and Cognition, is constantly updated to reflect state-of-the-art technological advancements such as virtual reality, augmented reality, and artificial intelligence-based interaction. For more than a decade, the course has used an interest-based approach to teaching, in which students proactively propose research-based questions and collaborate with teachers, using course knowledge to explore potential solutions. Semantic communication plays a key role in facilitating understanding and interaction between users and computer systems, ultimately enhancing system usability and user experience. Advancements in AI-generation technology, which has gained significant attention from both academia and industry in recent years, are exemplified by language models like GPT-3 that generate human-like dialogues from given prompts. The latest version of our HCI course practices a semantic communication platform based on AI-generation techniques. The purpose of this semantic communication is twofold: to extract and transmit task-specific information while ensuring efficient end-to-end communication with minimal latency. The AI-generated semantic communication platform evaluates the retainability of signal sources and converts low-retainability visual signals into textual prompts. These data are transmitted through AI-generation techniques and reconstructed at the receiving end; visual signals with a high retainability rate, on the other hand, are compressed and transmitted according to their respective regions. The platform and associated research are a testament to our students' growing ability to independently investigate state-of-the-art technologies.

Keywords: human-computer interaction, media and cognition course, semantic communication, retainability, prompts

Procedia PDF Downloads 87
10736 A Neural Network Classifier for Identifying Duplicate Image Entries in Real-Estate Databases

Authors: Sergey Ermolin, Olga Ermolin

Abstract:

A deep convolutional neural network with triplet loss is used to identify duplicate images in real-estate advertisements in the presence of image artifacts such as watermarking, cropping, hue/brightness adjustment, and others. The effects of batch normalization, spatial dropout, and various convergence methodologies on the resulting detection accuracy are discussed. For a comparative return-on-investment study (per industry request), end-to-end performance is benchmarked on both Nvidia Titan GPUs and Intel Xeon CPUs. A new real-estate dataset from the San Francisco Bay Area is used for this work. Sufficient duplicate detection accuracy is achieved to supplement other database-grounded methods of duplicate removal. The implemented method is used in a proof-of-concept project in the real-estate industry.
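The abstract does not publish the architecture or code; a minimal PyTorch sketch of the core idea (an embedding CNN with batch normalization and spatial dropout, trained with triplet loss) might look like the following, where the layer sizes, margin, and input shape are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Small CNN mapping an image to a unit-norm embedding vector."""
    def __init__(self, dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.Dropout2d(0.1),  # spatial dropout, as mentioned in the abstract
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, dim)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return F.normalize(self.fc(z), dim=1)  # L2-normalise embeddings

net = EmbeddingNet()
loss_fn = nn.TripletMarginLoss(margin=0.2)
# anchor/positive are the same listing photo with and without artifacts,
# negative is a photo of a different property (random tensors as placeholders)
anchor, positive, negative = (torch.randn(8, 3, 224, 224) for _ in range(3))
loss = loss_fn(net(anchor), net(positive), net(negative))
loss.backward()
```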

Keywords: visual recognition, convolutional neural networks, triplet loss, spatial batch normalization with dropout, duplicate removal, advertisement technologies, performance benchmarking

Procedia PDF Downloads 322
10735 Nazi Experiments during World War II: Dismal Period for Bioethics

Authors: Catharina O. Vianna Dias da Silva, Amanda F. Batista, Ana Clara C. Burgos Lessa, Carolina S. Lucchesi Ramacciotti, Maria Clara B. de Andrade, Roberto de B. Silva

Abstract:

This article analyzes the bioethical aspects of the historical practice of experiments on humans in Nazi Germany during World War II (1939-1945). The method was based on a bibliographic review of articles published in databases such as SciELO and PubMed. In the discussion, the historical and humanistic aspects that contributed to the construction of the genocidal culture practiced during this period were analyzed. Additionally, an ethical question arises: should the information acquired during this dark period be used by science? After analysis, it was found that these Nazi experiments violated medical and ethical principles and constitute a deplorable milestone in history. It was also concluded that, although they generated potentially 'useful' results in the scientific field, they should be discarded as a matter of ethical principle: such a deplorable way of obtaining knowledge must never be validated.

Keywords: Nazism, bioethics, human experimentation, human rights, genocide, torture, medicine

Procedia PDF Downloads 151
10734 Effect of Temperature and Deformation Mode on Texture Evolution of AA6061

Authors: M. Ghosh, A. Miroux, L. A. I. Kestens

Abstract:

At the molecular or micrometre scale, practically all materials are neither homogeneous nor isotropic. The concept of texture is used to identify the structural features that cause the properties of a material to be anisotropic. For metallic materials, the anisotropy of mechanical behaviour originates from the crystallographic nature of plastic deformation and is therefore controlled by the crystallographic texture. Anisotropy in mechanical properties often constitutes a disadvantage in the application of materials, as illustrated by the earing phenomenon during drawing. However, advantages may also be attained for other properties (e.g., optimization of magnetic behaviour in a specific direction) by controlling texture through thermo-mechanical processing. Nevertheless, to obtain better control over the final properties, it is essential to relate texture to the materials processing route and subsequently optimise performance. To date, however, few studies have been reported on the evolution of texture in 6061 aluminium alloy during warm processing (from room temperature to 250 °C). In the present investigation, recrystallized 6061 aluminium alloy samples were subjected to tensile and plane strain compression (PSC) tests at room and warm temperatures. The gradual change of texture following both deformation modes was measured and discussed. Tensile tests demonstrate the mechanism at low strain, while PSC does the same at high strain and effectively simulates the conditions of rolling. The Cube-dominated texture of the initial rolled and recrystallized AA6061 sheets was replaced by dominant S and R components after PSC at room temperature; PSC at warm temperature (250 °C) did not show any noticeable deviation from the room-temperature observation. It was also noticed that temperature has no significant effect on the evolution of grain morphology during PSC. The band contrast map revealed that after 30% deformation the substructure inside the grains is mainly made of series of parallel bands. A tendency for Cube to decrease and Goss to increase was noticed after tensile deformation compared with the as-received material. As in PSC, the texture does not change appreciably after tensile deformation at warm temperature. The η-fibre from Goss to Cube was noticed for all three textures.

Keywords: AA 6061, deformation, temperature, tensile, PSC, texture

Procedia PDF Downloads 472
10733 Effects of Pulsed Electromagnetic and Static Magnetic Fields on Musculoskeletal Low Back Pain: A Systematic Review Approach

Authors: Mohammad Javaherian, Siamak Bashardoust Tajali, Monavvar Hadizadeh

Abstract:

Objective: This systematic review was conducted to evaluate the effects of Pulsed Electromagnetic Fields (PEMF) and Static Magnetic Fields (SMF) on pain relief and functional improvement in patients with musculoskeletal Low Back Pain (LBP). Methods: Seven electronic databases were searched by two researchers independently to identify published Randomized Controlled Trials (RCTs) on the efficacy of pulsed electromagnetic, static magnetic, and therapeutic nuclear magnetic fields. The databases searched were Ovid Medline®, Ovid Cochrane RCTs and Reviews, PubMed, Web of Science, Cochrane Library, CINAHL, and EMBASE, from 1968 to February 2016. The relevant keywords were selected using MeSH. After the initial search, all references in the selected studies were searched to identify possible second-hand manuscripts. Published RCTs in English were included if they reported changes in pain and/or functional disability following the application of magnetic fields in chronic musculoskeletal low back pain. Studies with surgical patients, patients with pelvic pain, or combinations with other treatment techniques such as acupuncture or diathermy were excluded. The identified studies were critically appraised, and the data were extracted independently by two raters (M.J. and S.B.T.). Disagreements were resolved through discussion between the raters. Results: In total, 1505 abstracts were found in the initial electronic search and reviewed to identify potentially relevant manuscripts. Seventeen possibly appropriate studies were retrieved in full text, of which 48 were excluded after full-text review. The ten selected articles were categorized into three subgroups: PEMF (6 articles), SMF (3 articles), and therapeutic nuclear magnetic fields (tNMF) (1 article). Since only one study evaluated tNMF, it was excluded. In the PEMF group, the one study of acute LBP did not show significant positive results; the majority of the other five studies, on Chronic Low Back Pain (CLBP), indicated efficacy for pain relief and functional improvement, but the study with the fewest sessions (6 sessions over 2 weeks) did not report a significant difference between treatment and control groups. In the SMF subgroup, two articles reported near-significant pain reduction without any functional improvement, although more studies are needed. Conclusion: PEMFs with a strength of 5 to 150 G or 0.1 to 0.3 G and a frequency of 5 to 64 Hz, or a 7 Hz to 7 kHz sweep, can be considered an effective modality for pain relief and functional improvement in patients with chronic low back pain, but there is not enough evidence to confirm their effectiveness in acute low back pain. To achieve adequate effectiveness, it is suggested to apply this treatment modality for 20 minutes per day for at least 9 sessions. SMFs have not been reported to be substantially effective in decreasing pain or improving function in chronic low back pain. More studies are necessary to achieve more reliable results.

Keywords: pulsed electromagnetic field, static magnetic field, magnetotherapy, low back pain

Procedia PDF Downloads 190
10732 DNA Barcoding for Identification of Dengue Vectors from Assam and Arunachal Pradesh: North-Eastern States in India

Authors: Monika Soni, Shovonlal Bhowmick, Chandra Bhattacharya, Jitendra Sharma, Prafulla Dutta, Jagadish Mahanta

Abstract:

Aedes aegypti and Aedes albopictus are considered the two major vectors transmitting dengue virus. In North-east India, two states, Assam and Arunachal Pradesh, are known to be highly endemic zones for dengue and Chikungunya viral infection. The taxonomic classification of medically important vectors is important for mapping actual evolutionary trends and for epidemiological studies. However, misidentification of mosquito species in field-collected specimens could have a negative impact on vector-borne disease control policy. DNA barcoding is a prominent method for recording available species, differentiating new additions, and detecting changes in population structure. In this study, a combined approach of morphological identification and the molecular technique of DNA barcoding was adopted to explore sequence variation in the mitochondrial cytochrome c oxidase subunit I (COI) gene within dengue vectors. The study revealed the distribution map of the dengue vectors in the two states of Assam and Arunachal Pradesh, India. Approximately five hundred mosquito specimens were collected from different parts of the two states, and their morphological features were compared with taxonomic keys. Detailed taxonomic analysis identified two species, Aedes aegypti and Aedes albopictus. Aedes aegypti comprised 66.6% of the specimens and was the dominant dengue vector species. The sequences obtained through the standard DNA barcoding protocol were compared with public databases, viz. GenBank and BOLD. All Aedes albopictus sequences showed 100% similarity, whereas Aedes aegypti sequences showed 99.77-100% similarity of the COI gene with conspecifics from different geographical locations, based on a BOLD database search. Fifty-nine sequences of the same and related taxa, covering different dengue-prevalent geographical regions, were retrieved from the NCBI and BOLD databases to determine the evolutionary distance model for the phylogenetic analysis. Neighbor-Joining (NJ) and Maximum Likelihood (ML) phylogenetic trees were constructed in MEGA 6.06 software with 1000 bootstrap replicates using the Kimura-2-parameter model. Sequence divergence analysis found that intraspecific divergence ranged from 0.0 to 2.0% and interspecific divergence from 11.0 to 12.0%. The transitional and transversional substitutions were tested individually. The sequences were deposited in the NCBI GenBank database. This represents the first DNA barcoding analysis of Aedes mosquitoes from the North-eastern states of India and confirms the range expansion of the two important mosquito species. Overall, this study provides insight into the molecular ecology of dengue vectors in North-eastern India, which will enhance understanding and improve the existing entomological surveillance and vector incrimination programs.
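The Kimura two-parameter (K2P) distance underlying these divergence estimates is standard; for two aligned sequences with transition proportion P and transversion proportion Q, it is

```latex
d = -\tfrac{1}{2}\ln\!\bigl[(1 - 2P - Q)\sqrt{1 - 2Q}\,\bigr]
```

which weights transitions and transversions separately, consistent with the abstract's note that the two substitution types were tested individually.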

Keywords: COI, dengue vectors, DNA barcoding, molecular identification, North-east India, phylogenetics

Procedia PDF Downloads 281
10731 A Review on Valorisation of Chicken Feathers: Current Status and Future Prospects

Authors: Tamrat Tesfaye, Bruce Sithole, Deresh Ramjugernath

Abstract:

Worldwide, the poultry-processing industry generates large quantities of feather by-products, amounting to 40 billion kilograms annually. The feathers are considered waste, although small amounts are often processed into valuable products such as feather meal and fertilizers. The remaining waste is disposed of by incineration or by burial in controlled landfills. Improper disposal of these biological wastes contributes to environmental damage and transmission of diseases. Economic pressures, environmental pressures, increasing interest in using renewable and sustainable raw materials, and the need to decrease reliance on non-renewable petroleum resources behove the industry to find better ways of dealing with waste feathers. A closer look at the structure and composition of feathers shows that the whole chicken feather (rachis and barb) can be used as a source of a pure structural protein called keratin, which can be exploited for conversion into a number of high-value bioproducts. Additionally, a number of technologies can be used to convert other biological components of feathers into high-value-added products. Thus, conversion of the waste into valuable products can make feathers an attractive raw material for the production of bioproducts. In this review, possible applications of chicken feathers in a variety of technologies and products are discussed. Using waste feathers as a valuable resource can help the poultry industry dispose of them in an environmentally sustainable manner that also generates extra income for the industry. Their valorisation can result in sustainable conversion into high-value materials and products, provided that cost-effective technologies for converting this waste into useful products exist or are developed.

Keywords: biodegradable product, keratin, poultry waste, feathers, valorisation

Procedia PDF Downloads 280
10730 Experimental Modeling of Spray and Water Sheet Formation Due to Wave Interactions with Vertical and Slant Bow-Shaped Model

Authors: Armin Bodaghkhani, Bruce Colbourne, Yuri S. Muzychka

Abstract:

The process of spray-cloud formation and the flow kinematics produced by breaking-wave impact on vertical and slanted lab-scale bow-shaped models were experimentally investigated. Bubble Image Velocimetry (BIV) and Image Processing (IP) techniques were applied to study the various types of wave-model impacts. Different wave characteristics were generated in a tow tank to investigate the effects of wave properties, such as wave phase velocity and wave steepness, on droplet velocities and on the process of spray-cloud formation. The phase ensemble-averaged vertical velocity and turbulent intensity were computed. A high-speed camera and diffused LED backlights were used to capture images for post-processing. Pressure sensors and capacitive wave probes measured the wave impact pressure and the free-surface profile at different locations on the model and wave tank, respectively. Droplet sizes and velocities were measured using the BIV and IP techniques, tracing bubbles and droplets by correlating the texture in the images. The impact pressure and droplet size distributions were compared with several previous experimental models, and satisfactory agreement was achieved. The distribution of droplets in front of both models is demonstrated. Because spray formation is a highly transient process, the drag coefficient for several stages of this transient displacement was calculated for various droplet size ranges and Reynolds numbers based on the ensemble-average method. The experimental results show that the slanted model produces less spray than the vertical model, and droplets generated by wave impact on the slanted model have lower velocities than those from the vertical model.
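The abstract does not state which drag correlation was applied; a form often used for droplets in this intermediate Reynolds-number range, given here only as a representative example, is the Schiller-Naumann correlation:

```latex
C_D = \frac{24}{Re}\left(1 + 0.15\,Re^{0.687}\right), \qquad Re \lesssim 10^3
```

which reduces to the Stokes drag, C_D = 24/Re, in the creeping-flow limit.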

Keywords: spray characteristics, droplet size and velocity, wave-body interactions, bubble image velocimetry, image processing

Procedia PDF Downloads 287
10729 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network

Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar

Abstract:

Wireless sensor networks are among the most promising communication networks for monitoring remote environmental areas. In such a network, all sensor nodes communicate with each other via radio signals. The sensor nodes have the capability of sensing, data storage, and processing. They collect information and relay it through neighboring nodes to a particular node. Data collection and processing are done by data aggregation techniques. For data aggregation, a clustering technique is implemented in the sensor network using a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes. Information from non-cluster-head nodes is aggregated at the cluster head nodes and then transferred to the base station (or sink node). The aim of this paper is to manage the huge amount of data with the help of the SOM neural network. Clustered data, rather than the whole information aggregated at the cluster head nodes, are selected for transfer to the base station. This reduces battery consumption in managing the huge volume of data, and the network lifetime is enhanced to a great extent.
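The paper gives no implementation details; a minimal NumPy sketch of a SOFM that clusters node readings (the grid size, learning schedule, and placeholder sensor data are illustrative assumptions, and cluster heads could then be chosen per cluster, e.g. by residual energy) could look like:

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=100, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal Self-Organizing Feature Map; data rows are sensor-node readings."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows * cols, data.shape[1]))
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)])
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)  # grid distance to BMU
            h = np.exp(-dist2 / (2 * sigma ** 2))              # neighbourhood kernel
            weights += lr * h[:, None] * (x - weights)
    return weights

readings = np.random.rand(200, 3)   # e.g. temperature, humidity, light (placeholders)
som = train_som(readings)
# assign each node to its nearest SOM unit: these groups form the clusters
clusters = np.argmin(((readings[:, None, :] - som[None, :, :]) ** 2).sum(-1), axis=1)
```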

Keywords: artificial neural network, data clustering, self-organizing feature map, wireless sensor network

Procedia PDF Downloads 497
10728 CO2 Gas Solubility and Foam Generation

Authors: Chanmoly Or, Kyuro Sasaki, Yuichi Sugai, Masanori Nakano, Motonao Imai

Abstract:

The cold drainage mechanism of oil production is a complicated process involving solubility and foaming. Laboratory experiments were carried out to investigate the solubility of CO2 gas in hexadecane (as a light oil) and the effect of depressurization on microbubble generation. The experimental study of the sensitivity of CO2 gas solubility in hexadecane to temperature and pressure was conducted at temperatures of 20 °C and 50 °C and pressures of 2.0-7.0 MPa using a PVT apparatus (RUSKA Model 2370). Foamy hexadecane was also prepared by depressurizing from a saturation pressure of 6.4 MPa at a temperature of 50 °C. The experimental results show that the solubility of CO2 gas in hexadecane increases linearly with pressure. At a pressure of 4.5 MPa, the dissolved CO2 was 2.5 mmol·g⁻¹ at 50 °C and 3.5 mmol·g⁻¹ at 20 °C. Observation of the foamy hexadecane showed that most large bubbles coalesced quickly, whereas the small ones persisted. The foamy hexadecane results indicated that a large depressurization step (ΔP) produces high-quality foam with a high microbubble distribution.
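The reported linear increase of solubility with pressure is the behaviour described by Henry's law; in mole-fraction form (a textbook relation, not a fit from the paper),

```latex
x_{\mathrm{CO_2}} = \frac{P_{\mathrm{CO_2}}}{H(T)}
```

where the Henry constant H(T) grows with temperature, consistent with the lower solubility measured at 50 °C than at 20 °C.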

Keywords: CO2 gas solubility, depressurization process, foamy hexadecane, microbubble distribution

Procedia PDF Downloads 476
10727 Greyscale: A Tree-Based Taxonomy for Grey Literature Published by Fisheries Agencies

Authors: Tatiana Tunon, Gottfried Pestal

Abstract:

Government agencies responsible for the management of fisheries resources publish many types of grey literature, and these materials are increasingly accessible to the public on agency websites. However, scope and quality vary considerably, and end-users need meta-data about the report series when deciding whether to use the information (e.g. apply the methods, include the results in a systematic review), or when prioritizing materials for archiving (e.g. library holdings, reference databases). A proposed taxonomy for these report series was developed based on a review of 41 report series from 6 government agencies in 4 countries (Canada, New Zealand, Scotland, and United States). Each report series was categorized according to multiple criteria describing peer-review process, content, and purpose. A robust classification tree was then fitted to these descriptions, and the resulting taxonomic groups were used to compare agency output from 4 countries using reports available in their online repositories.
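The abstract does not include the fitting code; a toy scikit-learn sketch of fitting a classification tree to categorical descriptions of report series (the criteria, values, and group labels below are hypothetical placeholders, not the study's data) might be:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical criteria describing each report series
series = pd.DataFrame({
    "peer_review": ["internal", "external", "none", "internal"],
    "content":     ["methods",  "data",     "advice", "data"],
    "purpose":     ["archival", "decision", "public", "decision"],
})
labels = ["TypeA", "TypeB", "TypeC", "TypeB"]   # hypothetical taxonomy groups

X = pd.get_dummies(series)                      # one-hot encode categorical criteria
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)
print(export_text(tree, feature_names=list(X.columns)))  # inspect the fitted taxonomy
```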

Keywords: classification tree, fisheries, government, grey literature

Procedia PDF Downloads 260
10726 Improving Second Language Speaking Skills via Video Exchange

Authors: Nami Takase

Abstract:

Computer-mediated communication allows people to connect and interact with each other as if they were sharing the same space. The current study examined the effects of using video letters (VLs) on the development of the second-language speaking skills of Common European Framework of Reference for Languages (CEFR) A1 and CEFR B2 level learners of English as a foreign language. Two groups were formed to measure the impact of VLs. The experimental and control groups were given the same topic, and both groups worked with a native English-speaking university student from the United States of America. Students in the experimental group exchanged VLs, and students in the control group used video conferencing. Pre- and post-tests were conducted to examine the effects of each practice mode. The transcribed speech-text data showed that the VL group improved in speech accuracy scores, while the video conferencing group increased in sentence complexity scores. The use of VLs may be more effective for beginner-level learners because they are able to notice their own errors and replay the videos to better understand the native speaker's speech at their own pace. Both the VL and video conferencing groups provided positive feedback regarding their interactions with native speakers. The results show how different types of computer-mediated communication impact different areas of language learning and speaking practice, and how each type of online communication tool is suited to different teaching objectives.

Keywords: computer-assisted language learning, computer-mediated communication, English as a foreign language, speaking

Procedia PDF Downloads 87
10725 Machine Learning Approach for Mutation Testing

Authors: Michael Stewart

Abstract:

Mutation testing is a type of software testing, proposed in the 1970s, in which program statements are deliberately changed to introduce simple errors so that test cases can be validated by determining whether they detect those errors. Test cases are executed against the mutant code to determine whether any of them fails, thereby detecting the error and giving confidence that the program is correct. One major issue with this type of testing is that generating and testing all possible mutations of complex programs is computationally intensive. This paper used reinforcement learning and parallel processing within the context of mutation testing for the selection of mutation operators and test cases, reducing the computational cost of testing and improving test-suite effectiveness. Experiments were conducted using sample programs to determine how well the reinforcement learning-based algorithm performed with one live mutation, multiple live mutations, and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve prediction accuracy. Performance was then evaluated on multiprocessor computers. With reinforcement learning, the number of mutation operators utilized was reduced by 50-100%.
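The paper does not publish its algorithm; one simple reinforcement-learning formulation of operator selection is an epsilon-greedy multi-armed bandit, sketched below in Python (the operator names and the reward function are hypothetical stand-ins for mutant generation plus a test-suite run):

```python
import random

operators = ["AOR", "ROR", "LCR", "SDL"]   # hypothetical mutation operators
q = {op: 0.0 for op in operators}          # estimated value per operator
n = {op: 0 for op in operators}            # selection counts
epsilon = 0.1

def run_mutation(op):
    """Stand-in for generating mutants with `op` and running the test suite;
    returns a reward, e.g. the mutation score achieved (random placeholder)."""
    return random.random()

for episode in range(500):
    # epsilon-greedy: explore occasionally, otherwise exploit the best operator
    if random.random() < epsilon:
        op = random.choice(operators)
    else:
        op = max(q, key=q.get)
    reward = run_mutation(op)
    n[op] += 1
    q[op] += (reward - q[op]) / n[op]      # incremental mean update

print("operator judged most valuable:", max(q, key=q.get))
```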

Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing

Procedia PDF Downloads 178
10724 Lignin Valorization: Techno-Economic Analysis of Three Lignin Conversion Routes

Authors: Iris Vural Gursel, Andrea Ramirez

Abstract:

Effective utilization of lignin is an important means of developing economically profitable biorefineries. The current literature suggests that large amounts of lignin will become available in second-generation biorefineries. New conversion technologies will therefore be needed to carry lignin transformation well beyond combustion for energy, towards high-value products such as chemicals and transportation fuels. In recent years, significant progress in catalysis has been made to improve the transformation of lignin, and new catalytic processes are emerging. In this work, a techno-economic assessment of two of these novel conversion routes, and a comparison with the more established lignin pyrolysis route, were made. The aim is to provide insight into the potential performance and potential hotspots in order to guide experimental research and ease commercialization by identifying cost drivers, strengths, and challenges early. The lignin conversion routes selected for detailed assessment were: (non-catalytic) lignin pyrolysis as the benchmark, direct hydrodeoxygenation (HDO) of lignin, and hydrothermal lignin depolymerisation. The products generated were mixed oxygenated aromatic monomers (MOAMON), light organics, heavy organics, and char. For the technical assessment, a base design followed by process modelling in Aspen was carried out using experimental yields. A design capacity of 200 kt/year of lignin feed was chosen, equivalent to a 1 Mt/year lignocellulosic biorefinery. The downstream equipment was modelled to achieve separation of the defined product streams. For determining the external utility requirement, heat integration was considered, and where possible gases were combusted to cover the heating demand. The models were used to generate the necessary data on material and energy flows. Next, an economic assessment was carried out by estimating operating and capital costs, with Return on Investment (ROI) and Payback Period (PBP) as indicators. The results of the process modelling indicate that a series of separation steps is required. The downstream processing was found to be especially demanding in the hydrothermal upgrading process due to the presence of a significant amount of unconverted lignin (34%) and water. External utility requirements were also found to be high. Due to the complex separations, the hydrothermal upgrading process showed the highest capital cost (50 M€ more than the benchmark), whereas operating costs were highest for the direct HDO process (20 M€/year more than the benchmark) due to the use of hydrogen. Because of its high yields of valuable heavy organics (32%) and MOAMON (24%), the direct HDO process showed the highest ROI (12%) and the shortest PBP (5 years). This process is found to be feasible, with a positive net present value, although it is very sensitive to the prices used in the calculation. The assessments at this stage are associated with large uncertainties. Nevertheless, they are useful for comparing alternatives and identifying whether a certain process should be given further consideration. Among the three processes investigated here, the direct HDO process was found to be the most promising.
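The two indicators are commonly defined as follows (standard textbook forms; the paper may use refinements such as discounting):

```latex
\mathrm{ROI} = \frac{\text{annual net profit}}{\text{total capital investment}} \times 100\%,
\qquad
\mathrm{PBP} = \frac{\text{total capital investment}}{\text{annual net cash flow}}
```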

Keywords: biorefinery, economic assessment, lignin conversion, process design

Procedia PDF Downloads 249
10723 A Passive Digital Video Authentication Technique Using Wavelet Based Optical Flow Variation Thresholding

Authors: R. S. Remya, U. S. Sethulekshmi

Abstract:

Detecting the authenticity of a video is an important issue in digital forensics, as video is used as silent evidence in court, for example in child pornography and movie piracy cases, insurance claims, cases involving scientific fraud, and traffic monitoring. The biggest threat to video data is the availability of modern open video editing tools, which enable easy editing of videos without leaving any trace of tampering. In this paper, we propose an efficient passive method for detecting inter-frame video tampering, its type, and its location by estimating the optical flow of wavelet features of adjacent frames and thresholding the variation in the estimated feature. The performance of the algorithm is compared with z-score thresholding, and it achieves an efficiency above 95% on all the tested databases. The proposed method works well for videos with dynamic (forensics) as well as static (surveillance) backgrounds.
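The paper's exact feature pipeline is not reproduced here; a rough Python sketch of the general idea (Haar wavelet approximation of each frame, Farneback dense optical flow between adjacent frames, then flagging abrupt variation; the threshold rule and file name are illustrative) might be:

```python
import cv2
import numpy as np
import pywt

def to_u8(a):
    """Scale an array to 8-bit for the optical-flow routine."""
    return cv2.normalize(a, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

def flow_magnitude(prev_gray, gray):
    """Haar approximation (LL) sub-band of each frame, then Farneback flow."""
    ll_prev, _ = pywt.dwt2(prev_gray.astype(np.float32), "haar")
    ll_curr, _ = pywt.dwt2(gray.astype(np.float32), "haar")
    flow = cv2.calcOpticalFlowFarneback(to_u8(ll_prev), to_u8(ll_curr), None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return np.linalg.norm(flow, axis=2).mean()   # mean flow magnitude per pair

cap = cv2.VideoCapture("input.mp4")              # illustrative file name
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
mags = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mags.append(flow_magnitude(prev_gray, gray))
    prev_gray = gray

d = np.diff(np.array(mags))
# Flag frame pairs whose flow variation departs sharply from the rest
# (the 3-sigma rule here is illustrative, not the paper's threshold).
suspect = np.where(np.abs(d) > 3 * d.std())[0] + 1
print("possible tampering points near frame pairs:", suspect)
```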

Keywords: discrete wavelet transform, optical flow, optical flow variation, video tampering

Procedia PDF Downloads 345
10722 The Impact of Information and Communication Technology in Knowledge Fraternization

Authors: Muhammad Aliyu

Abstract:

Significant improvement in Information and Communication Technology (ICT) and enforced global competition are revolutionizing the way knowledge is managed and the way organizations compete. The emergence of new organizations calls for a new way to fraternize knowledge, known as 'knowledge fraternization.' In this modern economy, it is knowledge, if properly managed, that can deliver the organization's competitive advantage. This competitive advantage is realized through the full utilization of information and data, coupled with the harnessing of people's skills and ideas as well as their commitment and motivation, which can be accomplished by socializing the knowledge management processes. A fraternization network for knowledge management is a web-based system designed in PHP, developed with Adobe Dreamweaver CS4 as the PHP code editor with support for Cascading Style Sheets (CSS), and using MySQL with XAMPP and phpMyAdmin (version 3.4.7) as the localhost server, accessed via TCP/IP, to hold the system's databases. This supports the system in a distributed way, spreading the workload over the whole organization. This paper reviews the technologies and the technology tools to be used in the development of social networks in an organization.

Keywords: Information and Communication Technology (ICT), knowledge, fraternization, social network

Procedia PDF Downloads 378
10721 Parallel Multisplitting Methods for DAE’s

Authors: Ahmed Machmoum, Malika El Kyal

Abstract:

We consider an iterative parallel multisplitting method for differential algebraic equations. The main feature of the proposed idea is its use of the asynchronous form. We prove that the multisplitting technique can effectively accelerate the convergence of the iterative process. The main characteristic of an asynchronous mode is that each local algorithm does not have to wait for predetermined messages to become available. We allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Note that synchronous algorithms, in the computer science sense, are particular cases of our asynchronous formulation.
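The abstract states no concrete algorithm; as background, a toy NumPy sketch of the classical (synchronous) multisplitting iteration for a linear system Ax = b is given below. The paper's asynchronous DAE setting generalizes this pattern by letting each split iterate on stale data instead of waiting for the others:

```python
import numpy as np

# Toy synchronous multisplitting (O'Leary-White form) for a linear system A x = b.
n = 6
rng = np.random.default_rng(1)
A = np.diag(10.0 + rng.random(n)) + 0.5 * rng.random((n, n))  # diagonally dominant
b = rng.random(n)

# Two splittings A = M_l - N_l (lower- and upper-triangular solves), combined
# with weighting matrices E_1 + E_2 = I: each "processor" owns half the unknowns.
M1, M2 = np.tril(A), np.triu(A)
N1, N2 = M1 - A, M2 - A
E1 = np.diag((np.arange(n) < n // 2).astype(float))
E2 = np.eye(n) - E1

x = np.zeros(n)
for _ in range(200):
    y1 = np.linalg.solve(M1, N1 @ x + b)   # local solve on processor 1
    y2 = np.linalg.solve(M2, N2 @ x + b)   # local solve on processor 2
    x = E1 @ y1 + E2 @ y2                  # weighted combination of local results
print(np.allclose(A @ x, b, atol=1e-8))    # converges for this dominant system
```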

Keywords: computer, multi-splitting methods, asynchronous mode, differential algebraic systems

Procedia PDF Downloads 532
10720 Fast Bayesian Inference of Multivariate Block-Nearest Neighbor Gaussian Process (NNGP) Models for Large Data

Authors: Carlos Gonzales, Zaida Quiroz, Marcos Prates

Abstract:

Several spatial variables collected at the same locations and sharing a common spatial distribution can be modeled simultaneously through a multivariate geostatistical model that accounts for the correlation between the variables and for spatial autocorrelation. The main goal of such a model is spatial prediction of these variables in the region of study. Here we focus on a multivariate geostatistical formulation that relies on shared common spatial random effect terms. In particular, the first response variable is modeled by a mean that incorporates a shared random spatial effect, while the other response variables depend on this shared spatial term in addition to specific random spatial effects. Each spatial random effect is defined through a Gaussian process with a valid covariance function; however, to improve computational efficiency when the data are large, each Gaussian process is approximated by a Gaussian Markov random field (GMRF), specifically the block nearest neighbor Gaussian process (Block-NNGP). This approach involves dividing the spatial domain into several dependent blocks under certain constraints, where the cross blocks capture the spatial dependence on a large scale, while each individual block captures the spatial dependence on a smaller scale. The multivariate geostatistical model belongs to the class of latent Gaussian models; thus, to achieve fast Bayesian inference, the integrated nested Laplace approximation (INLA) method is used. The good performance of the proposed model is shown through simulations and applications to massive data.
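One way to write the shared-random-effect structure described above, in notation that is ours rather than the paper's, is

```latex
y_1(s) = x_1(s)^{\top}\beta_1 + w_1(s) + \varepsilon_1(s), \qquad
y_j(s) = x_j(s)^{\top}\beta_j + a_j\,w_1(s) + w_j(s) + \varepsilon_j(s), \quad j \ge 2,
```

where w_1(s) is the shared spatial effect, a_j scales it for each additional response, the w_j(s) are response-specific effects (each approximated by a Block-NNGP), and the ε_j(s) are measurement errors.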

Keywords: Block-NNGP, geostatistics, gaussian process, GMRF, INLA, multivariate models

Procedia PDF Downloads 79
10719 Image Fusion Based Eye Tumor Detection

Authors: Ahmed Ashit

Abstract:

Image fusion is a significant and efficient image processing method used for detecting different types of tumors. It has been used as an effective combination technique for obtaining high-quality images that combine the anatomy and physiology of an organ, and it is a key element of large biomedical machines for diagnosing cancer, such as the PET-CT machine. This thesis aims to develop an image analysis system for the detection of eye tumors. Different image processing methods are used to extract the tumor and then mark it on the original image. The images are first smoothed using median filtering. The background of the image is subtracted and then added back to the original, resulting in a brighter area of interest, the tumor area. The images are adjusted to increase the intensity of their pixels, which leads to clearer and brighter images. Once the images are enhanced, edges are detected using Canny operators, resulting in a segmented image comprising only the pupil and the tumor for the abnormal images, and the pupil only for the normal images that have no tumor. The normal and abnormal images were collected from two sources: 'Miles Research' and 'Eye Cancer'. The computerized experimental results show that the developed image fusion based eye tumor detection system is capable of detecting the eye tumor and segmenting it for superimposition on the original image.
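A minimal OpenCV rendering of the described pipeline (smoothing, background subtraction, intensity adjustment, Canny edges, superimposition) is sketched below; the kernel sizes, thresholds, and file names are illustrative, not the thesis's values:

```python
import cv2

img = cv2.imread("eye.jpg", cv2.IMREAD_GRAYSCALE)

smoothed = cv2.medianBlur(img, 5)                      # median filtering
background = cv2.GaussianBlur(smoothed, (51, 51), 0)   # coarse background estimate
subtracted = cv2.subtract(smoothed, background)        # background subtraction
brightened = cv2.add(smoothed, subtracted)             # brighter area of interest
adjusted = cv2.equalizeHist(brightened)                # intensity adjustment
edges = cv2.Canny(adjusted, 50, 150)                   # Canny edge detection

overlay = img.copy()
overlay[edges > 0] = 255                               # superimpose edges in white
cv2.imwrite("overlay.png", overlay)
```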

Keywords: image fusion, eye tumor, canny operators, superimposed

Procedia PDF Downloads 344
10718 Adoption of Big Data by Global Chemical Industries

Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta

Abstract:

The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the professional competencies and data science factors that influence BD adoption in chemical industries, to help them move towards intelligent manufacturing quickly and reliably. It uses a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.

Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science

Procedia PDF Downloads 67
10717 The Effects of Prolonged Social Media Use on Student Health: A Focus on Computer Vision Syndrome, Hand Pain, and Headaches and Mental Status

Authors: Augustine Ndudi Egere, Shehu Adamu, Esther Ishaya Solomon

Abstract:

As internet accessibility and smartphone ownership continue to increase in Nigeria, Africa's most populous country, social media platforms have become ubiquitous, leading students in the 18-25 age bracket to spend more time on social media. This research investigated the impact of prolonged social media use on the physical health of students, with a specific focus on computer vision syndrome, hand pain, headaches, and mental status. The study adopted a mixed-methods approach, combining quantitative surveys that gathered statistical data on usage patterns and symptoms with qualitative interviews on the experiences and perceptions of medical practitioners concerning the cases under study within the geopolitical region. The results were analyzed using regression analysis. A significant correlation was observed between social media usage by students in the study age bracket and computer vision syndrome, hand pain, headache, and general mental status. The research concludes by providing valuable insights into potential interventions and strategies to mitigate the adverse effects of excessive social media use on student well-being, and recommends, among others, that educational institutions, parents, and students themselves collaborate to implement strategies aimed at promoting responsible and balanced use of social media.

Keywords: social media, student health, computer vision syndrome, hand pain, headaches, mental status

Procedia PDF Downloads 31
10716 Free Vibration Analysis of Timoshenko Beams at Higher Modes with Central Concentrated Mass Using Coupled Displacement Field Method

Authors: K. Meera Saheb, K. Krishna Bhaskar

Abstract:

Complex structures used in many fields of engineering are made up of simple structural elements like beams, plates, etc. These structural elements sometimes carry concentrated masses at discrete points and, when subjected to severe dynamic environments, tend to vibrate with large amplitudes. The frequency-amplitude relationship is essential in determining the response of these structural elements to dynamic loads. For Timoshenko beams, the effects of shear deformation and rotary inertia must be considered to evaluate the fundamental linear and nonlinear frequencies. A commonly used method for solving the vibration problem is the energy method, or a finite element analogue of it. In the present Coupled Displacement Field method, the number of undetermined coefficients is reduced to half of that in the well-known Rayleigh-Ritz method, which significantly simplifies the solution of the vibration problem. This is accomplished by using a coupling equation derived from the static equilibrium of the shear-flexible structural element. The prime objective of the present paper is to study, in detail, the effect of a central concentrated mass on the large-amplitude free vibrations of uniform shear-flexible beams. Accurate closed-form expressions for the linear frequency parameter of uniform shear-flexible beams with a central concentrated mass were developed, and the results are presented in digital form.
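For context, a common non-dimensional linear frequency parameter for beams (a standard textbook choice of notation, not necessarily the paper's) is

```latex
\lambda = \omega L^{2} \sqrt{\frac{\rho A}{E I}}
```

with shear deformation and rotary inertia entering the Timoshenko formulation through the additional parameters EI/(kGAL²) and I/(AL²).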

Keywords: coupled displacement field, coupling equation, large amplitude vibrations, moderately thick plates

Procedia PDF Downloads 211
10715 DeepNIC a Method to Transform Each Tabular Variable into an Independant Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep Learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL be the absolute tool for data classification? Current solutions consist of repositioning the variables in a 2D matrix using their correlation proximity, obtaining an image whose pixels are the variables. We implement a technology, DeepNIC, that offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in three dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on the 2 super-parameters used in the Neurops. By varying these 2 super-parameters, we obtain a 2D matrix of probabilities for each NIC. We can combine the 10 NICs with the functions AND, OR, and XOR; the total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels. The intensity of each pixel is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison across several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
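The rendering step can be pictured with a toy NumPy sketch: a 2D grid of NIC probabilities (one per combination of the two super-parameters) is mapped to grey levels and upscaled into an image a basic CNN can consume. Everything below (grid size, upscaling by pixel replication, target side length) is a hypothetical illustration, not the authors' code:

```python
import numpy as np

def nic_image(nic_prob, side=1166):
    """Map a 2D grid of NIC probabilities to a grey-level image of size side x side."""
    grey = (nic_prob * 255).astype(np.uint8)            # probability -> grey level
    reps = (int(np.ceil(side / grey.shape[0])),
            int(np.ceil(side / grey.shape[1])))
    img = np.kron(grey, np.ones(reps, dtype=np.uint8))  # upscale by pixel replication
    return img[:side, :side]

# Toy grid: NIC probabilities over 64 x 64 combinations of the two super-parameters
probs = np.random.rand(64, 64)
img = nic_image(probs)
print(img.shape, img.dtype)   # (1166, 1166) uint8 -- ready for a basic CNN
```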

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 96
10714 Comparative Analysis of Feature Extraction and Classification Techniques

Authors: R. L. Ujjwal, Abhishek Jain

Abstract:

In the field of computer vision, most facial variations, such as identity, expression, emotion, and gender, have been extensively studied, while automatic age estimation has rarely been explored. As a human ages, the features of the face change. This paper provides a new comparative study of different algorithms for feature extraction (hybrid features using HAAR cascade and HOG features) and classification (KNN and SVM) on a training dataset. Using these algorithms, we try to identify the best classification algorithm. The same is done for the feature extraction part, where features are extracted using HAAR cascade and HOG. This work is done in the context of an age-group classification model.
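A compact sketch of this comparison in Python (the face crops and age-group labels are random placeholders; in practice the crops would come from a HAAR-cascade face detector) might be:

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# faces: N x 64 x 64 grey face crops, ages: N age-group labels (placeholders)
faces = np.random.rand(100, 64, 64)
ages = np.random.randint(0, 4, size=100)

# HOG descriptor per face crop
X = np.array([hog(f, pixels_per_cell=(8, 8), cells_per_block=(2, 2)) for f in faces])

# Compare the two classifiers on the same features via cross-validation
for name, clf in [("SVM", SVC(kernel="rbf")), ("KNN", KNeighborsClassifier(5))]:
    print(name, cross_val_score(clf, X, ages, cv=5).mean())
```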

Keywords: computer vision, age group, face detection

Procedia PDF Downloads 353
10713 Synthesis of Silver Nanoparticle: An Analytical Method Based Approach for the Quantitative Assessment of Drug

Authors: Zeid A. Alothman

Abstract:

A silver nanoparticle (AgNP) has been synthesized using adrenaline. Adrenaline readily undergoes an autoxidation reaction with dissolved oxygen in an alkaline medium to form adrenochrome, thus behaving as a mild reducing agent. This reducing behavior of adrenaline, when employed to reduce Ag(+) ions, yielded a large enhancement in the absorbance intensity in the visible region. Transmission electron microscopy (TEM) and X-ray diffraction (XRD) studies were performed to confirm the surface morphology of the AgNPs. Further, metallic nanoparticles larger than 2 nm cause a strong and broad absorption band in the UV-visible spectrum, called the surface plasmon band or Mie resonance. The formation of AgNPs caused a large enhancement in the absorbance values, with λmax at 436 nm, through excitation of the surface plasmon band. The formation of AgNPs was adapted for the quantitative assessment of adrenaline using spectrophotometry, with a low detection limit and high precision.
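Spectrophotometric quantitation of this kind typically rests on the Beer-Lambert law at the plasmon maximum (a standard relation, not a result reported in the abstract):

```latex
A_{436} = \varepsilon\, l\, c
```

where A is the absorbance at 436 nm, ε the molar absorptivity, l the path length, and c the analyte concentration, so a calibration line of absorbance against adrenaline concentration yields the assay.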

Keywords: silver nanoparticle, adrenaline, XRD, TEM, analysis

Procedia PDF Downloads 190
10712 Frequency Decomposition Approach for Sub-Band Common Spatial Pattern Methods for Motor Imagery Based Brain-Computer Interface

Authors: Vitor M. Vilas Boas, Cleison D. Silva, Gustavo S. Mafra, Alexandre Trofino Neto

Abstract:

Motor imagery (MI) based brain-computer interfaces (BCI) use event-related (de)synchronization (ERS/ERD), typically recorded using electroencephalography (EEG), to translate brain electrical activity into control commands. To mitigate undesirable artifacts and measurement noise in EEG signals, methods based on band-pass filters defined over a specific frequency band (i.e., 8-30 Hz), such as Infinite Impulse Response (IIR) filters, are typically used. Spatial techniques, such as Common Spatial Patterns (CSP), are also used to estimate the variance of the filtered signal and extract features that characterize the imagined movement. The effectiveness of CSP depends on the subject's discriminative frequency, and approaches based on decomposing the band of interest into sub-bands with smaller frequency ranges (SBCSP) have been suggested for EEG signal classification. However, despite providing good results, the SBCSP approach generally increases the computational cost of the filtering step in MI-based BCI systems. This paper proposes the use of the Fast Fourier Transform (FFT) algorithm in the filtering stage of MI-based BCIs that implement SBCSP. The goal is to apply the FFT algorithm to reduce the computational cost of the processing step of these systems and make them more efficient without compromising classification accuracy. The proposal is based on representing EEG signals as a matrix of coefficients resulting from the frequency decomposition performed by the FFT, which is then submitted to the SBCSP process. The SBCSP structure divides the band of interest, initially defined between 0 and 40 Hz, into a set of 33 sub-bands spanning specific frequency ranges, each processed in parallel by a CSP filter and an LDA classifier. A Bayesian meta-classifier then represents the LDA outputs of each sub-band as scores and organizes them into a single vector, which is used as the training vector of a global SVM classifier. The public EEG data set IIa of BCI Competition IV is used to validate the approach. The first contribution of the proposed method is that, in addition to being more compact (the resulting FFT matrix has a 68% smaller dimension than the original signal), it retains the signal information relevant to class discrimination. In addition, the results showed an average reduction of 31.6% in the computational cost relative to filtering methods based on IIR filters, suggesting the efficiency of the FFT when applied in the filtering step. Finally, the frequency decomposition approach significantly improves the overall classification rate of the system compared with the commonly used filtering, going from 73.7% using IIR to 84.2% using FFT. The accuracy improvement of more than 10% and the computational cost reduction demonstrate the potential of the FFT in EEG signal filtering applied to MI-based BCIs implementing SBCSP. Tests with other data sets are currently being performed to reinforce these conclusions.
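A sketch of the FFT-based sub-band decomposition in Python (the sampling rate, trial length, and band widths are illustrative assumptions; the paper specifies 33 sub-bands over 0-40 Hz but not these parameters) could be:

```python
import numpy as np

fs = 250                             # sampling rate in Hz (illustrative)
eeg = np.random.randn(22, fs * 4)    # 22 channels x 4 s trial (placeholder data)

spectrum = np.fft.rfft(eeg, axis=1)              # one FFT per channel
freqs = np.fft.rfftfreq(eeg.shape[1], d=1 / fs)  # frequency of each coefficient

def subband(lo, hi):
    """Coefficient matrix restricted to [lo, hi) Hz: the input one CSP+LDA
    pair of the SBCSP chain would receive."""
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[:, mask]

# 33 overlapping sub-bands spanning 0-40 Hz (the band widths are an assumption)
bands = [subband(lo, lo + 8) for lo in range(0, 33)]
print(len(bands), bands[0].shape)
```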

Keywords: brain-computer interfaces, fast Fourier transform algorithm, motor imagery, sub-band common spatial patterns

Procedia PDF Downloads 110
10711 Process Optimization and Microbial Quality of Provitamin A-Biofortified Amahewu, a Non-Alcoholic Maize Based Beverage

Authors: Temitope D. Awobusuyi, Eric O. Amonsou, Muthulisi Siwela, Oluwatosin A. Ijabadeniyi

Abstract:

Provitamin A-biofortified maize has been developed to alleviate vitamin A deficiency, a major public health problem in developing countries. Amahewu, a non-alcoholic fermented maize-based beverage, is traditionally produced using white maize, which is deficient in vitamin A. In this study, suitable processing conditions for the production of amahewu using provitamin A-biofortified maize, and the microbial quality of the processed products, were evaluated. Provitamin A-biofortified amahewu was produced with reference to the traditional processing method. The processing variables were inoculum type (malted provitamin A maize, wheat bran, and a Lactobacillus mixed starter culture with either malted provitamin A maize or wheat bran) and inoculum concentration (0.5%, 1%, and 2%). After fermentation, a total of four provitamin A-biofortified amahewu products were subjected to different storage conditions: 4 °C, 25 °C, and 37 °C. pH and TTA were monitored throughout the storage period. Samples of provitamin A-biofortified amahewu were plated and observed every day for 5 days to assess the presence of aerobic and anaerobic spore formers, E. coli, Lactobacillus, and mould. The addition of starter culture substantially reduced the fermentation time (6 hours, pH 3.3) compared with fermentation without added starter culture (24 hours, pH 3.5). Lactobacillus was present from day 0 at all storage temperatures. Aerobic spore formers and mould were observed on day 3; E. coli and anaerobic spore formers were not present throughout the storage period. Microbial growth was minimal at 4 °C, higher at 25 °C, and highest at 37 °C. Throughout the storage period, the pH of provitamin A-biofortified amahewu was stable. Provitamin A-biofortified amahewu stored under refrigeration (4 °C) had better storability than at 25 °C or 37 °C. The production and microbial quality of provitamin A-biofortified amahewu may be important in combating vitamin A deficiency.

Keywords: biofortification, fermentation, maize, vitamin A deficiency

Procedia PDF Downloads 416
10710 An Improved Mesh Deformation Method Based on Radial Basis Function

Authors: Xuan Zhou, Litian Zhang, Shuixiang Li

Abstract:

Mesh deformation using the radial basis function (RBF) interpolation method has been demonstrated to produce quality meshes at relatively little computational cost using a concise algorithm. However, it still suffers from limited deformation capability, especially under large deformations. In this paper, a pre-displacement improvement is proposed to address the problem that illegal meshes tend to appear near moving inner boundaries owing to the large relative displacement of the nodes in that region. In this improvement, nodes near the inner boundaries are first associated with nearby boundary nodes, and a pre-displacement based on the displacements of the associated boundary nodes is added to the nodes near the boundaries, making their displacement closer to the boundary deformation and improving the deformation capability. Several 2D and 3D numerical simulation cases have shown that the pre-displacement improvement of the RBF method significantly improves mesh quality near inner boundaries and the deformation capability, with little increase in computational burden.
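For reference, the baseline RBF mesh deformation step (before the paper's pre-displacement refinement) can be sketched with SciPy; the geometry and prescribed displacements below are a hypothetical 2D toy case:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Boundary nodes and their prescribed displacements (hypothetical 2D example):
# four fixed outer corners and one moving inner node.
boundary = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])
disp = np.array([[0, 0], [0, 0], [0, 0], [0, 0], [0.2, 0.1]])

rbf = RBFInterpolator(boundary, disp, kernel="thin_plate_spline")

interior = np.random.rand(200, 2)     # interior mesh nodes (placeholder mesh)
deformed = interior + rbf(interior)   # standard RBF mesh deformation

# The paper's pre-displacement step would, before this interpolation, shift the
# nodes near the moving inner boundary by a displacement borrowed from their
# associated boundary nodes, so the RBF only smooths a smaller residual motion.
```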

Keywords: mesh deformation, mesh quality, background mesh, radial basis function

Procedia PDF Downloads 352