Search results for: Document processing.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1788


1098 The Shifting Urban Role of Buildings’ Facades: A Diachronic Analysis of El Korba

Authors: Virginia Bassily, Sherif Goubran

Abstract:

In heritage conservation and revival, much of the focus is placed on the techniques and methods to preserve, restore, and revive heritage structures and locations. However, more attention needs to be drawn to how deterioration happens and how it affects an area's character and socio-economic status. To this end, this research examines the decline and its effects in the El Korba area of Heliopolis, Cairo, Egypt. El Korba was designed with a unique architectural character to stimulate social and economic life. However, the area has been on a path of physical deterioration that is corroding the social life on its streets. This research applies a diachronic analysis to Ibrahim El-Lakkani Boulevard in El Korba, based on a previously developed framework that connects buildings' architectural features to the degree of social interaction in the street, in order to document the changes that building deterioration may have caused. Architectural features of the street level in both the original state (1906) and the current state (2021) are broken down and categorized into the framework's six parameters to understand their decline or improvement over time. We find that the parameters that have decreased over the years and driven the deterioration are complexity and architectural character, permeability, territoriality and personalization, and physical comfort. Based on these findings, revival projects can focus on physical parameters that create synergistic benefits by preserving and renewing heritage locations and revitalizing their socio-economic potential.

Keywords: Architectural character, heritage building conservation, enclosure, ground-floor use, El Korba, visual and physical permeability, personalization, physical comfort, social life, territoriality.

PDF Downloads: 430
1097 64 bit Computer Architectures for Space Applications – A study

Authors: Niveditha Domse, Kris Kumar, K. N. Balasubramanya Murthy

Abstract:

Recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors that meet the MIL-STD-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. Looking ahead, 32-bit and 64-bit processors are needed for spacecraft computing, so a study and survey of 64-bit architectures for space applications is desirable. This will also drive significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada). There are several basic requirements for such a processor: radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc. This project comprises a brief study of 32-bit and 64-bit processors readily available in the market and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with the added functionality of an extended double-precision floating point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using open cores (the OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE for synthesis are also used where appropriate.

Keywords: RISC MicroProcessor, RPC – RISC Processor Core, PBX – Processor to Block Interface part of the Interconnection Network, BPX – Block to Processor Interface part of the Interconnection Network, FPU – Floating Point Unit, SPU – Signal Processing Unit, WB – Wishbone Interface, CTU – Clock and Test Unit

PDF Downloads: 2226
1096 Mammogram Image Size Reduction Using 16-8 bit Conversion Technique

Authors: Ayman A. AbuBaker, Rami S. Qahwaji, Musbah J. Aqel, Mohmmad H. Saleh

Abstract:

Two algorithms are proposed to reduce the storage requirements for mammogram images. The input image goes through a shrinking process that converts the 16-bit image to 8 bits using a pixel-depth conversion algorithm, followed by an enhancement process. The performance of the algorithms is evaluated objectively and subjectively. A 50% reduction in size is obtained with no loss of significant data in the breast region.
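
A minimal sketch of a pixel-depth conversion of the kind described, assuming percentile-based windowing with NumPy (the paper's exact mapping and enhancement steps are not reproduced here):

```python
import numpy as np

def convert_16_to_8_bit(img16, low_pct=1.0, high_pct=99.0):
    # Window the 16-bit intensity range between two percentiles, then
    # rescale linearly to the 8-bit range (generic pixel-depth conversion).
    img = img16.astype(np.float64)
    lo, hi = np.percentile(img, [low_pct, high_pct])
    windowed = np.clip(img, lo, hi)
    scaled = (windowed - lo) / max(hi - lo, 1e-12) * 255.0
    return scaled.astype(np.uint8)

# Example on a synthetic 16-bit image (a real mammogram would be loaded instead).
img16 = (np.random.rand(64, 64) * 65535).astype(np.uint16)
img8 = convert_16_to_8_bit(img16)
print(img8.dtype, int(img8.min()), int(img8.max()))
```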

Keywords: Breast cancer, Image processing, Image reduction, Mammograms, Image enhancement

PDF Downloads: 2014
1095 Contextual SenSe Model: Word Sense Disambiguation Using Sense and Sense Value of Context Surrounding the Target

Authors: Vishal Raj, Noorhan Abbas

Abstract:

Ambiguity in Natural Language Processing (NLP) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguity, such as lexical, syntactic, semantic, anaphoric, and referential. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on surface words for training and testing; here, lemma and Part of Speech (POS) tokens of words are used instead, since the lemma adds generality and the POS adds grammatical properties to each token. A method is designed to create an affinity matrix that captures the affinity between any pair of lemma_POS tokens (a token in which the lemma and POS of a word are joined by an underscore) in the training set. Additionally, an algorithm is devised to create sense clusters of tokens using the affinity matrix under a hierarchy of the POS of the lemma. Furthermore, three different mechanisms are devised to predict the sense of a target word using the affinity/similarity values. Each contextual token contributes some value to each candidate sense of the target word, and whichever sense receives the highest total value becomes the predicted sense. Because contextual tokens play the key role in creating sense clusters and predicting the sense of the target word, the model is named the Contextual SenSe Model (CSM). CSM is notably simple and easy to explain, in contrast to contemporary deep learning models, which are complex, time-consuming to train, and difficult to interpret. CSM is trained on SemCor training data and evaluated on the SemEval test dataset. The results indicate that, despite the simplicity of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) baseline.
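
A minimal sketch of the affinity-based sense prediction idea, with hypothetical lemma_POS tokens and made-up affinity values (not the paper's trained matrix or its exact scoring mechanisms):

```python
from collections import defaultdict

def predict_sense(context_tokens, target_senses, affinity):
    # Score each candidate sense of the target word by summing the affinity
    # values contributed by the surrounding lemma_POS context tokens.
    scores = defaultdict(float)
    for sense in target_senses:
        for token in context_tokens:
            scores[sense] += affinity.get((token, sense), 0.0)
    # The sense with the highest accumulated context value wins.
    return max(scores, key=scores.get) if scores else None

# Toy example: hypothetical affinity values between context tokens and senses.
affinity = {
    ("river_NOUN", "bank%geo"): 0.9,
    ("money_NOUN", "bank%fin"): 0.8,
    ("deposit_VERB", "bank%fin"): 0.6,
}
context = ["money_NOUN", "deposit_VERB"]
print(predict_sense(context, ["bank%geo", "bank%fin"], affinity))  # bank%fin
```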

Keywords: Word Sense Disambiguation, WSD, Contextual SenSe Model, Most Frequent Sense, part of speech, POS, Natural Language Processing, NLP, OOV, out of vocabulary, ELMo, Embeddings from Language Model, BERT, Bidirectional Encoder Representations from Transformers, Word2Vec, lemma_POS, Algorithm.

PDF Downloads: 275
1094 Dynamic Web-Based 2D Medical Image Visualization and Processing Software

Authors: Abdelhalim N. Mohammed, Mohammed Y. Esmail

Abstract:

For decades, medical imaging was dominated by costly film media for the review and archival of medical investigations; however, with developments in network technologies and the wide acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has emerged. Web technologies have been used successfully in telemedicine applications, and here they are combined with DICOM to design a web-based, open-source DICOM viewer. The web server allows query and retrieval of images, which are then viewed and manipulated inside a web browser without any preinstalled software. The dynamic page for medical image visualization and processing is built using JavaScript and HTML5. The XAMPP Apache server is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, platform-independent, displays and manipulates images efficiently, and is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is employed: a 2-D discrete wavelet transform decomposes the image, the coefficients are thresholded, and the remaining coefficients are entropy encoded for transmission, reducing transmission time and storage cost. Compression performance was estimated using image quality metrics such as mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.
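
A minimal sketch of the described compression pipeline using PyWavelets, with an illustrative threshold and decomposition level (the paper's settings and entropy coder are not reproduced; the fraction of zeroed coefficients stands in for CR):

```python
import numpy as np
import pywt

def compress_and_score(img, wavelet="coif3", level=3, thr=20.0):
    # Decompose with a 2-D DWT, hard-threshold the detail coefficients,
    # reconstruct, and report MSE, PSNR and the fraction of zeroed
    # coefficients (a simple stand-in for the compression ratio).
    img = img.astype(float)
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    kept = [coeffs[0]]
    for cH, cV, cD in coeffs[1:]:
        kept.append(tuple(pywt.threshold(b, thr, mode="hard") for b in (cH, cV, cD)))
    rec = pywt.waverec2(kept, wavelet)[: img.shape[0], : img.shape[1]]

    arrays = [kept[0]] + [b for band in kept[1:] for b in band]
    total = sum(a.size for a in arrays)
    zeros = sum(int(np.count_nonzero(a == 0)) for a in arrays)
    mse = float(np.mean((img - rec) ** 2))
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
    return mse, psnr, 100.0 * zeros / total

# Example on a synthetic 8-bit frame (a DICOM image would be loaded instead).
img = (np.random.rand(128, 128) * 255).astype(np.uint8)
print(compress_and_score(img))
```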

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN.

PDF Downloads: 767
1093 Slaughter and Carcass Characterization, and Sensory Qualities of Native, Pure, and Upgraded Breeds of Goat Raised in the Philippines

Authors: Jonathan N. Nayga, Emelita B. Valdez, Mila R. Andres, Beulah B. Estrada, Emelina A. Lopez, Rogelio B. Tamayo, Aubrey Joy M. Balbin

Abstract:

Goat production is one of the activities included in integrated farming in the Philippines. Goats are raised for their meat, and regardless of breed the animals are slaughtered for this purpose. In order to document the carcass yield of the different goats slaughtered, five (5) breeds of goat, namely purebred Boer, purebred Anglo-Nubian, crossbred Boer, crossbred Anglo-Nubian, and Philippine Native goat, were used in the study. Data on slaughter parameters, carcass characteristics, and sensory evaluation were gathered and analyzed using a Completely Randomized Design (CRD) at the 5% level of significance, and the results of carcass conformation were assessed descriptively. Results showed that slaughter data such as slaughter/live weight, hot and chilled carcass weights, dressing percentage, and percentage drip loss differed significantly (P<0.05) among breeds. On carcass and meat characteristics, purebred and upgraded Boer were found to be moderately muscular, while the Native goat was rated as thin muscular. The color of the carcass also revealed that purebred and crossbred Boer were described as dark red, while the Native goat was noted to be slightly pale. On sensory evaluation, the results indicated no significant difference (P>0.05) among the breeds evaluated. It is therefore concluded that purebred goats have heavier carcasses, while both purebred and upgraded Boer are rated moderately muscular. It is further confirmed that, regardless of breed, goat meat has the same sensory characteristics. Thus, it is recommended to slaughter heavier goats to obtain a higher carcass yield with better conformation and quality.

Keywords: Carcass quality, goat, sensory evaluation, slaughter.

PDF Downloads: 1996
1092 Analysis of Complex Quadrature Mirror Filter Banks

Authors: Chimin Tsai

Abstract:

This work consists of three parts. First, the alias-free condition for the conventional two-channel quadrature mirror filter bank is analyzed using complex arithmetic. Second, the approach developed in the first part is applied to the complex quadrature mirror filter bank. Accordingly, the structure is simplified and the theory is easier to follow. Finally, a new class of complex quadrature mirror filter banks is proposed. Interesting properties of this new structure are also discussed.
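
For reference, the textbook alias-cancellation condition for a real two-channel QMF bank, which the paper re-derives with complex arithmetic and then extends, can be stated as:

```latex
% Reconstructed output of a two-channel analysis/synthesis filter bank:
\hat{X}(z) = \tfrac{1}{2}\bigl[H_0(z)F_0(z)+H_1(z)F_1(z)\bigr]X(z)
           + \tfrac{1}{2}\bigl[H_0(-z)F_0(z)+H_1(-z)F_1(z)\bigr]X(-z)

% Aliasing (the X(-z) term) is cancelled when
H_0(-z)F_0(z) + H_1(-z)F_1(z) = 0,
% for example with the classical QMF choice
H_1(z) = H_0(-z), \qquad F_0(z) = H_0(z), \qquad F_1(z) = -H_0(-z).
```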

Keywords: Aliasing cancellation, complex signal processing, polyphase realization, quadrature mirror filter banks.

PDF Downloads: 2250
1091 Capacity Building for Hazmat Transport Emergency Preparedness: 'Hotspot Impact Zone' Mapping from Flammable and Toxic Releases

Authors: U K Chakrabarti, Jigisha Parikh

Abstract:

Hazardous material transportation by road carries an inherent risk of accidents causing loss of life, grievous injuries, property losses, and environmental damage. The most common type of hazmat road accident is the release of hazardous substances (78%), followed by fires (28%), explosions (14%), and vapour/gas clouds (6%). The paper initially discusses the probable 'impact zones' likely to be caused by one flammable (LPG) and one toxic (ethylene oxide) chemical being transported through a sizable segment of a State Highway connecting three notified industrial zones in Surat district, Western India, which house 26 MAH industrial units. Three 'hotspots' were identified along the highway segment depending on the traffic of each chemical and the population distribution within 500 meters on either side. The thermal radiation and explosion overpressure have been calculated for LPG and ethylene oxide BLEVE scenarios, along with a toxic release scenario for ethylene oxide. In addition, dispersion calculations for the ethylene oxide toxic release have been made for each hotspot location, and the impact zones have been mapped for the LOC concentrations. Subsequently, the maximum initial isolation and protective zones were calculated based on the ERPG-3 and ERPG-2 values of ethylene oxide, respectively, estimated for the worst-case scenario under the worst weather conditions. The data analysis will help the local administration in capacity building with respect to rescue/evacuation and medical preparedness, and provides quantitative inputs to augment the District Offsite Emergency Plan document.

Keywords: Hotspot, Ethylene Oxide, LPG, MAH (Major Accident Hazard).

PDF Downloads: 1786
1090 Revisiting Distributed Protocols for Mobility at the Application Layer

Authors: N. Nouali, H. Drias, A. Doucet

Abstract:

For more than a decade, many proposals and standards have been designed to deal with mobility issues; however, there are still serious limitations in basing solutions on them. In this paper we discuss the possibility of handling mobility at the application layer. We do this while revisiting the conventional implementation of the Two Phase Commit (2PC) protocol, a fundamental asset of transactional technology for ensuring the consistent commitment of distributed transactions. The solution is based on an execution framework providing an efficient extension that is aware of mobility and preserves the 2PC principle.
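
A minimal sketch of the baseline 2PC voting and decision phases that the proposed extension preserves (hypothetical class and method names; the mobility-aware framework itself is not shown):

```python
class Participant:
    # Minimal participant in a Two Phase Commit round (illustrative only).
    def __init__(self, name, will_commit=True):
        self.name = name
        self.will_commit = will_commit
        self.state = "INIT"

    def prepare(self):
        # Phase 1: vote YES only if the local work can be made durable.
        self.state = "PREPARED" if self.will_commit else "ABORTED"
        return self.will_commit

    def commit(self):
        self.state = "COMMITTED"

    def abort(self):
        self.state = "ABORTED"


def two_phase_commit(participants):
    # Classic 2PC: commit only if every participant votes YES in phase 1.
    votes = [p.prepare() for p in participants]   # Phase 1: collect votes
    if all(votes):
        for p in participants:                    # Phase 2: global commit
            p.commit()
        return "COMMITTED"
    for p in participants:                        # Phase 2: global abort
        p.abort()
    return "ABORTED"

nodes = [Participant("fixed-host"), Participant("mobile-host", will_commit=False)]
print(two_phase_commit(nodes))  # ABORTED, because one participant voted NO
```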

Keywords: Application layer, distributed mobile protocols, mobility management, mobile transaction processing.

PDF Downloads: 1591
1089 Algorithm of Measurement of Noise Signal Power in the Presence of Narrowband Interference

Authors: Alexey V. Klyuev, Valery P. Samarin, Viktor F. Klyuev

Abstract:

An algorithm for measuring the power of the components of an input mix of a noise signal and narrowband interference is considered, using functional transformations of the input mix in the post-detection processing channel. The efficiency of the algorithm has been analyzed for different interference-to-signal ratios, and its performance features have been explored through numerical experiments.

Keywords: Noise signal, continuous narrowband interference, signal power, spectrum width, detection.

PDF Downloads: 1376
1088 Study of Syntactic Errors for Deep Parsing at Machine Translation

Authors: Yukiko Sasaki Alam, Shahid Alam

Abstract:

Syntactic parsing is vital for semantic treatment in many applications related to natural language processing (NLP), because form and content coincide in many cases. However, parsing has not yet reached reliable levels of performance. By manually examining and analyzing individual machine translation output errors that involve syntax as well as semantics, this study attempts to discover what is required for improving syntactic and semantic parsing.

Keywords: Machine translation, error analysis, syntactic errors, knowledge required for parsing.

PDF Downloads: 1213
1087 Semantic Enhanced Social Media Sentiments for Stock Market Prediction

Authors: K. Nirmala Devi, V. Murali Bhaskaran

Abstract:

Traditional document representation for classification follows the Bag of Words (BoW) approach to represent term weights. The conventional method uses the Vector Space Model (VSM) to exploit the statistical information of terms in the documents, but it fails to capture the semantic information and the order of the terms. The phrase-based approach, although it preserves term order, still ignores the semantics behind the words. Therefore, a semantic-concept-based approach is used in this paper to enhance the semantics by incorporating ontology information. A novel method is proposed to forecast intraday stock market price directional movement based on sentiments from Twitter and Moneycontrol news articles. Stock market forecasting is a very difficult and highly complicated task because it is affected by many factors, such as economic conditions, political events, and investor sentiment. Stock market series are generally dynamic, nonparametric, noisy, and chaotic by nature. Sentiment analysis, along with the wisdom of crowds, can automatically compute the collective intelligence about future performance in many areas, such as stock markets, box office sales, and election outcomes. The proposed method utilizes collective sentiments for the stock market to predict stock price directional movements. The collective sentiments from these social media sources show strong predictive power for stock price directional movement (up/down), as assessed using the Granger causality test.
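
A minimal sketch of a Granger causality check between a hypothetical sentiment series and stock returns, using statsmodels (synthetic data; not the paper's dataset or feature pipeline):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical daily series: stock returns partly driven by lagged sentiment.
rng = np.random.default_rng(0)
sentiment = rng.normal(size=250)
returns = 0.4 * np.roll(sentiment, 1) + rng.normal(scale=0.5, size=250)

# grangercausalitytests expects a 2-column array and tests whether the
# SECOND column Granger-causes the FIRST column.
data = pd.DataFrame({"returns": returns, "sentiment": sentiment})
results = grangercausalitytests(data[["returns", "sentiment"]], maxlag=3)
for lag, res in results.items():
    print(lag, "F-test p-value:", round(res[0]["ssr_ftest"][1], 4))
```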

Keywords: Bag of Words, Collective Sentiments, Ontology, Semantic relations, Sentiments, Social media, Stock Prediction, Twitter, Vector Space Model and wisdom of crowds.

PDF Downloads: 2782
1086 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper

Abstract:

Additive Manufacturing processes are becoming increasingly established in the industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining in industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized. These operations are often not performed in a standardized manner, but depend on the experience of the machine operator, e.g., levelling of the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of the component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effect of the various metal powders. Faulty execution of the operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to and correct execution of safety-relevant operations. The corresponding lean method 5S will therefore be applied, in order to develop approaches in the form of recommended actions that standardize the work processes. These approaches will then be evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts as well as standardizing the workflow are likely to increase reproducibility. Organizing the operational steps and production environment decreases the hazards of material handling and consequently improves work safety.

Keywords: Additive manufacturing, lean production, reproducibility, work safety.

PDF Downloads: 798
1085 Dynamics In Production Processes

Authors: Marco Kennemann, Steffen C. Eickemeyer, Peter Nyhuis

Abstract:

An increasingly dynamic and complex environment poses huge challenges to production enterprises, especially with regard to logistics. The Logistic Operating Curve Theory, developed at the Institute of Production Systems and Logistics (IFA) of the Leibniz University of Hanover, is a recognized approach to describing logistic interactions; nevertheless, it reaches its limits when it comes to dynamic aspects. In order to facilitate a timely and optimal Logistic Positioning, a method is developed for quickly and reliably identifying dynamic processing states.

Keywords: Dynamics, Logistic Operating Curves, Production Logistics, Production Planning and Control

PDF Downloads: 1481
1084 Material Concepts and Processing Methods for Electrical Insulation

Authors: R. Sekula

Abstract:

Epoxy composites are broadly used as electrical insulation for high-voltage applications, since only such materials can fulfill the particular mechanical, thermal, and dielectric requirements. However, the properties of the final product depend strongly on a proper manufacturing process with minimized material failures, such as excessive shrinkage, voids, and cracks. Therefore, the application of proper materials (epoxy, hardener, and filler) and process parameters (mold temperature, filling time, filling velocity, initial temperature of internal parts, gelation time), as well as design and geometric parameters, is essential for the final quality of the produced components. In this paper, an approach for three-dimensional modeling of all molding stages, namely filling, curing, and post-curing, is presented. The reactive molding simulation tool is based on a commercial CFD package and includes dedicated models describing viscosity and reaction kinetics that have been successfully implemented to simulate the reactive nature of the system with its exothermic effect. A dedicated simulation procedure for stress and shrinkage calculations, together with simulation results, is also presented. The second part of the paper is dedicated to recent developments in formulations of functional composites for electrical insulation applications, focusing on thermally conductive materials. Concepts based on filler modifications for epoxy electrical composites are presented, including the properties obtained. Finally, with tough environmental regulations in mind, and in addition to the current process and design aspects, an approach for product re-design is presented, focusing on the replacement of the epoxy material with a thermoplastic one. Such a 'design-for-recycling' approach is one of the new directions in the development of material and processing concepts for electrical products and brings many additional research challenges. One successful product is presented to illustrate the methodology.

Keywords: Curing, epoxy insulation, numerical simulations, recycling.

PDF Downloads: 1609
1083 Tagged Grid Matching Based Object Detection in Wavelet Neural Network

Authors: R. Arulmurugan, P. Sengottuvelan

Abstract:

Object detection using a Wavelet Neural Network (WNN) makes a major contribution to image processing analysis. The existing cluster-based algorithm for co-saliency object detection operates on multiple images, but its co-saliency detection results are not suitable for handling multi-scale image objects in a WNN. The existing Super Resolution (SR) scheme for landmark images identifies corresponding regions in the images and reduces the mismatching rate, but its structure-aware matching criterion does not detect multiple regions in SR images and fails to improve the object detection rate. To detect objects in high-resolution remote sensing images, a Tagged Grid Matching (TGM) technique is proposed in this paper. The TGM technique consists of three main components: object determination, object searching, and object verification in the WNN. First, object determination specifies the position and size of objects in the current image; specifying position and size with a hierarchical grid makes it easy to determine multiple objects. The second component, object searching, is carried out using cross-point searching; the cross search points of the objects are selected to speed up the search process and reduce detection time. The final component performs object verification, detecting the dissimilarity of objects in the current frame. The verification process matches the resulting search grid points against the stored grid points to detect the objects using the Gabor wavelet transform. The implementation of the TGM technique offers significant improvements in multi-object detection rate, processing time, precision, and detection accuracy.
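
A minimal sketch of Gabor-filter-based grid-point verification, using OpenCV with illustrative kernel parameters (a stand-in for the paper's Gabor wavelet transform matching step):

```python
import cv2
import numpy as np

def gabor_signature(patch, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    # Build a simple signature from the mean and spread of Gabor responses
    # at several orientations; grid points are compared via these signatures.
    patch = patch.astype(np.float32)
    feats = []
    for theta in thetas:
        kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        response = cv2.filter2D(patch, cv2.CV_32F, kernel)
        feats.extend([response.mean(), response.std()])
    return np.array(feats)

def verify_match(patch_a, patch_b, tol=0.15):
    # Declare a grid-point match if the normalised signature distance is small.
    a, b = gabor_signature(patch_a), gabor_signature(patch_b)
    dist = np.linalg.norm(a - b) / (np.linalg.norm(a) + 1e-12)
    return dist < tol

img = (np.random.rand(64, 64) * 255).astype(np.uint8)
print(verify_match(img, img))  # True: identical patches match
```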

Keywords: Object Detection, Cross-point Searching, Wavelet Neural Network, Object Determination, Gabor Wavelet Transform, Tagged Grid Matching.

PDF Downloads: 1948
1082 Digital Content Strategy: Detailed Review of the Key Content Components

Authors: Oksana Razina, Shakeel Ahmad, Jessie Qun Ren, Olufemi Isiaq

Abstract:

The modern life of businesses is categorically reliant on their established position online, where digital (and particularly website) content plays a significant role as the first point of information. Digital content, therefore, becomes essential – from making the first impression through to the building and development of client relationships. Despite a number of valuable papers suggesting a strategic approach when dealing with digital data, other sources often do not view or accept the approach to digital content as a holistic or continuous process. Associations are frequently made with merely a one-off marketing campaign or similar. The challenge is in establishing an agreed definition for the notion of Digital Content Strategy (DCS), which currently does not exist, as it is viewed from an excessive number of angles. A strategic approach to content, nonetheless, is required, both practically and contextually. We therefore aimed to identify the key content components comprising a DCS, to ensure all the aspects are covered and strategically applied – from the company's understanding of the content value to the ability to display flexibility of content and advances in technology. This conceptual project evaluated existing literature on the topic of DCS and related aspects, using the PRISMA Systematic Review Method, Document Analysis, Inclusion and Exclusion Criteria, Scoping Review, the Snowballing Technique, and Thematic Analysis. The data were collected from academic and statistical sources, government and relevant trade publications. Based on the suggestions from academic and trade sources related to the issues discussed, we identified the key actions for content creation and attempted to define the notion of DCS. The major finding of the study is a set of Key Content Components of DCS, which can be considered for implementation in a business retail setting.

Keywords: Digital content strategy, digital marketing strategy, key content components, websites.

PDF Downloads: 163
1080 Application of Fuzzy Neural Network for Image Tumor Description

Authors: Nahla Ibraheem Jabbar, Monica Mehrotra

Abstract:

This paper uses a fuzzy Kohonen neural network for medical image segmentation. Image segmentation plays an important role in many medical imaging applications by automating or facilitating diagnosis. The paper analyses the tumor by extracting the features of area, entropy, mean, and standard deviation; these measurements give a description of the tumor.
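
A minimal sketch of extracting the four features named above from a segmented region, assuming an 8-bit image and a binary mask (hypothetical names; not the paper's fuzzy Kohonen pipeline):

```python
import numpy as np

def region_features(image, mask):
    # Area, Shannon entropy, mean and standard deviation of the pixel
    # intensities inside the segmented region.
    pixels = image[mask > 0].astype(float)
    hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
    p = hist[hist > 0] / pixels.size
    return {
        "area": int(np.count_nonzero(mask)),
        "entropy": float(-np.sum(p * np.log2(p))),
        "mean": float(pixels.mean()),
        "std": float(pixels.std()),
    }

# Example with a synthetic image and a hypothetical rectangular "tumor" mask.
image = (np.random.rand(64, 64) * 255).astype(np.uint8)
mask = np.zeros_like(image)
mask[20:40, 20:40] = 1
print(region_features(image, mask))
```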

Keywords: FCM, features extraction, medical image processing, neural network, segmentation.

PDF Downloads: 2091
1079 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability

Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli

Abstract:

The agri-food value chain involves various stakeholders with different roles, all of whom abide by national and international rules and leverage marketing strategies to advance their products. Food products and their related processing phases carry a large amount of data that is often not used to inform the final customer. Some of these data, if suitably identified and used, can benefit the individual company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, buying models have changed in recent years: customers pay attention to wellbeing and food quality. Food citizenship and food democracy were born, building on transparency, sustainability, and the need for food information. The Internet of Things (IoT) and analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the many actors involved, the different business models, the environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies involved in a traceability path, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived, starting from business model analysis and the related business processes. Studying each process task and leveraging modeling techniques makes it possible to identify the information held by the different actors along the agri-food supply chain. IoT technologies for data collection and analytics techniques for data processing supply information useful for increasing intra-company efficiency and market competitiveness. All of the recovered information can be exposed through IT solutions and mobile applications, made accessible to the company, the entire supply chain, and the consumer, with a view to guaranteeing transparency and quality.

Keywords: Agriculture 4.0, agri-food supply chain, Industry 4.0, voluntary traceability.

PDF Downloads: 2320
1078 On Musical Information Geometry with Applications to Sonified Image Analysis

Authors: Shannon Steinmetz, Ellen Gethner

Abstract:

In this paper a theoretical foundation is developed to segment, analyze and associate patterns within audio. We explore this on imagery via sonified audio applied to our segmentation framework. The approach involves a geodesic estimator within the statistical manifold, parameterized by musical centricity. We demonstrate viability by processing a database of random imagery to produce statistically significant clusters of similar imagery content.

Keywords: Sonification, musical information geometry, image content extraction, automated quantification, audio segmentation, pattern recognition.

PDF Downloads: 376
1077 Intelligent Audio Watermarking using Genetic Algorithm in DWT Domain

Authors: M. Ketcham, S. Vongpradhip

Abstract:

In this paper, an innovative watermarking scheme for audio signals based on genetic algorithms (GA) in the discrete wavelet transform domain is proposed. It is robust against the watermarking attacks commonly employed in the literature, and the quality of the watermarked signal is also considered. We employ a GA to find the optimal localization and intensity of the watermark. The watermark detection process can be performed without using the original audio signal. The experimental results demonstrate that the watermark is inaudible and robust to many digital signal processing operations, such as cropping, low-pass filtering, and additive noise.
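
A minimal sketch of embedding watermark bits into DWT detail coefficients, where the position and intensity stand in for values a GA would optimise (illustrative wavelet and parameters, not the paper's scheme):

```python
import numpy as np
import pywt

def embed_watermark(signal, bits, position, intensity, wavelet="db4"):
    # Embed watermark bits additively into selected detail coefficients of a
    # single-level 1-D DWT, then reconstruct the audio signal.
    cA, cD = pywt.dwt(signal, wavelet)
    cD = cD.copy()
    for i, bit in enumerate(bits):
        cD[position + i] += intensity if bit else -intensity
    return pywt.idwt(cA, cD, wavelet)

rng = np.random.default_rng(1)
audio = rng.normal(size=1024)
marked = embed_watermark(audio, bits=[1, 0, 1, 1], position=10, intensity=0.05)
print(np.max(np.abs(marked[:1024] - audio)))  # small, ideally inaudible change
```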

Keywords: Intelligent Audio Watermarking, Genetic Algorithm, DWT Domain.

PDF Downloads: 2034
1076 Dielectric Studies on Nano Zirconium Dioxide Synthesized through Co-Precipitation Process

Authors: K. Geethalakshmi, T. Prabhakaran, J. Hemalatha

Abstract:

Nano-sized zirconium dioxide in the monoclinic phase (m-ZrO2) has been synthesized in pure form through co-precipitation processing at different calcination temperatures and has been characterized by several techniques such as XRD, FT-IR, UV-Vis spectroscopy, and SEM. The dielectric constant and capacitance of the pelletized samples have been examined at room temperature as functions of frequency. The higher dielectric constant of the sample with the larger grain size demonstrates the strong influence of grain size on the dielectric constant.

Keywords: capacitance, dielectric constant, m-ZrO2, nano zirconia

PDF Downloads: 4008
1075 Algorithm for Bleeding Determination Based On Object Recognition and Local Color Features in Capsule Endoscopy

Authors: Yong-Gyu Lee, Jin Hee Park, Youngdae Seo, Gilwon Yoon

Abstract:

Automatic determination of blood in dim or noisy capsule endoscopic images is difficult due to the low S/N ratio, and analysis of these images may be inaccurate because of external disturbances. Therefore, we propose detection methods that do not depend only on color bands. In locating bleeding regions, the identification of object outlines in the frame and the features of their local colors were taken into consideration. The results showed that the capability of detecting bleeding was much improved.

Keywords: Endoscopy, object recognition, bleeding, image processing, RGB.

PDF Downloads: 1904
1074 Password Cracking on Graphics Processing Unit Based Systems

Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik

Abstract:

Password authentication is one of the most widely used methods of authenticating legitimate users of computers and defending against attackers. There are many different ways to authenticate users of a system, and many password cracking methods have also been developed. This paper discusses how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.

Keywords: GPGPU, password cracking, secret key, user authentication.

PDF Downloads: 2597
1073 Developing Manufacturing Process for the Graphene Sensors

Authors: Abdullah Faqihi, John Hedley

Abstract:

Biosensors play a significant role in the healthcare sector and in scientific and technological progress. Developing electrodes that are easy to manufacture and deliver better electrochemical performance is advantageous for diagnostics and biosensing. They can be implemented extensively in various analytical tasks such as drug discovery, food safety, medical diagnostics, process control, security and defence, and environmental monitoring. The development of biosensors aims to create high-performance electrochemical electrodes for diagnostics and biosensing. A biosensor is a device that inspects the biological and chemical reactions generated by a biological sample; it carries out biological detection via a linked transducer and converts the biological response into an electrical signal. Stability, selectivity, and sensitivity are the dynamic and static characteristics that affect and dictate the quality and performance of biosensors. In this research, an experimental study of the laser scribing technique for processing graphene oxide inside a vacuum chamber is presented. The processing of graphene oxide (GO) was achieved using the laser scribing technique, and the effect of laser scribing on the reduction of GO was investigated under two conditions: atmosphere and vacuum. A GO solution was coated onto a LightScribe DVD, and the laser scribing technique was applied to reduce the GO layers and generate rGO. The morphological micro-structures of rGO and GO were visualised and examined using scanning electron microscopy (SEM) and Raman spectroscopy. The first electrode was a traditional graphene-based electrode model, made under normal atmospheric conditions, whereas the second was a graphene electrode fabricated under vacuum using a vacuum chamber. The purpose was to control the vacuum conditions, such as the air pressure and the temperature, during the fabrication process. The parameters to be assessed include the layer thickness and the continuous environment. The results presented show high accuracy and repeatability, achieving low-cost production.

Keywords: Laser scribing, LightScribe DVD, graphene oxide, scanning electron microscopy.

PDF Downloads: 627
1072 Concurrent Access to Complex Entities

Authors: Cosmin Rablou

Abstract:

In this paper we present a way of controlling the concurrent access to data in a distributed application using the Pessimistic Offline Lock design pattern. In our case, the application processes a complex entity, which contains in a hierarchical structure different other entities (objects). It will be shown how the complex entity and the contained entities must be locked in order to control the concurrent access to data.
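
A minimal sketch of a Pessimistic Offline Lock manager in which locking the root entity covers its contained entities (hypothetical identifiers; the paper's hierarchical locking rules are not reproduced):

```python
import threading

class LockManager:
    # Very small pessimistic offline lock manager (illustrative sketch).
    # Locking the root of a complex entity implicitly covers the contained
    # entities, so concurrent sessions cannot edit parts of the same aggregate.
    def __init__(self):
        self._owners = {}
        self._guard = threading.Lock()

    def acquire(self, entity_id, session_id):
        with self._guard:
            owner = self._owners.get(entity_id)
            if owner is not None and owner != session_id:
                return False            # another session holds the lock
            self._owners[entity_id] = session_id
            return True

    def release(self, entity_id, session_id):
        with self._guard:
            if self._owners.get(entity_id) == session_id:
                del self._owners[entity_id]

locks = LockManager()
print(locks.acquire("order:42", "session-A"))  # True
print(locks.acquire("order:42", "session-B"))  # False: root entity is locked
locks.release("order:42", "session-A")
print(locks.acquire("order:42", "session-B"))  # True
```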

Keywords: Object-oriented programming, Pessimistic Lock, Design pattern, Concurrent access to data, Processing complex entities

PDF Downloads: 1287
1071 Structural Parsing of Natural Language Text in Tamil Using Phrase Structure Hybrid Language Model

Authors: Selvam M, Natarajan. A M, Thangarajan R

Abstract:

Parsing is important in linguistics and Natural Language Processing for understanding the syntax and semantics of a natural language grammar. Parsing natural language text is challenging because of problems such as ambiguity and inefficiency, and the interpretation of natural language text depends on context-based techniques. A probabilistic component is essential to resolve ambiguity in both syntax and semantics, thereby increasing the accuracy and efficiency of the parser. The Tamil language has some inherent features that make this even more challenging. To obtain solutions, a lexicalized and statistical approach is applied to parsing with the aid of a language model. Statistical models mainly focus on the semantics of the language and are suitable for large-vocabulary tasks, whereas structural methods focus on syntax and model small-vocabulary tasks. A trigram-based statistical language model for Tamil, with a medium vocabulary of 5000 words, has been built. Though statistical parsing gives better performance through trigram probabilities and a large vocabulary size, it has disadvantages such as its focus on semantics rather than syntax and its lack of support for free word order and long-term relationships. To overcome these disadvantages, a structural component is incorporated into the statistical language model, which leads to hybrid language models. This paper attempts to build a phrase-structured hybrid language model that resolves the above-mentioned disadvantages. In developing the hybrid language model, a new part-of-speech tag set for Tamil has been created with more than 500 tags, giving wider coverage. A phrase-structured treebank has been developed with 326 Tamil sentences covering more than 5000 words. The hybrid language model has been trained on the phrase-structured treebank using the immediate head parsing technique. A lexicalized and statistical parser employing this hybrid language model and the immediate head parsing technique gives better results than pure grammar- and trigram-based models.
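
A minimal sketch of the maximum-likelihood trigram component of such a model, over lemma_POS-style tokens (toy corpus and hypothetical tags; the phrase-structure layer is not shown):

```python
from collections import Counter

def train_trigram_lm(sentences):
    # Plain maximum-likelihood trigram model over tagged tokens:
    # P(w3 | w1, w2) = count(w1, w2, w3) / count(w1, w2).
    tri, bi = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>", "<s>"] + sent + ["</s>"]
        for i in range(len(tokens) - 2):
            bi[(tokens[i], tokens[i + 1])] += 1
            tri[(tokens[i], tokens[i + 1], tokens[i + 2])] += 1

    def prob(w1, w2, w3):
        return tri[(w1, w2, w3)] / bi[(w1, w2)] if bi[(w1, w2)] else 0.0

    return prob

# Toy corpus of lemma_POS-style tokens (hypothetical Tamil tags).
corpus = [["avan_PRP", "vantaan_VBF"], ["aval_PRP", "vantaal_VBF"]]
p = train_trigram_lm(corpus)
print(p("<s>", "avan_PRP", "vantaan_VBF"))  # 1.0 in this tiny corpus
```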

Keywords: Hybrid Language Model, Immediate Head Parsing, Lexicalized and Statistical Parsing, Natural Language Processing, Parts of Speech, Probabilistic Context Free Grammar, Tamil Language, Tree Bank.

PDF Downloads: 3620
1070 3D Objects Indexing Using Spherical Harmonic for Optimum Measurement Similarity

Authors: S. Hellam, Y. Oulahrir, F. El Mounchid, A. Sadiq, S. Mbarki

Abstract:

In this paper, we propose a method for three-dimensional (3-D) model indexing based on a new descriptor built from spherical harmonics. The purpose of the method is to minimize the processing time on the database of object models and the time taken to search for objects similar to a query object. We first define the new descriptor using a new division of the 3-D object within a sphere, and we then define a new distance that is used to search for similar objects in the database.

Keywords: 3D indexation, spherical harmonic, similarity of 3D objects.

PDF Downloads: 2207
1069 Self-Organization of Clusters having Locally Distributed Patterns for Synchronized Inputs

Authors: Toshio Akimitsu, Yoichi Okabe, Akira Hirose

Abstract:

Many experimental results suggest that precise spike timing is significant in neural information processing. We construct a self-organization model using spatiotemporal patterns, in which Spike-Timing Dependent Plasticity (STDP) tunes the conduction delays between neurons. We show that the fluctuation of conduction delays gives rise to globally continuous and locally distributed firing patterns through self-organization.

Keywords: Self-organization, synfire-chain, Spike-Timing Dependent Plasticity, distributed information representation

PDF Downloads: 1208