Search results for: Bi-directional associative memory (BAM) neural networks
844 Contextual SenSe Model: Word Sense Disambiguation Using Sense and Sense Value of Context Surrounding the Target
Authors: Vishal Raj, Noorhan Abbas
Abstract:
Ambiguity in Natural Language Processing (NLP) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This gives rise to various kinds of ambiguity, such as lexical, syntactic, semantic, anaphoric and referential. This study focuses mainly on lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we use lemma and Part of Speech (POS) tokens of words instead: the lemma adds generality and the POS adds the word's grammatical properties to the token. We design a method to create an affinity matrix that captures the affinity between any pair of lemma_POS tokens (a token where the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we devise an algorithm that creates sense clusters of tokens from the affinity matrix under the hierarchy of the lemma's POS. Furthermore, three different mechanisms are devised to predict the sense of a target word using the affinity/similarity values. Each contextual token contributes some value to each candidate sense of the target word, and the sense that accumulates the highest value is selected. Because contextual tokens play the key role in creating sense clusters and predicting the sense of the target word, the model is named the Contextual SenSe Model (CSM). CSM is notably simple and easy to explain, in contrast to contemporary deep learning models, which are intricate, time-intensive and hard to interpret. CSM is trained on the SemCor training data and evaluated on the SemEval test dataset. The results indicate that, despite the simplicity of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) model.
Keywords: Word Sense Disambiguation, WSD, Contextual SenSe Model, Most Frequent Sense, part of speech, POS, Natural Language Processing, NLP, OOV, out of vocabulary, ELMo, Embeddings from Language Model, BERT, Bidirectional Encoder Representations from Transformers, Word2Vec, lemma_POS, Algorithm.
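A minimal sketch of the prediction step this abstract describes: each contextual lemma_POS token votes for every candidate sense cluster of the target word with its affinity value, and the highest-scoring cluster wins. The affinity matrix, cluster layout and token names below are illustrative assumptions, not the paper's data or exact scoring mechanism.

```python
# Hypothetical CSM-style sense prediction: contextual tokens contribute their
# affinity toward each sense cluster; the sense with the highest total wins.
from collections import defaultdict

def predict_sense(target_clusters, context_tokens, affinity):
    """target_clusters: {sense_id: set of lemma_POS tokens in that cluster}
       context_tokens: lemma_POS tokens surrounding the target word
       affinity: dict keyed by (token_a, token_b) -> affinity value"""
    scores = defaultdict(float)
    for sense_id, cluster in target_clusters.items():
        for ctx in context_tokens:
            # each contextual token adds its affinity to the cluster's members
            scores[sense_id] += sum(affinity.get((ctx, member), 0.0) for member in cluster)
    return max(scores, key=scores.get) if scores else None

# toy example (all values invented for illustration)
clusters = {"bank%finance": {"money_NOUN", "loan_NOUN"}, "bank%river": {"river_NOUN", "water_NOUN"}}
context = ["deposit_NOUN", "money_NOUN"]
aff = {("deposit_NOUN", "money_NOUN"): 0.8, ("money_NOUN", "loan_NOUN"): 0.6,
       ("deposit_NOUN", "river_NOUN"): 0.1}
print(predict_sense(clusters, context, aff))  # -> "bank%finance"
```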
843 A Practical Distributed String Matching Algorithm Architecture and Implementation
Authors: Bi Kun, Gu Nai-jie, Tu Kun, Liu Xiao-hu, Liu Gang
Abstract:
Traditional parallel single-pattern string matching algorithms are usually based on the PRAM computation model and concentrate on cost-optimal design and theoretical speed. Building on the distributed string matching algorithm proposed by CHEN, this paper proposes a practical distributed string matching architecture, together with an improved single-pattern matching algorithm based on a variant of the Boyer-Moore algorithm. We implement our algorithm on this architecture, and the experiments show that it is practical and efficient on a distributed-memory machine. Its computational complexity is O(n/p + m), where n is the length of the text, m is the length of the pattern, and p is the number of processors.
Keywords: Boyer-Moore algorithm, distributed algorithm, parallel string matching, string matching.
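The following is an illustrative sketch (not the paper's implementation) of the partitioning idea behind the O(n/p + m) bound: the text is split into p chunks that overlap by m-1 characters so matches spanning a chunk boundary are not lost, and each chunk can then be searched independently. Python's built-in find stands in for the Boyer-Moore variant, and the loop is sequential rather than distributed.

```python
# Partition-with-overlap search: each of p "workers" scans one chunk plus m-1
# extra characters; duplicates from the overlap are removed via the set.
def partition_search(text, pattern, p):
    n, m = len(text), len(pattern)
    chunk = (n + p - 1) // p
    matches = set()
    for worker in range(p):
        start = worker * chunk
        end = min(n, start + chunk + m - 1)   # overlap so boundary matches survive
        pos = text.find(pattern, start, end)
        while pos != -1:
            matches.add(pos)
            pos = text.find(pattern, pos + 1, end)
    return sorted(matches)

print(partition_search("abracadabra" * 3, "abra", p=4))
```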
842 Momentum and Heat Transfer in the Flow of a Viscoelastic Fluid Past a Porous Flat Plate Subject to Suction or Blowing
Authors: Motahar Reza, Anadi Sankar Gupta
Abstract:
An analysis is made of the flow of an incompressible viscoelastic fluid (of small memory) over a porous plate subject to suction or blowing. It is found that the velocity at a point increases with increasing elasticity of the fluid. It is also shown that the wall shear stress depends only on suction and is independent of the fluid material. No steady solution for the velocity distribution exists when there is blowing at the plate. The temperature distribution in the boundary layer is determined, and it is found that the temperature at a point decreases with increasing elasticity of the fluid.
Keywords: Viscoelastic fluid, Flow past a porous plate, Heat transfer
841 Weka Based Desktop Data Mining as Web Service
Authors: Sujala.D.Shetty, S.Vadivel, Sakshi Vaghella
Abstract:
Data mining is the process of sifting through large volumes of data, analyzing it from different perspectives and summarizing it into useful information. One of the widely used desktop applications for data mining is the Weka tool, a collection of machine learning algorithms implemented in Java and open-sourced under the General Public License (GPL). A web service is a software system designed to support interoperable machine-to-machine interaction over a network using SOAP messages. Unlike a desktop application, a web service is easy to upgrade, deliver and access and does not occupy any memory on the system. Keeping in mind the advantages of a web service over a desktop application, this paper demonstrates how this Java-based desktop data mining application can be implemented as a web service to support data mining across the internet.
Keywords: desktop application, Weka mining, web service.
840 Content-Based Color Image Retrieval Based On 2-D Histogram and Statistical Moments
Authors: Khalid Elasnaoui, Brahim Aksasse, Mohammed Ouanan
Abstract:
In this paper, we are interested in the problem of finding similar images in a large database. For this purpose, we propose a new algorithm based on a combination of 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window rather than on single-pixel intensities. This approach overcomes the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to escape the effects of the discretisation of the color space that is intrinsic to the use of histograms. We compare the performance of our new algorithm to various state-of-the-art methods and show that it has several advantages: it is fast, consumes little memory and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases.
Keywords: 2-D histogram, Statistical moments, Indexing, Similarity distance, Histogram intersection.
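A minimal sketch, under assumed definitions, of the two ingredients combined here: histogram intersection as a similarity measure and low-order statistical moments (mean and standard deviation) per channel. The exact 3x3-window 2-D HSV histogram and the blending weight are not reproduced from the paper; they are illustrative assumptions.

```python
import numpy as np

def histogram_intersection(h1, h2):
    # both histograms assumed normalized to sum to 1; result lies in [0, 1]
    return float(np.minimum(h1, h2).sum())

def channel_moments(image):
    # image: H x W x C array; returns per-channel mean and std as the moment vector
    flat = image.reshape(-1, image.shape[-1]).astype(float)
    return np.concatenate([flat.mean(axis=0), flat.std(axis=0)])

def similarity(h1, m1, h2, m2, alpha=0.5):
    # assumed linear blend: histogram similarity minus a moment-distance penalty
    return alpha * histogram_intersection(h1, h2) - (1 - alpha) * np.linalg.norm(m1 - m2)
```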
839 Collaboration versus Cooperation: Grassroots Activism in Divided Cities and Communication Networks
Authors: R. Barbour
Abstract:
Peace-building organisations act as a network of information for communities. Fieldwork highlighted that grassroots organisations and activists may cooperate with each other in their peace-building actions; however, they would not collaborate. Within two divided societies, Nicosia in Cyprus and Jerusalem in Israel, organisations and activists draw a distinction between activities being more ‘co-operative’ than ‘collaborative’. This theme became apparent during informal conversations and semi-structured interviews with various members of the activist communities. The idea needs further exploration, as these distinctions could affect the efficiency of peace-building activities within divided societies. Civil societies within divided landscapes, both physical and social, play an important role in conflict resolution. How organisations and activists interact with each other can be very influential with regard to peace-building activities. Working together sets a positive example for divided communities. Cooperation may be considered a primary level of interaction between CSOs. Therefore, at the beginning of a working relationship, organisations cooperate over basic agendas, parallel power structures and a shared focus, which lead to the same objective. Over time, in some instances, due to varying factors such as funding and greater trust and understanding within the relationship, processes progressed towards more collaborative ways of working. It is evident that NGOs and activist groups are highly independent and focus on their own agendas before coming together over shared issues. At this time, there appears to be more collaboration among CSOs and activists in Nicosia than in Jerusalem. The aims and objectives of agendas also influence how organisations work together. In recent years, Nicosia, and Cyprus in general, have perhaps shifted their focus from peace-building initiatives to environmental issues, which have become new-age reconciliation topics. Civil society does not automatically indicate like-minded organisations; however, solidarity within social groups can create ties that bring people and resources together. In unequal societies, such as those in Nicosia and Jerusalem, it is these ties that cut across groups and are essential for social cohesion. Societies are a collection of social groups: individuals who have come together over common beliefs. These groups in turn shape identities and determine the values and structures within societies. At many different levels and stages, social groups work together through cooperation and collaboration. These structures in turn have the capability to open up networks to less powerful or excluded groups, with the aim of producing social cohesion, which may contribute to social stability and economic welfare over an extended period.
Keywords: Collaboration, cooperation, grassroots activism, networks of communication.
838 Test Data Compression Using a Hybrid of Bitmask Dictionary and 2n Pattern Runlength Coding Methods
Authors: C. Kalamani, K. Paramasivam
Abstract:
In VLSI, testing plays an important role. The major problems in testing are test data volume and test power. An important way to reduce test data volume and test time is test data compression. The proposed technique combines the bitmask-dictionary and 2n pattern run-length coding methods and provides a substantial improvement in compression efficiency without introducing any additional decompression penalty. The method has been implemented using MATLAB and an HDL language to reduce test data volume and memory requirements. It is applied to various benchmark test sets and the results are compared with other existing methods. The proposed technique can achieve a compression ratio of up to 86%.
Keywords: Bitmask dictionary, 2n pattern run-length code, system-on-chip, SOC, test data compression.
837 Performance Evaluation of Popular Hash Functions
Authors: Sheena Mathew, K. Poulose Jacob
Abstract:
This paper describes the results of an extensive study and comparison of the popular hash functions SHA-1, SHA-256, RIPEMD-160 and RIPEMD-320 with JERIM-320, a 320-bit hash function. The compression functions of hash functions like SHA-1 and SHA-256 are designed using serial successive iteration, whereas those of RIPEMD-160 and RIPEMD-320 use two parallel lines of message processing. JERIM-320 uses four parallel lines of message processing, resulting in a higher level of security than the other hash functions at comparable speed and memory requirements. The performance evaluation of these methods has been carried out both through practical implementation and through step-computation methods. JERIM-320 proves to be secure and ensures message integrity to a higher degree. The focus of this work is to establish JERIM-320 as an alternative to present-day hash functions for fast-growing internet applications.
Keywords: Cryptography, Hash function, JERIM-320, Message integrity.
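A small throughput check in the spirit of the "practical implementation" side of the comparison, assuming only the standard hashlib digests are available locally: JERIM-320 and RIPEMD-320 are not in hashlib, and RIPEMD-160 may or may not be, depending on the OpenSSL build, so only SHA-1 and SHA-256 are timed here.

```python
# Rough single-shot throughput measurement of two standard hash functions.
import hashlib
import os
import time

data = os.urandom(16 * 1024 * 1024)  # 16 MiB of random input
for name in ("sha1", "sha256"):
    start = time.perf_counter()
    hashlib.new(name, data).hexdigest()
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(data) / elapsed / 1e6:.1f} MB/s")
```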
836 Pathology of Explanted Transvaginal Meshes
Authors: Vladimir V. Iakovlev, Erin T. Carey, John Steege
Abstract:
The use of polypropylene mesh devices for Pelvic Organ Prolapse (POP) spread rapidly during the last decade, yet our knowledge of the mesh-tissue interaction is far from complete. We aimed to perform a thorough pathological examination of explanted POP meshes and describe findings that may explain mechanisms of complications resulting in product excision. We report a spectrum of important findings, including nerve ingrowth, mesh deformation, involvement of detrusor muscle with neural ganglia, and polypropylene degradation. Analysis of these findings may improve and guide future treatment strategies.
Keywords: Transvaginal, mesh, nerves, polypropylene degradation.
835 SMART: Solution Methods with Ants Running by Types
Authors: Nicolas Zufferey
Abstract:
Ant algorithms are well-known metaheuristics which have been widely used for two decades. In most of the literature, an ant is a constructive heuristic able to build a solution from scratch. However, other types of ant algorithms have recently emerged, so the discussion is not limited to the common framework of constructive ant algorithms. Generally, at each generation of an ant algorithm, each ant builds a solution step by step by adding an element to it. Each choice is based on the greedy force (also called the visibility, the short-term profit or the heuristic information) and the trail system (a central memory which collects historical information about the search process). Usually, all the ants of the population have the same characteristics and behaviors. In contrast, this paper proposes a new type of ant metaheuristic, namely SMART (for Solution Methods with Ants Running by Types). It relies on the use of different populations of ants, where each population has its own personality.
Keywords: Optimization, Metaheuristics, Ant Algorithms, Evolutionary Procedures, Population-Based Methods.
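A generic sketch of the construction step the abstract describes: each candidate element is weighted by its trail value (historical memory) and its greedy force (short-term profit), and one element is drawn at random. The alpha/beta exponents are the usual ant-colony convention, assumed here for illustration rather than taken from SMART itself.

```python
import random

def choose_element(candidates, trail, greedy, alpha=1.0, beta=2.0):
    # weight = trail^alpha * greedy_force^beta, then a roulette-wheel draw
    weights = [(trail[c] ** alpha) * (greedy[c] ** beta) for c in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

# toy example with invented values
candidates = ["a", "b", "c"]
trail = {"a": 0.5, "b": 1.5, "c": 1.0}
greedy = {"a": 2.0, "b": 0.5, "c": 1.0}
print(choose_element(candidates, trail, greedy))
```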
834 Grid-based Supervised Clustering - GBSC
Authors: Pornpimol Bungkomkhun, Surapong Auwatanamongkol
Abstract:
This paper presents a supervised clustering algorithm, namely Grid-Based Supervised Clustering (GBSC), which is able to identify clusters of any shape and size without presuming any canonical form for the data distribution. The GBSC needs no prespecified number of clusters, is insensitive to the order of the input data objects, and is capable of handling outliers. Built on a combination of grid-based clustering and density-based clustering, and assisted by the downward closure property of density used in bottom-up subspace clustering, the GBSC can notably reduce its search space and avoid running into memory limitations during its execution. On two-dimensional synthetic datasets, the GBSC identifies clusters with different shapes and sizes correctly. The GBSC also outperforms five other supervised clustering algorithms in experiments performed on some UCI datasets.
Keywords: supervised clustering, grid-based clustering, subspace clustering.
833 An Intelligent Transportation System for Safety and Integrated Management of Railway Crossings
Authors: M. Magrini, D. Moroni, G. Palazzese, G. Pieri, D. Azzarelli, A. Spada, L. Fanucci, O. Salvetti
Abstract:
Railway crossings are complex entities whose optimal management cannot be addressed without an intelligent transportation system integrating information on both train and vehicular flows. In this paper, we propose an integrated system named SIMPLE (Railway Safety and Infrastructure for Mobility applied at level crossings) that, while providing unparalleled safety at railway level crossings, collects data on rail and road traffic and provides value-added services to citizens and commuters. Such services include, for example, alerts to drivers via variable message signs and suggestions for alternative routes, towards a more sustainable, eco-friendly and efficient urban mobility. To achieve these goals, SIMPLE is organized as a System of Systems (SoS) with a modular architecture whose components range from specially designed radar sensors for obstacle detection to smart ETSI M2M-compliant camera networks for urban traffic monitoring. Computational units performing forecasts according to adaptive models of train and vehicular traffic are also included. The proposed system has been tested and validated during an extensive trial held in the mid-sized Italian town of Montecatini, a paradigmatic case where the rail network is inextricably linked with the fabric of the city. Results of the tests are reported and discussed.
Keywords: Intelligent Transportation Systems (ITS), railway, railroad crossing, smart camera networks, radar obstacle detection, real-time traffic optimization, IoT, ETSI M2M, transport safety.
832 Self-Organization of Clusters having Locally Distributed Patterns for Synchronized Inputs
Authors: Toshio Akimitsu, Yoichi Okabe, Akira Hirose
Abstract:
Many experimental results suggest that more precise spike timing is significant in neural information processing. We construct a self-organization model using spatiotemporal patterns, where Spike-Timing Dependent Plasticity (STDP) tunes the conduction delays between neurons. We show that the fluctuation of conduction delays causes globally continuous and locally distributed firing patterns through the self-organization.
Keywords: Self-organization, synfire-chain, Spike-Timing Dependent Plasticity, distributed information representation.
831 A Recommender System Fusing Collaborative Filtering and User’s Review Mining
Authors: Seulbi Choi, Hyunchul Ahn
Abstract:
The collaborative filtering (CF) algorithm has been widely used for recommender systems in both academic and practical applications. It basically generates recommendation results using users’ numeric ratings. However, using information other than user ratings may lead to better accuracy of CF. Considering that many people are likely to share honest opinions on items they purchased recently due to the advent of Web 2.0, user reviews can be regarded as a new informative source for identifying user preferences accurately. Against this background, this study presents a hybrid recommender system that fuses CF and user review mining. Our system adopts conventional memory-based CF, but it is designed to use both a user’s numeric ratings and his or her text reviews on the items when calculating similarities between users.
Keywords: Recommender system, collaborative filtering, text mining, review mining.
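A sketch of the fused similarity the abstract describes: a memory-based CF similarity from numeric ratings blended with a text similarity from the users' reviews. The cosine formulation, the TF-IDF review vectors and the blending weight w are assumptions for illustration, not the authors' exact formulas.

```python
import numpy as np

def cosine(u, v):
    u, v = np.asarray(u, float), np.asarray(v, float)
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v / denom) if denom else 0.0

def hybrid_similarity(ratings_u, ratings_v, review_vec_u, review_vec_v, w=0.7):
    # ratings_*: aligned rating vectors over co-rated items
    # review_vec_*: e.g. TF-IDF vectors built from each user's review texts
    return w * cosine(ratings_u, ratings_v) + (1 - w) * cosine(review_vec_u, review_vec_v)
```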
830 Remarks Regarding Queuing Model and Packet Loss Probability for the Traffic with Self-Similar Characteristics
Authors: Mihails Kulikovs, Ernests Petersons
Abstract:
Network management techniques have long been of interest to the networking research community. The queue size plays a critical role in network performance: an adequate queue size maintains Quality of Service (QoS) requirements within a limited network capacity for as many users as possible. Appropriate estimation of the queuing model parameters is crucial both for the initial size estimation and during the process of resource allocation. An accurate resource allocation model for the management system increases network utilization. The present paper demonstrates the results of empirical observation of memory allocation for packet-based services.
Keywords: Queuing System, Packet Loss Probability, Measurement-Based Admission Control (MBAC), Performance evaluation, Quality of Service (QoS).
829 Comanche – A Compiler-Driven I/O Management System
Authors: Wendy Zhang, Ernst L. Leiss, Huilin Ye
Abstract:
Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
Keywords: I/O management, out-of-core, compiler, tile mapping.
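An illustrative sketch of the tiling idea that Comanche automates: instead of mapping the whole out-of-core array and letting the VMM page it in and out, the data set is explicitly read and processed one block (tile) at a time. The file layout, tile size and the summing task are assumptions, not the system's actual transformation.

```python
import numpy as np

def tiled_sum(path, n_elements, tile=1_000_000, dtype=np.float64):
    """Sum a large on-disk array of n_elements values while holding only one tile in memory."""
    total = 0.0
    with open(path, "rb") as f:
        for start in range(0, n_elements, tile):
            count = min(tile, n_elements - start)
            block = np.fromfile(f, dtype=dtype, count=count)  # only one tile resident at a time
            total += block.sum()
    return total
```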
828 Performance Analysis of MC-SS for the Indoor BPLC Systems
Authors: Justinian Anatory
Abstract:
Power-line networks are a promising infrastructure for providing broadband services to end users. However, the network performance is affected by stochastic channel changes due to load impedances, the number of branches and the branched line lengths. It has been proposed that multi-carrier modulation techniques such as orthogonal frequency division multiplexing (OFDM), Multi-Carrier Spread Spectrum (MC-SS) and wavelet OFDM can be used in such an environment. This paper investigates the performance of different indoor topologies of power-line networks that use the MC-SS modulation scheme. It is observed that when a branch is added in the link between the sending and receiving ends of an indoor channel, an average power loss of 2.5 dB is found; when the branch is added at a node, an average power loss of 1 dB is found. Additionally, when the terminal impedances of the branch change from the line characteristic impedance to either higher or lower values, the channel performance improves considerably. For example, changing the terminal load from the characteristic impedance (85 Ω) to 5 Ω decreased the signal-to-noise ratio (SNR) required to attain the same performance from 37 dB to 24 dB. Similarly, changing the terminal load from the channel characteristic impedance (85 Ω) to a much higher impedance (1600 Ω) decreased the SNR required to maintain the same performance from 37 dB to 23 dB. The results show that MC-SS performs better than OFDM techniques in all aspects, especially when the channel is terminated in either higher or lower impedances.
Keywords: Communication channel model, broadband power-line communication, branched network, OFDM, delay spread, MC-SS, impulsive noise, load impedance.
827 Self-Organization of Clusters Having Locally Distributed Patterns for Highly Synchronized Inputs
Authors: Toshio Akimitsu, Yoichi Okabe, Akira Hirose
Abstract:
Many experimental results suggest that more precise spike timing is significant in neural information processing. We construct a self-organization model using spatiotemporal patterns, where Spike-Timing Dependent Plasticity (STDP) tunes the conduction delays between neurons. We show that, for highly synchronized inputs, the fluctuation of conduction delays causes globally continuous and locally distributed firing patterns through the self-organization.
Keywords: Self-organization, synfire-chain, Spike-Timing Dependent Plasticity, distributed information representation.
826 Self Organizing Analysis Platform for Wear Particle
Authors: Qurban A. Memon, Mohammad S. Laghari
Abstract:
The main focus of this paper is the integration of system process information obtained through an image processing system with an evolving knowledge database to improve the accuracy and predictability of wear particle analysis. The objective is to intelligently automate the wear particle analysis process using classification via self-organizing maps. This is achieved using relationship measurements among corresponding attributes of various measurements for wear particles. Finally, a visualization technique is proposed that helps the viewer understand and utilize these relationships, enabling accurate diagnostics.
Keywords: Neural Network, Relationship Measurement, Self-organizing Clusters, Wear Particle Analysis.
825 Learning Based On Computer Science Unplugged in Computer Science Education: Design, Development, and Assessment
Authors: Eiko Takaoka, Yoshiyuki Fukushima, Koichiro Hirose, Tadashi Hasegawa
Abstract:
Although all high school students in Japan are required to learn informatics, many of them do not learn this topic sufficiently. In response to this situation, we propose a support package for high school informatics classes. To examine what students learned and whether they sufficiently understood the context of the lessons, a questionnaire survey was distributed to 186 students. We analyzed the results of the questionnaire and determined the weakest units, which were “basic computer configuration” and “memory and secondary storage”. We then developed a package for teaching these units. We propose that our package be applied in high school classrooms.
Keywords: Computer Science Unplugged, computer science outreach, high school curriculum, experimental evaluation.
824 Harrison’s Stolen: Addressing Aboriginal and Indigenous Islanders Human Rights
Authors: M. Shukry
Abstract:
According to the United Nations Declaration of Human Rights of 1948, every human being is entitled to rights in life that should be respected by others and protected by the state and community. Such rights are inherent regardless of colour, ethnicity, gender, religion or otherwise, and it is expected that all humans alike have the right to live without discrimination of any sort. However, that has not been the case for Aborigines in Australia. Over a long period of time, the governments of the States and Territories and the Australian Commonwealth denied the Aboriginal and Indigenous inhabitants of the Torres Strait Islands such rights. Past Australian governments set policies and laws that enabled them to forcefully remove Indigenous children from their parents, which resulted in lost generations living with the trauma of the loss of cultural identity, alienation and even their own selfhood. Intending to reduce the native population and their Aboriginal culture while, on the other hand, assimilating them into mainstream society, those governments gave themselves the right to remove children from their families with no hope of return. That practice has led to tragic consequences due to the trauma that has affected those children, an experience depicted by Jane Harrison in her play Stolen. The drama is the outcome of a six-year project on lost children and was first performed in 1997 in Melbourne. Only five actors appear on the stage, playing all the different characters, whether the main protagonists or the remaining cast, present or non-present ones as voices. The play outlines the lives of five children who were taken from their parents at an early age, entailing a disastrous negative impact that differs from one to the other. Unknown to each other, what connects them is having been put in a children’s home. The purpose of this paper is to analyse the play’s text in light of the 1948 Declaration of Human Rights, using it as a lens that reflects the atrocities practised against the Aborigines. It highlights how such practices formed an outrageous violation of those natives’ rights as human beings. Harrison’s dramatic technique for conveying the children’s experiences is a non-linear structure, fluctuating between past and present, linked together within each of the five characters, reflecting their suffering and pain to create an emotional link between them and the audience. Her dramatic handling of the issue by fusing tragedy with humour as well as symbolism is a successful technique for revealing the traumatic memory of those children and their present life. The play has made a difference in commencing to address the problem of the right of all children to be with their families, which renders the real meaning of having a home and an identity as people.
Keywords: Aboriginal, audience, Australia, children, culture, drama, home, human rights, identity, indigenous, Jane Harrison, memory, scenic effects, setting, stage, stage directions, Stolen, trauma.
823 How Virtualization, Decentralization and Network Building Change the Manufacturing Landscape: An Industry 4.0 Perspective
Authors: Malte Brettel, Niklas Friederichsen, Michael Keller, Marius Rosenberg
Abstract:
The German manufacturing industry has to withstand increasing global competition on product quality and production costs. As labor costs are high, several industries have suffered severely under the relocation of production facilities towards aspiring countries, which have managed to close the productivity and quality gap substantially. Established manufacturing companies have recognized that customers are not willing to pay large price premiums for incremental quality improvements. As a consequence, many companies from the German manufacturing industry adjust their production, focusing on customized products and fast time to market. Leveraging the advantages of novel production strategies such as Agile Manufacturing and Mass Customization, manufacturing companies transform into integrated networks in which companies unite their core competencies. Hereby, virtualization of the process and supply chain ensures smooth inter-company operations, providing real-time access to relevant product and production information for all participating entities. Company boundaries deteriorate, as autonomous systems exchange data gained by embedded systems throughout the entire value chain. By including Cyber-Physical Systems, advanced communication between machines becomes tantamount to their dialogue with humans. The increasing utilization of information and communication technology allows digital engineering of products and production processes alike. Modular simulation and modeling techniques allow decentralized units to flexibly alter products and thereby enable rapid product innovation. The present article describes the developments of Industry 4.0 within the literature and reviews the associated research streams. Hereby, we analyze eight scientific journals with regard to the following research fields: individualized production, end-to-end engineering in a virtual process chain, and production networks. We employ cluster analysis to assign sub-topics to the respective research fields. To assess the practical implications, we conducted face-to-face interviews with managers from industry as well as from the consulting business, using a structured interview guideline. The results reveal reasons for the adoption and refusal of Industry 4.0 practices from a managerial point of view. Our findings contribute to the upcoming research stream of Industry 4.0 and support decision-makers in assessing their need for transformation towards Industry 4.0 practices.
Keywords: Industry 4.0, Mass Customization, Production networks, Virtual Process-Chain.
822 Real-Time Image Analysis of Capsule Endoscopy for Bleeding Discrimination in Embedded System Platform
Authors: Yong-Gyu Lee, Gilwon Yoon
Abstract:
Image processing for capsule endoscopy requires large memory, and diagnosis takes hours since the recording time is normally more than 8 hours. A real-time analysis algorithm for capsule images can therefore be clinically very useful: it can differentiate abnormal tissue from healthy structures and provide correlation information among the images. Bleeding is our interest in this regard, and we propose a method of detecting frames with potential bleeding in real time. Our detection algorithm is based on statistical analysis and the shapes of bleeding spots. We tested our algorithm on 30 cases of capsule endoscopy of the digestive tract. Results were excellent: a sensitivity of 99% and a specificity of 97% were achieved in detecting the image frames with bleeding spots.
Keywords: bleeding, capsule endoscopy, image processing, real-time analysis.
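A hedged sketch of a frame-level bleeding screen in the spirit of this abstract: flag a frame when enough pixels are strongly red-dominant. The red-dominance criterion and both thresholds are illustrative assumptions; they stand in for, rather than reproduce, the authors' statistical model and shape analysis.

```python
import numpy as np

def looks_like_bleeding(frame_rgb, ratio_thr=1.8, area_thr=0.002):
    """frame_rgb: H x W x 3 uint8 image; returns True if the frame is a bleeding candidate."""
    r = frame_rgb[..., 0].astype(float)
    g = frame_rgb[..., 1].astype(float) + 1e-6
    red_dominant = (r / g) > ratio_thr          # crude per-pixel bleeding candidate
    return bool(red_dominant.mean() > area_thr) # enough candidate area in the frame
```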
821 Neuro-Fuzzy System for Equalization Channel Distortion
Authors: Rahib H. Abiyev
Abstract:
In this paper, the application of a neuro-fuzzy system to the equalization of channel distortion is considered. The structure and operation algorithm of the neuro-fuzzy equalizer are described. The use of a neuro-fuzzy equalizer in digital signal transmission decreases the parameter training time and the complexity of the network. A simulation of the neuro-fuzzy equalizer is performed. The obtained results confirm the effectiveness of applying neuro-fuzzy technology to channel equalization.
Keywords: Neuro-fuzzy system, noise equalization, neuro-fuzzy equalizer, neural system.
820 Grid-HPA: Predicting Resource Requirements of a Job in the Grid Computing Environment
Authors: M. Bohlouli, M. Analoui
Abstract:
For complete support of Quality of Service, it is better that the environment itself predicts the resource requirements of a job by using special methods in Grid computing. Exact and correct prediction enables exact matching of required resources with available resources. After the execution of each job, the used resources are saved in an active database named "History". First, some attributes are extracted from the new job, and according to a defined similarity algorithm the most similar executed jobs are retrieved from "History"; resource requirements are then predicted using statistical terms such as linear regression or the average. The new idea in this research is based on an active database and centralized history maintenance. Implementation and testing of the proposed architecture result in an accuracy of 96.68% in predicting the CPU usage of jobs, 91.29% for memory usage and 89.80% for bandwidth usage.
Keywords: Active Database, Grid Computing, Resource Requirement Prediction, Scheduling.
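A sketch of the prediction step described above, under assumed job attributes: find the most similar jobs in the "History" table and predict the new job's resource usage as the average over its nearest neighbours (the simpler of the two statistical terms the abstract mentions). The Euclidean similarity, attribute vectors and usage fields are illustrative assumptions.

```python
import numpy as np

def predict_usage(new_job, history, k=5):
    """new_job: attribute vector; history: list of (attribute_vector, usage_dict) pairs."""
    dists = [np.linalg.norm(np.asarray(new_job, float) - np.asarray(attrs, float))
             for attrs, _ in history]
    nearest = np.argsort(dists)[:k]                  # indices of the k most similar jobs
    keys = history[0][1].keys()
    return {key: float(np.mean([history[i][1][key] for i in nearest])) for key in keys}

# toy history: (attributes, observed resource usage)
hist = [([2, 4], {"cpu": 0.60, "mem": 512}), ([8, 1], {"cpu": 0.90, "mem": 2048}),
        ([3, 4], {"cpu": 0.65, "mem": 600})]
print(predict_usage([2.5, 4], hist, k=2))
```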
819 A Pipelined FSBM Hardware Architecture for HTDV-H.26x
Authors: H. Loukil, A. Ben Atitallah, F. Ghozzi, M. A. Ben Ayed, N. Masmoudi
Abstract:
In the MPEG and H.26x standards, motion estimation is used to eliminate temporal redundancy. Given that the motion estimation stage is very complex in terms of computational effort, a hardware implementation on a reconfigurable circuit is crucial for the requirements of different real-time multimedia applications. In this paper, we present a hardware architecture for motion estimation based on the Full Search Block Matching (FSBM) algorithm. This architecture presents minimum latency, maximum throughput, full utilization of hardware resources such as embedded memory blocks, and combines both pipelining and parallel processing techniques. Our design is described in the VHDL language, verified by simulation and implemented in a Stratix II EP2S130F1020C4 FPGA circuit. The experimental results show that the optimum operating clock frequency of the proposed design is 89 MHz, which achieves 160M pixels/sec.
Keywords: SAD, FSBM, hardware implementation, FPGA.
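A software reference sketch of the FSBM kernel that such hardware implements: for one block of the current frame, exhaustively test every displacement in a +/- search range and keep the one with the minimum Sum of Absolute Differences (SAD). The block size, search range and numpy-array frame representation are illustrative assumptions, not the paper's architecture parameters.

```python
import numpy as np

def full_search(cur, ref, bx, by, block=16, rng=8):
    """Return the (dy, dx) motion vector and SAD for the block at (bx, by) in `cur`."""
    best = (None, np.inf)
    cur_blk = cur[by:by + block, bx:bx + block].astype(int)
    for dy in range(-rng, rng + 1):
        for dx in range(-rng, rng + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue                                  # candidate falls outside the frame
            sad = np.abs(cur_blk - ref[y:y + block, x:x + block].astype(int)).sum()
            if sad < best[1]:
                best = ((dy, dx), sad)
    return best
```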
818 Cooperative Cross Layer Topology for Concurrent Transmission Scheduling Scheme in Broadband Wireless Networks
Authors: Gunasekaran Raja, Ramkumar Jayaraman
Abstract:
In this paper, we consider a CCL-N (Cooperative Cross Layer Network) topology based on a cross-layer (both centralized and distributed) environment to form network communities. Various performance metrics related to IEEE 802.16 networks are discussed in order to design the CCL-N topology. In the CCL-N topology, nodes are classified as master nodes (Master Base Station [MBS]) and serving nodes (Relay Station [RS]). Node communities are organized based on networking terminologies. Based on the CCL-N topology, various simulation analyses for both transparent and non-transparent relays are tabulated and the throughput efficiency is calculated. The weighted load balancing problem plays a challenging role in IEEE 802.16 networks. The CoTS (Concurrent Transmission Scheduling) scheme is formulated in terms of three aspects: the transmission mechanism based on identical communities, on different communities, and on identical node communities. The CoTS scheme helps in identifying the weighted load balancing problem. Based on the analytical results, the modularity value is inversely proportional to the error value. The modularity value plays a key role in solving the CoTS problem based on hop count. The transmission mechanism for an identical node community has no impact, since the modularity value is the same for all network groups. In this paper, these three aspects of communities based on the modularity value, which help in solving the weighted load balancing and CoTS problems, are discussed.
Keywords: Cross layer network topology, concurrent scheduling, modularity value, network communities and weighted load balancing.
817 A Refined Application of QFD in SCM, A New Approach
Authors: Nooshin La'l Mohamadi
Abstract:
Due to the fact that in the new century customers tend to express globally increasing demands, networks of interconnected businesses have been established in societies, and the management of such networks seems to be a major key to gaining competitive advantages. Supply chain management encompasses such managerial activities. Within a supply chain, a critical role is played by quality. QFD is a widely utilized tool which serves the purpose not only of bringing quality to the ultimate provision of products or service packages required by the end customer or the retailer, but also of initiating a satisfactory relationship with our initial customer, that is, the wholesaler. However, the wholesalers' cooperation is considerably based on capabilities that are heavily dependent on their locations and existing circumstances. Therefore, it is undeniable that for all companies each wholesaler possesses a specific importance ratio which can heavily influence the figures calculated in the House of Quality in QFD. Moreover, due to the competitiveness of the marketplace today, it is widely recognized that consumers' expression of demands has been highly volatile over periods of production. Such instability and proneness to change has been very tangibly noticed, and taking it into account during the analysis of the HOQ is widely influential and doubtlessly required. For a more reliable outcome in such matters, this article demonstrates the viability of applying the Analytic Network Process for considering the wholesalers' reputation, and simultaneously introduces a mortality coefficient for the reliability and stability of the consumers' expressed demands over time. Following this, the paper provides further elaboration on the relevant contributory factors and approaches through the calculation of such coefficients. In the end, the article concludes that an empirical application is needed to achieve broader validity.
Keywords: Analytic Network Process, Quality Function Deployment, QFD flaws, Supply Chain Management.
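A minimal numerical sketch of how a "mortality coefficient" could damp the weight of customer demands that were expressed longer ago, before they enter the House of Quality. The exponential decay form, the decay rate and the demand names are assumptions for illustration, not the paper's formulation of the coefficient.

```python
import math

def decayed_weight(base_weight, age_in_periods, mortality=0.2):
    # older demands lose influence exponentially with their age
    return base_weight * math.exp(-mortality * age_in_periods)

# invented demands: (name, importance weight, periods since it was expressed)
demands = [("fast delivery", 9, 0), ("eco packaging", 7, 3), ("24h support", 8, 6)]
for name, weight, age in demands:
    print(f"{name}: {decayed_weight(weight, age):.2f}")
```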
816 Low Power CNFET SRAM Design
Authors: Pejman Hosseiniun, Rose Shayeghi, Iman Rahbari, Mohamad Reza Kalhor
Abstract:
The CNFET has emerged in recent years as an alternative to silicon for high-performance, high-stability and low-power SRAM design. SRAM functions as cache memory in computers and many portable devices. In this paper, a new SRAM cell design based on CNFET technology is proposed. The proposed CNFET SRAM cell design is compared with SRAM cell designs implemented with conventional CMOS and FinFET in terms of speed, power consumption, stability, and leakage current. The HSPICE simulation and analysis show that the dynamic power consumption of the proposed 8T CNFET SRAM cell is reduced by about 48% and the SNM is widened by up to 56% compared to the conventional CMOS SRAM structure, at the expense of a 2% increase in leakage power and a 3% increase in write delay.
Keywords: SRAM cell, CNFET, low power, HSPICE.
815 Consistent Modeling of Functional Dependencies along with World Knowledge
Authors: Sven Rebhan, Nils Einecke, Julian Eggert
Abstract:
In this paper we propose a method for vision systems to consistently represent functional dependencies between different visual routines along with relational short- and long-term knowledge about the world. Here the visual routines are bound to visual properties of objects stored in the memory of the system. Furthermore, the functional dependencies between the visual routines are seen as a graph also belonging to the object's structure. This graph is parsed in the course of acquiring a visual property of an object to automatically resolve the dependencies of the bound visual routines. Using this representation, the system is able to dynamically rearrange the processing order while keeping its functionality. Additionally, the system is able to estimate the overall computational costs of a certain action. We also show that the system can efficiently use this structure to incorporate already acquired knowledge and thus reduce the computational demand.
Keywords: Adaptive systems, Knowledge representation, Machine vision, Systems engineering.
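A sketch of the dependency resolution the abstract describes: the visual routines needed for a property form a graph, and acquiring the property means executing its prerequisites first. The simple depth-first resolution, the property names and the absence of cycle handling are illustrative assumptions, not the authors' parsing scheme.

```python
def resolve(property_name, depends_on, executed=None):
    """depends_on: dict mapping a routine/property to the routines it needs first.
    Returns the execution order that satisfies all dependencies of property_name."""
    executed = [] if executed is None else executed
    for dep in depends_on.get(property_name, []):
        resolve(dep, depends_on, executed)       # make sure prerequisites run first
    if property_name not in executed:
        executed.append(property_name)
    return executed

# toy dependency graph between hypothetical visual routines
graph = {"object_color": ["segmentation"], "segmentation": ["saliency"], "saliency": []}
print(resolve("object_color", graph))  # -> ['saliency', 'segmentation', 'object_color']
```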