Search results for: Memory Forensics.
192 MMU Simulation in Hardware Simulator Based on State Transition Models
Authors: Zhang Xiuping, Yang Guowu, Zheng Desheng
Abstract:
An embedded hardware simulator is a valuable computer-aided tool for embedded application development. This paper focuses on the ARM926EJ-S MMU, builds state transition models, and formally verifies critical properties of those models. The state transition models include an instruction-loading model, a data-reading model, and a data-writing model. The properties of the models are described in the CTL specification language and verified in VIS. The results obtained in VIS demonstrate that the critical properties of the MMU are satisfied in the state transition models. The verified models can then be used to implement the MMU component in our simulator. Finally, experimental results show that the MMU successfully handles memory access requests from the CPU.
Keywords: MMU, State transition, Model, Simulation.
191 Supercompression for Full-HD and 4k-3D (8k) Digital TV Systems
Authors: Mario Mastriani
Abstract:
In this work, we develop the concept of supercompression, i.e., compression beyond the compression standard in use. In this context, the two compression rates are multiplied. Supercompression is based on super-resolution; that is, it is a data compression technique that superposes spatial image compression on top of bit-per-pixel compression to achieve very high compression ratios. If the compression ratio is very high, we apply a convolutive mask inside the decoder that restores the edges, eliminating the blur. Finally, both the encoder and the complete decoder are implemented on General-Purpose computation on Graphics Processing Units (GPGPU) cards. Specifically, the mentioned mask is coded inside the texture memory of a GPGPU.
Keywords: General-Purpose computation on Graphics Processing Units, Image Compression, Interpolation, Super-resolution.
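To illustrate the edge-restoring convolutive mask mentioned above, a small sketch that applies a 3x3 sharpening kernel to a blurred image on the CPU (the kernel values and the use of scipy in place of GPGPU texture memory are illustrative assumptions, not the paper's implementation):
```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

# A 3x3 sharpening (unsharp-style) mask: boosts the centre, subtracts the neighbours.
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def restore_edges(blurred):
    """Apply the convolutive mask on the decoder side to reduce blur."""
    return np.clip(convolve(blurred, SHARPEN, mode="nearest"), 0, 255)

rng = np.random.default_rng(0)
original = rng.integers(0, 256, (64, 64)).astype(float)
blurred = gaussian_filter(original, sigma=1.0)   # stand-in for decoder-side blur
restored = restore_edges(blurred)
print(np.abs(original - blurred).mean(), np.abs(original - restored).mean())
```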
190 Applying a Noise Reduction Method to Reveal Chaos in the River Flow Time Series
Authors: Mohammad H. Fattahi
Abstract:
Chaotic analysis was performed on river flow time series before and after applying wavelet-based de-noising techniques in order to investigate the effect of noise content on the chaotic nature of the flow series. In this study, 38 years of monthly runoff data from three gauging stations were used. The gauging stations are located in the Ghar-e-Aghaj river basin, Fars province, Iran. The noise level of the time series was estimated with the aid of a Gaussian kernel algorithm. This step was found to be crucial in preventing the removal of vital information such as memory, correlation and trend from the time series, along with the noise, during the de-noising process.
Keywords: Chaotic behavior, wavelet, noise reduction, river flow.
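For orientation, a minimal PyWavelets sketch of wavelet de-noising of a runoff-like series: decompose, soft-threshold the detail coefficients, reconstruct (the wavelet family, decomposition level and universal threshold are illustrative choices, not those used in the study):
```python
import numpy as np
import pywt

def wavelet_denoise(series, wavelet="db4", level=3):
    """Soft-threshold detail coefficients with the universal threshold."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise scale from finest details
    thresh = sigma * np.sqrt(2 * np.log(len(series)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(series)]

t = np.arange(456)                                            # ~38 years of monthly values
clean = 50 + 20 * np.sin(2 * np.pi * t / 12)                  # seasonal runoff signal
noisy = clean + 5 * np.random.default_rng(0).standard_normal(t.size)
print(np.abs(wavelet_denoise(noisy) - clean).mean())          # residual error after de-noising
```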
189 Challenges for Security in Wireless Sensor Networks (WSNs)
Authors: Muazzam A. Khan, Ghalib A. Shah, Muhammad Sher
Abstract:
A wireless sensor network is formed from a combination of sensor nodes and sink nodes. Recently, wireless sensor networks have attracted the attention of the research community. A main application of wireless sensor networks is security, against different attacks, for both the general public and the military. However, securing these networks is itself a critical issue due to many constraints such as limited energy, computational power and memory. Researchers working in this area have proposed a number of security techniques for this purpose; still, more work needs to be done. In this paper we provide a detailed discussion of security in wireless sensor networks. This paper will help to identify the different obstacles and requirements for securing wireless sensor networks, as well as highlight weaknesses of existing techniques.
Keywords: Wireless sensor networks (WSNs), security, denial of service, black hole, cryptography, steganography.
188 The Hardware Implementation of a Novel Genetic Algorithm
Authors: Zhenhuan Zhu, David Mulvaney, Vassilios Chouliaras
Abstract:
This paper presents a novel genetic algorithm, termed the Optimum Individual Monogenetic Algorithm (OIMGA), and describes its hardware implementation. As the monogenetic strategy retains only the optimum individual, the memory requirement is dramatically reduced and no crossover circuitry is needed, thereby ensuring that the requisite silicon area is kept to a minimum. Consequently, depending on application requirements, OIMGA allows the investigation of solutions that warrant either larger GA populations or individuals of greater length. The results given in this paper demonstrate that both the performance of OIMGA and its convergence time are superior to those of existing hardware GA implementations. Local convergence is achieved in OIMGA by retaining elite individuals, while population diversity is ensured by continually searching for the best individuals in fresh regions of the search space.
Keywords: Genetic algorithms, hardware-based machine learning.
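A minimal software sketch of the monogenetic idea described above, keeping only the current optimum individual and exploring around it by mutation, assuming a generic bit-string fitness function (the parameters and mutation scheme here are illustrative, not OIMGA's hardware design):
```python
import random

def monogenetic_search(fitness, length=32, generations=200, trials_per_gen=16, mut_rate=0.05):
    """Keep only the single best individual; no crossover, so memory stays minimal."""
    best = [random.randint(0, 1) for _ in range(length)]
    best_fit = fitness(best)
    for _ in range(generations):
        for _ in range(trials_per_gen):
            # Mutate a copy of the current optimum individual.
            child = [1 - b if random.random() < mut_rate else b for b in best]
            f = fitness(child)
            if f > best_fit:              # retain the elite individual only
                best, best_fit = child, f
    return best, best_fit

# Example: maximise the number of ones in the bit string.
if __name__ == "__main__":
    sol, fit = monogenetic_search(sum)
    print(fit, sol)
```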
187 Evolutionary Feature Selection for Text Documents using the SVM
Authors: Daniel I. Morariu, Lucian N. Vintan, Volker Tresp
Abstract:
Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. In this paper, we present three feature selection methods: Information Gain, Support Vector Machine feature selection (called SVM_FS) and a Genetic Algorithm with SVM (called GA_SVM). We show that the best results were obtained with the GA_SVM method for a relatively small dimension of the feature vector.
Keywords: Feature Selection, Learning with Kernels, Support Vector Machine, Genetic Algorithm, Classification.
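As an illustration of the first of the three methods, a small sketch of information gain for a binary (term present/absent) feature over labelled documents; the scoring here is the generic textbook formulation, not necessarily the exact variant used in the paper:
```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(term, docs, labels):
    """docs: list of token sets; labels: class label per document."""
    with_term = [y for d, y in zip(docs, labels) if term in d]
    without = [y for d, y in zip(docs, labels) if term not in d]
    n = len(labels)
    remainder = sum(len(part) / n * entropy(part) for part in (with_term, without) if part)
    return entropy(labels) - remainder

docs = [{"ball", "goal"}, {"goal", "score"}, {"vote", "law"}, {"law", "court"}]
labels = ["sport", "sport", "politics", "politics"]
print(information_gain("goal", docs, labels))   # high gain: the term separates the classes
```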
186 Study on the Influence of Physical Effort on the Mental Processes of Preteen Students
Authors: Constantin Pehoiu, Cristian Savu, Silviu Badea, Cristian Borida
Abstract:
The physiological effects of physical exercise on the human body are relatively well known in the literature, which describes in detail the changes that occur in the cardiovascular system, the respiratory system, the bones and other systems, both during exercise and after its completion. However, the effects of exercise on mental processes are treated far less. From the literature reviewed in this study, one can draw the conclusion that it cannot be said with certainty that physical exercise has beneficial effects on mental processes, nor that it has potentially negative ones. This uncertainty, reflected in the inability to state precisely and unequivocally whether physical effort acts favorably or unfavorably on mental processes, is a prime reason to undertake a study of how the effort administered in physical education classes influences the dynamics of mental processes such as attention and memory.
Keywords: management, exercise, mental process, lesson.
185 Feature Selection Methods for an Improved SVM Classifier
Authors: Daniel Morariu, Lucian N. Vintan, Volker Tresp
Abstract:
Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. In this paper, three feature selection methods are evaluated: Random Selection, Information Gain (IG) and Support Vector Machine feature selection (called SVM_FS). We show that the best results were obtained with the SVM_FS method for a relatively small dimension of the feature vector. We also present a novel method to better correlate the SVM kernel's parameters (Polynomial or Gaussian kernel).
Keywords: Feature Selection, Learning with Kernels, Support Vector Machine, Classification.
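As one common reading of SVM-based feature selection, which may differ in detail from the SVM_FS procedure evaluated here, a sketch that trains a linear SVM and keeps the features with the largest absolute weights (scikit-learn and the toy data are illustrative assumptions):
```python
import numpy as np
from sklearn.svm import LinearSVC

def svm_feature_ranking(X, y, keep=2):
    """Rank features by |w| of a linear SVM and return the indices of the top `keep`."""
    clf = LinearSVC(C=1.0, dual=False).fit(X, y)
    importance = np.abs(clf.coef_).ravel()        # one weight per feature (binary case)
    return np.argsort(importance)[::-1][:keep]

# Toy data: only the first two features carry the class signal.
rng = np.random.default_rng(0)
signal = rng.integers(0, 2, size=200)
X = np.column_stack([
    signal + 0.1 * rng.standard_normal(200),      # informative
    1 - signal + 0.1 * rng.standard_normal(200),  # informative
    rng.standard_normal(200),                     # noise
    rng.standard_normal(200),                     # noise
])
print(svm_feature_ranking(X, signal))             # expected: indices 0 and 1
```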
184 Thermal Stability of a Vertical SOI-Based Capacitorless One-Transistor DRAM with Trench-Body Structure
Authors: Po-Hsieh Lin, Jyi-Tsong Lin
Abstract:
A vertical SOI-based MOSFET with a trench-body structure operated as a 1T DRAM cell has been studied and investigated at various temperatures. Different operation temperatures are assigned to the device for performance comparison, so that its thermal stability can be carefully evaluated for future memory device applications. Based on the simulation, the vertical SOI-based MOSFET with a trench-body structure demonstrates the expected electrical characteristics and possesses a conspicuous kink effect at the various operation temperatures. Transient characteristics were also examined to show that its programming window values and retention time behaviors remain acceptable when the new 1T DRAM cell is operated at high temperature.
Keywords: SOI, 1T DRAM, thermal stability.
183 Application of the Data Distribution Service for Flexible Manufacturing Automation
Authors: Marco Ryll, Svetan Ratchev
Abstract:
This paper discusses the applicability of the Data Distribution Service (DDS) to the development of automated and modular manufacturing systems which require a flexible and robust communication infrastructure. DDS is an emerging standard for data-centric publish/subscribe middleware systems that provides an infrastructure for platform-independent many-to-many communication. It particularly addresses the needs of real-time systems that require deterministic data transfer, have low memory footprints and have high robustness requirements. After an overview of the standard, several aspects of DDS are related to current challenges in the development of modern manufacturing systems with distributed architectures. Finally, an example application based on a modular active fixturing system is presented to illustrate the described aspects.
Keywords: Flexible Manufacturing, Publish/Subscribe, Plug & Produce.
182 An Embedded System Design for SRAM SEU Test
Authors: Kyoung Kun Lee, Soongyu Kwon, Jong Tae Kim
Abstract:
An embedded system for SEU (single event upset) testing needs to be designed so as to prevent system failure caused by high-energy particles while SEUs are being measured. An SEU is a phenomenon in which data in a semiconductor device is changed temporarily by high-energy particles. In this paper, we present an embedded system for SRAM (static random access memory) SEU testing. The SRAMs are placed on the DUT (device under test), which is separated from the control board that manages the DUT and measures the occurrence of SEUs. The design must take care to prevent system failure while managing the DUT and making an accurate measurement of SEUs. We measured the occurrence of SEUs in five different SRAMs at three different cyclotron beam energies: 30, 35, and 40 MeV. The number of SEUs per SRAM ranges from 3.75 to 261.00 on average.
Keywords: embedded system, single event upset, SRAM
181 Web Log Mining by an Improved AprioriAll Algorithm
Authors: Wang Tong, He Pi-lian
Abstract:
This paper sets forth the possibility and importance of applying data mining to Web log mining and points out some problems in conventional search engines. It then offers an improved algorithm based on the original AprioriAll algorithm, which has been widely used in Web log mining. The new algorithm adds the User ID property at every step of producing the candidate set and at every step of scanning the database, using it to decide whether an item in the candidate set should be put into the large set that will be used to produce the next candidate set. In the meantime, in order to reduce the number of database scans, the new algorithm uses the property of the Apriori algorithm to limit the size of the candidate set as soon as it is produced. Test results show that the improved algorithm has lower time and space complexity, suppresses noise better and fits the memory capacity.
Keywords: Candidate Sets Pruning, Data Mining, Improved Algorithm, Noise Restrain, Web Log
180 A Practical Distributed String Matching Algorithm Architecture and Implementation
Authors: Bi Kun, Gu Nai-jie, Tu Kun, Liu Xiao-hu, Liu Gang
Abstract:
Traditional parallel single string matching algorithms are usually based on the PRAM computation model and concentrate on cost-optimal design and theoretical speed. Based on the distributed string matching algorithm proposed by CHEN, a practical distributed string matching algorithm architecture is proposed in this paper, together with an improved single string matching algorithm based on a variant of the Boyer-Moore algorithm. We implement our algorithm on the above architecture, and the experiments prove that it is practical and efficient on a distributed memory machine. Its computation complexity is O(n/p + m), where n is the length of the text, m is the length of the pattern, and p is the number of processors.
Keywords: Boyer-Moore algorithm, distributed algorithm, parallel string matching, string matching.
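A toy illustration of the distributed idea behind the O(n/p + m) bound: split the text into p chunks that overlap by m-1 characters so no occurrence is lost at a boundary, then search each chunk independently. Python's built-in matcher stands in for the paper's Boyer-Moore variant, and the chunking scheme is assumed for illustration rather than taken from CHEN's architecture:
```python
def find_all(chunk, pattern, base):
    """Return absolute positions of pattern in chunk (chunk starts at offset `base`)."""
    hits, i = [], chunk.find(pattern)
    while i != -1:
        hits.append(base + i)
        i = chunk.find(pattern, i + 1)
    return hits

def distributed_match(text, pattern, p=4):
    n, m = len(text), len(pattern)
    size = -(-n // p)                       # ceil(n / p) characters per processor
    results = []
    for rank in range(p):
        start = rank * size
        end = min(n, start + size + m - 1)  # overlap of m-1 avoids missing boundary matches
        results.extend(find_all(text[start:end], pattern, start))
    return sorted(set(results))             # set() guards against duplicates near boundaries

print(distributed_match("abracadabra cadabra", "abra", p=3))  # [0, 7, 15]
```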
179 On Dialogue Systems Based on Deep Learning
Authors: Yifan Fan, Xudong Luo, Pingping Lin
Abstract:
Nowadays, dialogue systems are increasingly becoming the way for humans to access many computer systems, so that humans can interact with computers in natural language. A dialogue system consists of three parts: understanding what humans say in natural language, managing the dialogue, and generating responses in natural language. In this paper, we survey deep-learning-based methods for dialogue management, response generation and dialogue evaluation. Specifically, these methods are based on neural networks, long short-term memory networks, deep reinforcement learning, pre-training and generative adversarial networks. We compare these methods and point out further research directions.
Keywords: Dialogue management, response generation, reinforcement learning, deep learning, evaluation.
178 Momentum and Heat Transfer in the Flow of a Viscoelastic Fluid Past a Porous Flat Plate Subject to Suction or Blowing
Authors: Motahar Reza, Anadi Sankar Gupta
Abstract:
An analysis is made of the flow of an incompressible viscoelastic fluid (of small memory) over a porous plate subject to suction or blowing. It is found that the velocity at a point increases with an increase in the elasticity of the fluid. It is also shown that the wall shear stress depends only on suction and is independent of the material of the fluid. No steady solution for the velocity distribution exists when there is blowing at the plate. The temperature distribution in the boundary layer is determined, and it is found that the temperature at a point decreases with an increase in the elasticity of the fluid.
Keywords: Viscoelastic fluid, Flow past a porous plate, Heat transfer
177 Representing Uncertainty in Computer-Generated Forces
Authors: Ruibiao J. Guo, Brad Cain, Pierre Meunier
Abstract:
The Integrated Performance Modelling Environment (IPME) is a powerful simulation engine for task simulation and performance analysis. However, it has no high-level cognition, such as memory and reasoning, for complex simulation. This article introduces a knowledge representation and reasoning scheme that can accommodate uncertainty in simulations of military personnel with IPME. The approach demonstrates how advanced reasoning models that support similarity-based associative processes, rule-based abstract processes, multiple reasoning methods and real-time interaction can be integrated with conventional task network modelling to provide greater functionality and flexibility when modelling operator performance.
Keywords: Computer-Generated Forces, Human Behaviour Representation, IPME, Modelling and Simulation, Uncertainty Reasoning
176 Weka Based Desktop Data Mining as Web Service
Authors: Sujala.D.Shetty, S.Vadivel, Sakshi Vaghella
Abstract:
Data mining is the process of sifting through large volumes of data, analyzing the data from different perspectives and summarizing it into useful information. One of the widely used desktop applications for data mining is the Weka tool, which is a collection of machine learning algorithms implemented in Java and open sourced under the General Public License (GPL). A web service is a software system designed to support interoperable machine-to-machine interaction over a network using SOAP messages. Unlike a desktop application, a web service is easy to upgrade, deliver and access, and does not occupy any memory on the system. Keeping in mind the advantages of a web service over a desktop application, in this paper we demonstrate how this Java-based desktop data mining application can be implemented as a web service to support data mining across the internet.
Keywords: desktop application, Weka mining, web service
175 Content-Based Color Image Retrieval Based On 2-D Histogram and Statistical Moments
Authors: Khalid Elasnaoui, Brahim Aksasse, Mohammed Ouanan
Abstract:
In this paper, we are interested in the problem of finding similar images in a large database. For this purpose we propose a new algorithm based on a combination of 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window rather than on the intensity of a single pixel. This approach overcomes the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to escape the effects of the discretisation of the color space that is intrinsic to the use of histograms. We compare the performance of our new algorithm to various state-of-the-art methods and show that it has several advantages: it is fast, consumes little memory and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases.
Keywords: 2-D histogram, Statistical moments, Indexing, Similarity distance, Histogram intersection.
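For reference, a small numpy sketch of the classic histogram-intersection similarity that the method above builds on (shown here for plain 1-D hue histograms; the paper's 2-D, 3x3-window histogram and the moment terms are not reproduced):
```python
import numpy as np

def hue_histogram(hue, bins=32):
    """Normalised histogram of hue values in [0, 1)."""
    h, _ = np.histogram(hue, bins=bins, range=(0.0, 1.0))
    return h / max(h.sum(), 1)

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: 1 means identical distributions."""
    return np.minimum(h1, h2).sum()

rng = np.random.default_rng(0)
img_a = rng.random(10_000)          # stand-in for the hue channel of image A
img_b = np.clip(img_a + 0.02 * rng.standard_normal(10_000), 0, 0.999)
print(histogram_intersection(hue_histogram(img_a), hue_histogram(img_b)))
```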
174 Test Data Compression Using a Hybrid of Bitmask Dictionary and 2n Pattern Runlength Coding Methods
Authors: C. Kalamani, K. Paramasivam
Abstract:
In VLSI, testing plays an important role. The major problems in testing are test data volume and test power, and an important way to reduce test data volume and test time is test data compression. The proposed technique combines the bitmask dictionary and 2n pattern run-length coding methods and provides a substantial improvement in compression efficiency without introducing any additional decompression penalty. The method has been implemented using MATLAB and an HDL language to reduce test data volume and memory requirements. It is applied to various benchmark test sets and the results are compared with other existing methods. The proposed technique can achieve a compression ratio of up to 86%.
Keywords: Bitmask dictionary, 2n pattern run length code, system-on-chip, SOC, test data compression.
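As a rough illustration of the run-length half of such a hybrid scheme, a sketch that encodes runs of repeated bits in a test vector (a plain run-length coder; the paper's 2n-pattern variant and the bitmask dictionary stage are not reproduced here):
```python
def run_length_encode(bits):
    """Encode a bit string as (bit, run_length) pairs."""
    runs, i = [], 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append((bits[i], j - i))
        i = j
    return runs

def run_length_decode(runs):
    return "".join(bit * count for bit, count in runs)

vector = "0000001111111100000011"
encoded = run_length_encode(vector)
assert run_length_decode(encoded) == vector
print(encoded)   # [('0', 6), ('1', 8), ('0', 6), ('1', 2)]
```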
173 Performance Evaluation of Popular Hash Functions
Authors: Sheena Mathew, K. Poulose Jacob
Abstract:
This paper describes the results of an extensive study and comparison of the popular hash functions SHA-1, SHA-256, RIPEMD-160 and RIPEMD-320 with JERIM-320, a 320-bit hash function. The compression functions of hash functions like SHA-1 and SHA-256 are designed using serial successive iteration, whereas those of RIPEMD-160 and RIPEMD-320 are designed using two parallel lines of message processing. JERIM-320 uses four parallel lines of message processing, resulting in a higher level of security than the other hash functions at comparable speed and memory requirements. The performance evaluation of these methods has been done using practical implementations and also using step computation methods. JERIM-320 proves to be secure and ensures the integrity of messages to a higher degree. The focus of this work is to establish JERIM-320 as an alternative to the present-day hash functions for fast-growing internet applications.
Keywords: Cryptography, Hash function, JERIM-320, Message integrity
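A small sketch of the kind of throughput measurement such a comparison relies on, using Python's hashlib for the standard functions (JERIM-320 is not publicly packaged, and RIPEMD-160 availability depends on the local OpenSSL build, so it is treated as optional here):
```python
import hashlib
import time

def throughput_mb_s(name, data, rounds=50):
    start = time.perf_counter()
    for _ in range(rounds):
        hashlib.new(name, data).digest()
    elapsed = time.perf_counter() - start
    return rounds * len(data) / elapsed / 1e6

payload = b"\x5a" * (1 << 20)           # 1 MiB test message
for algo in ("sha1", "sha256", "ripemd160"):
    try:
        print(f"{algo:10s} {throughput_mb_s(algo, payload):8.1f} MB/s")
    except ValueError:                   # ripemd160 may be missing from this OpenSSL build
        print(f"{algo:10s} not available in this hashlib build")
```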
172 SMART: Solution Methods with Ants Running by Types
Authors: Nicolas Zufferey
Abstract:
Ant algorithms are well-known metaheuristics which have been widely used for two decades. In most of the literature, an ant is a constructive heuristic able to build a solution from scratch; however, other types of ant algorithms have recently emerged, so the discussion is not limited to the common framework of constructive ant algorithms. Generally, at each generation of an ant algorithm, each ant builds a solution step by step by adding an element to it. Each choice is based on the greedy force (also called the visibility, the short-term profit or the heuristic information) and the trail system (a central memory which collects historical information about the search process). Usually, all the ants of the population have the same characteristics and behaviors. In contrast, in this paper, a new type of ant metaheuristic is proposed, namely SMART (for Solution Methods with Ants Running by Types). It relies on the use of different populations of ants, where each population has its own personality.
Keywords: Optimization, Metaheuristics, Ant Algorithms, Evolutionary Procedures, Population-Based Methods.
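To make the "greedy force plus trail" choice rule concrete, a minimal sketch of one construction step in a generic ant algorithm (the alpha/beta weighting and roulette-wheel selection are the textbook formulation, not SMART's per-population rules):
```python
import random

def choose_next(candidates, trail, visibility, alpha=1.0, beta=2.0):
    """Pick the next element with probability proportional to trail^alpha * visibility^beta."""
    weights = [(trail[c] ** alpha) * (visibility[c] ** beta) for c in candidates]
    total = sum(weights)
    r, acc = random.uniform(0, total), 0.0
    for c, w in zip(candidates, weights):
        acc += w
        if acc >= r:
            return c
    return candidates[-1]

# Toy example: three candidate elements with their trail and greedy-force values.
trail = {"a": 0.5, "b": 1.5, "c": 1.0}
visibility = {"a": 0.8, "b": 0.2, "c": 0.6}
print(choose_next(["a", "b", "c"], trail, visibility))
```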
171 Grid-based Supervised Clustering - GBSC
Authors: Pornpimol Bungkomkhun, Surapong Auwatanamongkol
Abstract:
This paper presents a supervised clustering algorithm, namely Grid-Based Supervised Clustering (GBSC), which is able to identify clusters of any shape and size without presuming any canonical form for the data distribution. GBSC needs no pre-specified number of clusters, is insensitive to the order of the input data objects, and is capable of handling outliers. Built on a combination of grid-based clustering and density-based clustering, and assisted by the downward closure property of density used in bottom-up subspace clustering, GBSC can notably reduce its search space and so avoid running out of memory during its execution. On two-dimensional synthetic datasets, GBSC identifies clusters with different shapes and sizes correctly. GBSC also outperforms five other supervised clustering algorithms in experiments performed on several UCI datasets.
Keywords: supervised clustering, grid-based clustering, subspace clustering
170 A Recommender System Fusing Collaborative Filtering and User’s Review Mining
Authors: Seulbi Choi, Hyunchul Ahn
Abstract:
The collaborative filtering (CF) algorithm has been popularly used for recommender systems in both academic and practical applications. It basically generates recommendation results using users’ numeric ratings. However, the additional use of information other than user ratings may lead to better accuracy of CF. Considering that many people are likely to share their honest opinions on items they have purchased recently due to the advent of Web 2.0, users’ reviews can be regarded as a new informative source for identifying user preferences accurately. Against this background, this study presents a hybrid recommender system that fuses CF and user review mining. Our system adopts conventional memory-based CF, but it is designed to use both users’ numeric ratings and their text reviews on items when calculating similarities between users.
Keywords: Recommender system, collaborative filtering, text mining, review mining.
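A small sketch of what fusing the two signals could look like when computing user-user similarity: cosine similarity on co-rated items blended with cosine similarity on bag-of-words review vectors (the blending weight and the bag-of-words representation are illustrative assumptions, not the paper's exact design):
```python
import numpy as np
from collections import Counter

def cosine(u, v):
    nu, nv = np.linalg.norm(u), np.linalg.norm(v)
    return float(u @ v / (nu * nv)) if nu and nv else 0.0

def review_vector(text, vocab):
    counts = Counter(text.lower().split())
    return np.array([counts[w] for w in vocab], dtype=float)

def hybrid_similarity(ratings_a, ratings_b, review_a, review_b, vocab, w=0.6):
    """Weighted blend of rating-based and review-based similarity."""
    sim_ratings = cosine(np.asarray(ratings_a, float), np.asarray(ratings_b, float))
    sim_reviews = cosine(review_vector(review_a, vocab), review_vector(review_b, vocab))
    return w * sim_ratings + (1 - w) * sim_reviews

vocab = ["great", "battery", "poor", "camera"]
print(hybrid_similarity([5, 3, 0, 1], [4, 3, 1, 1],
                        "great battery great camera", "great camera poor battery", vocab))
```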
169 Remarks Regarding Queuing Model and Packet Loss Probability for the Traffic with Self-Similar Characteristics
Authors: Mihails Kulikovs, Ernests Petersons
Abstract:
Network management techniques have long been of interest to the networking research community. The queue size plays a critical role in network performance: an adequate queue size maintains Quality of Service (QoS) requirements within limited network capacity for as many users as possible. The appropriate estimation of the queuing model parameters is crucial both for the initial size estimation and during the process of resource allocation. An accurate resource allocation model for the management system increases network utilization. The present paper demonstrates the results of empirical observation of memory allocation for packet-based services.
Keywords: Queuing System, Packet Loss Probability, Measurement-Based Admission Control (MBAC), Performance evaluation, Quality of Service (QoS).
168 Comanche – A Compiler-Driven I/O Management System
Authors: Wendy Zhang, Ernst L. Leiss, Huilin Ye
Abstract:
Most scientific programs have large input and output data sets that require out-of-core programming or the use of virtual memory management (VMM). Out-of-core programming is very error-prone and tedious; as a result, it is generally avoided. However, in many instances VMM is not an effective approach because it often results in a substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
Keywords: I/O Management, Out-of-core, Compiler, Tile mapping.
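To illustrate the block/tile retrieval idea in user space, a small numpy sketch that streams a large on-disk matrix one tile of rows at a time instead of mapping the whole array (a generic out-of-core pattern; the file layout and tile size here are assumptions, not Comanche's actual runtime):
```python
import numpy as np

ROWS, COLS, TILE_ROWS = 10_000, 256, 512
PATH = "big_matrix.f64"                      # hypothetical data file, row-major float64

# Create a sample file once so the example is self-contained.
np.arange(ROWS * COLS, dtype=np.float64).tofile(PATH)

def column_sums_out_of_core(path, rows, cols, tile_rows):
    """Accumulate column sums while holding only one tile in memory at a time."""
    totals = np.zeros(cols)
    with open(path, "rb") as f:
        for start in range(0, rows, tile_rows):
            count = min(tile_rows, rows - start) * cols
            tile = np.fromfile(f, dtype=np.float64, count=count).reshape(-1, cols)
            totals += tile.sum(axis=0)
    return totals

print(column_sums_out_of_core(PATH, ROWS, COLS, TILE_ROWS)[:4])
```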
167 Learning Based On Computer Science Unplugged in Computer Science Education: Design, Development, and Assessment
Authors: Eiko Takaoka, Yoshiyuki Fukushima, Koichiro Hirose, Tadashi Hasegawa
Abstract:
Although all high school students in Japan are required to learn informatics, many of them do not learn this topic sufficiently. In response to this situation, we propose a support package for high school informatics classes. To examine what students learned and whether they sufficiently understood the context of the lessons, a questionnaire survey was distributed to 186 students. We analyzed the results of the questionnaire and determined the weakest units, which were “basic computer configuration” and “memory and secondary storage”. We then developed a package for teaching these units. We propose that our package be applied in high school classrooms.
Keywords: Computer Science Unplugged, computer science outreach, high school curriculum, experimental evaluation.
166 Harrison’s Stolen: Addressing Aboriginal and Indigenous Islanders Human Rights
Authors: M. Shukry
Abstract:
According to the United Nations Declaration of Human Rights in 1948, every human being is entitled to rights in life that should be respected by others and protected by the state and community. Such rights are inherent regardless of colour, ethnicity, gender, religion or otherwise, and it is expected that all humans alike have the right to live without discrimination of any sort. However, that has not been the case with Aborigines in Australia. Over a long period of time, the governments of the State and the Territories and the Australian Commonwealth denied the Aboriginal and Indigenous inhabitants of the Torres Strait Islands such rights. Past Australian governments set policies and laws that enabled them to forcefully remove Indigenous children from their parents, which resulted in creating lost generations living the trauma of the loss of cultural identity, alienation and even their own selfhood. Intending to reduce the native population and its Aboriginal culture while, on the other hand, assimilating those children into mainstream society, they gave themselves the right to remove them from their families with no hope of return. That practice has led to tragic consequences due to the trauma that has affected those children, an experience that is depicted by Jane Harrison in her play Stolen. The drama is the outcome of a six-year project on lost children and was first performed in 1997 in Melbourne. Only five actors appear on the stage, playing all the different characters, whether the main protagonists or the remaining cast, present on stage or heard only as voices. The play outlines the lives of five children who were taken from their parents at an early age, with a disastrous negative impact that differs from one child to another. Unknown to each other, what connects them is having been placed in a children’s home. The purpose of this paper is to analyse the play’s text in light of the 1948 Declaration of Human Rights, using it as a lens that reflects the atrocities practiced against the Aborigines. It highlights how such practices formed an outrageous violation of those natives’ rights as human beings. Harrison conveys the children’s experiences through a non-linear structure, fluctuating between a past and a present that are linked within each of the five characters, reflecting their suffering and pain to create an emotional link between them and the audience. Her dramatic handling of the issue, fusing tragedy with humour as well as symbolism, is a successful technique for revealing the traumatic memory of those children and their present lives. The play has made a difference in beginning to address the right of all children to be with their families, which gives real meaning to having a home and an identity as a people.
Keywords: Aboriginal, audience, Australia, children, culture, drama, home, human rights, identity, indigenous, Jane Harrison, memory, scenic effects, setting, stage, stage directions, Stolen, trauma.
165 Real-Time Image Analysis of Capsule Endoscopy for Bleeding Discrimination in Embedded System Platform
Authors: Yong-Gyu Lee, Gilwon Yoon
Abstract:
Image processing for capsule endoscopy requires large memory, and diagnosis takes hours since the operation time is normally more than 8 hours. A real-time analysis algorithm for capsule images can therefore be clinically very useful: it can differentiate abnormal tissue from healthy structures and provide correlation information among the images. Bleeding is our interest in this regard, and we propose a method of detecting frames with potential bleeding in real time. Our detection algorithm is based on statistical analysis and the shapes of bleeding spots. We tested our algorithm with 30 cases of capsule endoscopy of the digestive tract. Results were excellent: a sensitivity of 99% and a specificity of 97% were achieved in detecting the image frames with bleeding spots.
Keywords: bleeding, capsule endoscopy, image processing, real time analysis
164 Grid-HPA: Predicting Resource Requirements of a Job in the Grid Computing Environment
Authors: M. Bohlouli, M. Analoui
Abstract:
For complete support of Quality of Service, it is better if the environment itself predicts the resource requirements of a job by using special methods in Grid computing. Exact and correct prediction leads to exact matching of required resources with available resources. After the execution of each job, the used resources are saved in an active database named "History". First, some attributes are extracted from the new job and, according to a defined similarity algorithm, the most similar previously executed job is retrieved from "History"; using statistical measures such as linear regression or averaging, the resource requirements are then predicted. The new idea in this research is based on an active database and centralized history maintenance. Implementation and testing of the proposed architecture results in an accuracy of 96.68% in predicting the CPU usage of jobs, 91.29% for memory usage and 89.80% for bandwidth usage.
Keywords: Active Database, Grid Computing, Resource Requirement Prediction, Scheduling
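A rough sketch of the prediction step described above: fit a simple linear regression over the resource usage of the most similar historical jobs and extrapolate for the new job (the single "job size" feature and the sample history are illustrative assumptions, not Grid-HPA's actual attributes):
```python
import numpy as np

# Hypothetical history: (job_size_feature, observed_cpu_seconds) of similar past jobs.
history = np.array([
    [10.0,  52.0],
    [20.0, 101.0],
    [30.0, 148.0],
    [40.0, 205.0],
])

def predict_cpu(history, new_job_size):
    """Least-squares line through the similar jobs, evaluated at the new job's size."""
    sizes, cpu = history[:, 0], history[:, 1]
    slope, intercept = np.polyfit(sizes, cpu, deg=1)
    return slope * new_job_size + intercept

print(round(predict_cpu(history, 25.0), 1))   # interpolated CPU-time estimate
```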
163 A Pipelined FSBM Hardware Architecture for HTDV-H.26x
Authors: H. Loukil, A. Ben Atitallah, F. Ghozzi, M. A. Ben Ayed, N. Masmoudi
Abstract:
In the MPEG and H.26x standards, motion estimation is used to eliminate temporal redundancy. Given that the motion estimation stage is very complex in terms of computational effort, a hardware implementation on a reconfigurable circuit is crucial for the requirements of different real-time multimedia applications. In this paper, we present a hardware architecture for motion estimation based on the "Full Search Block Matching" (FSBM) algorithm. This architecture offers minimum latency, maximum throughput and full utilization of hardware resources such as embedded memory blocks, and combines both pipelining and parallel processing techniques. Our design is described in the VHDL language, verified by simulation and implemented in a Stratix II EP2S130F1020C4 FPGA circuit. The experimental results show that the optimum operating clock frequency of the proposed design is 89 MHz, which achieves 160M pixels/sec.
Keywords: SAD, FSBM, Hardware Implementation, FPGA.
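For readers unfamiliar with FSBM, a small software reference of what the hardware parallelizes: exhaustively search a window around a block's position and keep the displacement with the smallest sum of absolute differences (the block and window sizes here are arbitrary illustrative choices):
```python
import numpy as np

def full_search_block_matching(ref, cur, top, left, block=8, search=7):
    """Return the motion vector (dy, dx) minimising SAD for one block of `cur`."""
    target = cur[top:top + block, left:left + block].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue                      # candidate block falls outside the frame
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(target - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, -3), axis=(0, 1))    # whole frame shifted down 2, left 3
print(full_search_block_matching(ref, cur, top=24, left=24))  # expect ((-2, 3), 0)
```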