Search results for: faster
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 819

609 Implementation of Elliptic Curve Cryptography Encryption Engine on a FPGA

Authors: Mohamad Khairi Ishak

Abstract:

Conventional public-key cryptosystems such as RSA (Ron Rivest, Adi Shamir and Leonard Adleman), DSA (Digital Signature Algorithm), and ElGamal are no longer efficient to implement on small, memory-constrained devices. Elliptic Curve Cryptography (ECC), which allows smaller key lengths than conventional public-key cryptosystems, has thus become a very attractive choice for many applications. This paper describes the implementation of an ECC encryption engine on an FPGA. The system has been implemented for two key sizes, 131 bits and 163 bits, and area and timing analyses are provided for both for comparison. The cryptosystem, implemented on Altera's EPF10K200SBC600-1, occupies 5945/9984 logic cells for the 131-bit implementation and 6913/9984 for the 163-bit implementation. It operates at up to 43 MHz and performs a point multiplication in 11.3 ms for the 131-bit implementation and 14.9 ms for the 163-bit implementation. In terms of speed, our cryptosystem is about 8 times faster than a software implementation of the same system.
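The point multiplication that dominates the engine's runtime can be sketched with the classic double-and-add loop. The curve below (y² = x³ + 2x + 2 over F₁₇, a small textbook prime-field example) is only illustrative; the paper's engine works on 131/163-bit curves in hardware.

```python
P_MOD = 17          # field prime of the toy curve
A = 2               # coefficient a in y^2 = x^3 + ax + b

def point_add(p, q):
    """Add two curve points; None represents the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # p + (-p) = infinity
    if p == q:                                        # point doubling
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                             # ordinary addition
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    y3 = (s * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def scalar_mult(k, p):
    """Compute k*p with the MSB-first double-and-add loop."""
    result = None
    for bit in bin(k)[2:]:
        result = point_add(result, result)            # double
        if bit == '1':
            result = point_add(result, p)             # add
    return result
```

The hardware implementation pipelines exactly these field multiplications and inversions, which is where the 8x speedup over software comes from.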

Keywords: elliptic curve cryptography, FPGA, key sizes, memory

Procedia PDF Downloads 294
608 Generic Hybrid Models for Two-Dimensional Ultrasonic Guided Wave Problems

Authors: Manoj Reghu, Prabhu Rajagopal, C. V. Krishnamurthy, Krishnan Balasubramaniam

Abstract:

A thorough understanding of guided ultrasonic wave behavior in structures is essential for the application of existing Non Destructive Evaluation (NDE) technologies, as well as for the development of new methods. However, the analysis of guided wave phenomena is challenging because of their complex dispersive and multimodal nature. Although numerical solution procedures have proven very useful in this regard, the increasing complexity of the features and defects to be considered, as well as the desire to improve inspection accuracy, often imposes a large computational cost. Hybrid models that combine numerical solutions for wave scattering with faster alternative methods for wave propagation have long been considered a solution to this problem. However, such models usually require modification of the base code of the solution procedure. Here we aim to develop generic hybrid models that can be applied directly to any two different solution procedures. With this goal in mind, a Numerical Hybrid model and an Analytical-Numerical Hybrid model have been developed. The concept and implementation of these hybrid models are discussed in this paper.

Keywords: guided ultrasonic waves, Finite Element Method (FEM), Hybrid model

Procedia PDF Downloads 441
607 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof for digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and supporting ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing models and is therefore adopted for the embedding process. An optional step of encrypting the text watermark before embedding is also suggested (where required by some applications); the text can be encrypted using any enciphering technique, adding more difficulty for hackers. Experiments resulted in an embedding speed of more than double that of the other systems considered (such as the least significant bit method and separate color code methods), and a fairly acceptable peak signal-to-noise ratio (PSNR) with low mean square error values for watermarking purposes.
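The RGB-to-YIQ conversion the abstract relies on uses the standard NTSC matrix; a minimal sketch follows. The embedding logic itself is the paper's own and is not reproduced here, only the color-space step.

```python
def rgb_to_yiq(r, g, b):
    """Convert normalized RGB (0..1) to YIQ with the NTSC matrix."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

def yiq_to_rgb(y, i, q):
    """Approximate inverse transform back to RGB."""
    r = y + 0.956 * i + 0.621 * q
    g = y - 0.272 * i - 0.647 * q
    b = y - 1.106 * i + 1.703 * q
    return r, g, b
```

A watermark embedded in the I or Q chrominance channels survives the round trip back to RGB with only small numerical error, which is one reason the model suits fast embedding.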

Keywords: steganography, watermarking, time complexity measurements, private keys

Procedia PDF Downloads 126
606 Knowledge Engineering Based Smart Healthcare Solution

Authors: Rhaed Khiati, Muhammad Hanif

Abstract:

In the past decade, smart healthcare systems have been on an upward trend, especially with the evolution of hospitals and their increasing reliance on bioinformatics and healthcare software. Doctors have become more reliant on technology than ever, something that in the past would have been looked down upon, as technology has become imperative for reducing overall costs and improving the quality of patient care. With patient-doctor interactions becoming more necessary and more complicated than ever, systems must be developed with costs, patient comfort, and patient data, among other things, in mind. In this work, we propose a smart hospital bed that combines the complexity and big-data usage of traditional healthcare systems with the comfort of a soft bed, while taking concerns such as data confidentiality, security, and compliance with SLA agreements into account. This work potentially provides users, namely patients and doctors, with seamless interaction with their respective nurses, as well as faster access to up-to-date personal data, including prescriptions and the severity of the condition, in contrast to previous research in the area, which lacks such provisions.

Keywords: big data, smart healthcare, distributed systems, bioinformatics

Procedia PDF Downloads 180
605 Deployed Confidence: Testing in Production

Authors: Shreya Asthana

Abstract:

Testers only know that a feature they verified on staging works in production after the release goes live. Sometimes something breaks in production, and testers learn of it through a bug raised by an end user. Panic sets in when staging test results do not reflect current production behavior, and testers start doubting their skills when a user finally reports the bug. Testers can deploy their confidence on release day by testing in production. Testing in production improves test result accuracy because tests run on real-time data, and execution is somewhat faster than on staging due to the elimination of bad data. Feature flagging, canary releases, and data cleanup help to achieve this technique. This paper explains the steps to achieve production testing before making a feature live, and how to modify a company's testing procedure so that testers can provide a bug-free experience to end users. The study is valuable because many people believe testing should be done only in staging, not in production, and it is high time to move from that old testing mindset into a new testing world. At the end of the day, what matters is whether the features work in production.
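The feature flagging mentioned above is typically implemented as a deterministic percentage rollout: hashing the user id gives each user a stable bucket, so the same user always sees the same variant during a canary. This is a minimal sketch under assumed names (`feature`, `user_id` are illustrative, not from the paper).

```python
import hashlib

def is_enabled(feature: str, user_id: str, rollout_percent: int) -> bool:
    """Deterministically enable `feature` for `rollout_percent`% of users."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # stable bucket in 0..99
    return bucket < rollout_percent
```

Because the bucket is derived from a hash rather than a random draw, ramping the percentage up only ever adds users; nobody flips back and forth between variants mid-test.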

Keywords: bug free production, new testing mindset, testing strategy, testing approach

Procedia PDF Downloads 49
604 Reaction Rate of Olive Stone during Combustion in a Bubbling Fluidized Bed

Authors: A. Soria-Verdugo, M. Rubio-Rubio, J. Arrieta, N. García-Hernando

Abstract:

Combustion of biomass is a promising alternative for reducing the high pollutant emission levels associated with the combustion of fossil fuels, owing to the net-zero CO2 emission attributed to biomass. However, the biomass selected should also have low nitrogen and sulfur contents to limit the NOx and SOx emissions derived from its combustion. In this sense, olive stone is an excellent fuel for powering combustion reactors with reduced levels of pollutant emissions. In this work, the combustion of olive stone particles is analyzed experimentally in a thermogravimetric analyzer (TGA) and in a bubbling fluidized bed (BFB) reactor. The bubbling fluidized bed reactor was installed on a scale, forming a macro-TGA. With both instruments, the evolution of the sample mass was recorded as the combustion process progressed. The results show a much faster combustion process in the bubbling fluidized bed reactor than in the thermogravimetric analyzer, due to the higher heat transfer coefficient and the abrasion of the fuel particles by the bed material in the BFB reactor.
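From the recorded mass histories, a conversion rate can be recovered in both setups in the same way; a small sketch with made-up numbers (not the paper's measurements) shows the usual bookkeeping.

```python
def conversion(m, m0, m_ash):
    """Burned fraction X(t) from current mass m, initial mass m0 and ash mass."""
    return (m0 - m) / (m0 - m_ash)

def rate(times, masses, m0, m_ash):
    """Central-difference estimate of dX/dt at interior sample points."""
    xs = [conversion(m, m0, m_ash) for m in masses]
    return [(xs[k + 1] - xs[k - 1]) / (times[k + 1] - times[k - 1])
            for k in range(1, len(xs) - 1)]
```

Comparing the dX/dt curves from the TGA and the macro-TGA directly quantifies how much faster the fluidized bed burns the same fuel.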

Keywords: olive stone, combustion, reaction rate, fluidized bed

Procedia PDF Downloads 180
603 The Superhydrophobic Surface Effect on Laminar Boundary Layer Flows

Authors: Chia-Yung Chou, Che-Chuan Cheng, Chin Chi Hsu, Chun-Hui Wu

Abstract:

This study investigates boundary layer flow as fluid passes over a superhydrophobic surface. The superhydrophobic surface is assembled into an observation channel for fluid experiments. The fluid in the channel is seeded with flow-visualization particles and pumped by a syringe pump into the observation channel through a pipeline. Under polarized light, the movement of the particles in the channel is captured by a high-speed camera, and the particle velocities are analyzed with MATLAB to determine the changes the surface causes in the velocity field of the fluid boundary layer. This study found that the superhydrophobic surface effectively increases the velocity near the wall, an effect that strengthens as the flow rate increases. The superhydrophobic surface also exhibits a longer slip length than the plain surface. In the calculation of the drag coefficient, the superhydrophobic surface produces a lower value, and the difference becomes more significant as the Reynolds number of the flow field decreases.
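The slip length reported above is conventionally obtained by fitting the near-wall velocity profile u(y) = u_wall + g·y and extrapolating to the depth below the wall where u would vanish, b = u_wall / g. A minimal least-squares sketch (synthetic profile, not the study's PIV data):

```python
def slip_length(ys, us):
    """Fit u(y) = u_wall + g*y near the wall and return b = u_wall / g,
    the distance below the wall at which the profile extrapolates to zero."""
    n = len(ys)
    ym = sum(ys) / n
    um = sum(us) / n
    g = (sum((y - ym) * (u - um) for y, u in zip(ys, us))
         / sum((y - ym) ** 2 for y in ys))       # wall shear rate du/dy
    u_wall = um - g * ym                         # extrapolated wall velocity
    return u_wall / g
```

A plain no-slip surface gives u_wall ≈ 0 and hence b ≈ 0; the superhydrophobic surface shows up as a clearly positive b.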

Keywords: hydrophobic, boundary layer, slip length, friction

Procedia PDF Downloads 124
602 Ta-DAH: Task Driven Automated Hardware Design of Free-Flying Space Robots

Authors: Lucy Jackson, Celyn Walters, Steve Eckersley, Mini Rai, Simon Hadfield

Abstract:

Space robots will play an integral part in exploring the universe and beyond. A correctly designed space robot will facilitate on-orbit assembly (OOA), satellite servicing and active debris removal (ADR). However, problems arise when trying to design such a system, as it is a highly complex, multidimensional problem on which there is little research. Current design techniques are slow and specific to terrestrial manipulators. This paper presents a solution to the slow speed of robotic hardware design and generalizes the technique to free-flying space robots. It presents Ta-DAH Design, an automated design approach that utilises a multi-objective cost function in an iterative and automated pipeline. The design approach leverages prior knowledge and facilitates faster output of optimal designs. The result is a system that can optimise the size of the base spacecraft, the manipulator and some key subsystems for any given task. Presented in this work are the methodology behind Ta-DAH Design and a number of optimal space robot designs.

Keywords: space robots, automated design, on-orbit operations, hardware design

Procedia PDF Downloads 53
601 Planning for a Smart Sustainable City: A Case Study

Authors: Ajaykumar Kambekar, Nikita Kalantri

Abstract:

Due to faster urbanization, developing nations will have to look towards establishing new planned cities that are environmentally friendly. With the growth of Information and Communication Technology (ICT), the rise of smart cities is witnessed as a promising trend for future growth; however, technology alone cannot make a city smart. Cities must use smart systems to enhance the quality of life of their citizens and to achieve sustainable growth. Recent trends in technology may offer some indication of how to harness our cities' potential as the new engines of sustainable growth. To overcome the problems of mega-urbanization, the concept of smart cities has been introduced. The current research aims to reduce the knowledge gap in urban planning by exploring the concept of smart cities with sustainability as the major focus. The aim of this paper is to plan for an entire smart city. The paper analyses sustainable development and identifies the key factors for the creation of future smart cities. The study also emphasizes the use of advanced planning and scheduling software such as Microsoft Project (MSP).

Keywords: urbanization, planned cities, information and communication technology, sustainable growth

Procedia PDF Downloads 287
600 Execution of Optimization Algorithm in Cascaded H-Bridge Multilevel Inverter

Authors: M. Suresh Kumar, K. Ramani

Abstract:

This paper proposes harmonic elimination for a cascaded H-bridge multilevel inverter using the Selective Harmonic Elimination-Pulse Width Modulation (SHE-PWM) method programmed with a Particle Swarm Optimization (PSO) algorithm. The PSO method efficiently determines the switching angles required to eliminate low-order harmonics up to the 11th order from the inverter output voltage waveform while keeping the magnitude of the fundamental component at the desired value. The results demonstrate that the proposed method efficiently eliminates a great number of specific harmonics and yields an output voltage with minimum total harmonic distortion. The results also show that the PSO algorithm reaches the global solution faster than other algorithms.
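A generic PSO loop of the kind used here can be sketched briefly. The actual objective (the selective-harmonic-elimination equations in the switching angles) is the paper's; for illustration it is replaced below by a simple sphere function, and the swarm constants are the commonly used defaults, not necessarily the paper's settings.

```python
import random

def pso(f, dim, n_particles=30, iters=200, seed=1):
    """Minimize f over R^dim with a basic global-best particle swarm."""
    rng = random.Random(seed)
    w, c1, c2 = 0.72, 1.49, 1.49                   # standard PSO constants
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                     # personal bests
    pval = [f(x) for x in xs]
    gbest = pbest[pval.index(min(pval))][:]        # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            v = f(xs[i])
            if v < pval[i]:
                pval[i], pbest[i] = v, xs[i][:]
                if v < f(gbest):
                    gbest = xs[i][:]
    return gbest
```

For the SHE-PWM problem, `f` would be the sum of squared errors between the target and actual fundamental plus the magnitudes of the harmonics to be eliminated, evaluated at the candidate switching angles.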

Keywords: multi-level inverter, Selective Harmonic Elimination Pulse Width Modulation (SHEPWM), Particle Swarm Optimization (PSO), Total Harmonic Distortion (THD)

Procedia PDF Downloads 586
599 A NoSQL Based Approach for Real-Time Management of Robotics Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data, which has driven the emergence of new data management solutions: NoSQL databases. They have spread across several areas, including personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication, and fraud detection. These database management systems are on the rise: they store data very well, but with the trend of big data, new challenges demand new structures and methods for managing enterprise data. Intelligent machines, such as those in the e-learning sector, thrive on more data; with more data, smart machines can learn more and learn faster. Robotics is the use case on which we focus our tests. With ordinary approaches, robotics faces severe limits in managing and finding exact information in real time; implementing NoSQL for robotics wrestles all the data the robots acquire into usable form. Our proposed approach is demonstrated by experimental studies and a running example used as a use case.

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 328
598 Alternate Dispute Resolution: Expeditious Justice

Authors: Uzma Fakhar, Osama Fakhar, Aamir Shafiq Ch

Abstract:

Methods of alternate dispute resolution (ADR), such as conciliation, arbitration, and mediation, supplement the courts to ensure inexpensive and expeditious justice in a country. Delayed justice has not only created chaos but has also fostered a rebellious attitude towards the judiciary among the people. The complexity and diversity of the traditional judicial system have undermined its overall coherence. Admittedly, in Pakistan the traditional judicial system has failed to achieve its goals, resulting in a backlog of cases pending in the courts; consequently, even critics of alternate dispute resolution agree to restore the spirit of expeditious justice by reforming the old Panchayat system. The Government is keen to enact laws and make amendments that facilitate the resolution of disputes through a simple and faster ADR framework instead of a lengthy and exhausting trial, in order to build faith in alternate dispute resolution. This research highlights the value of ADR in a country like Pakistan for reviving people's confidence in the judicial process and as a useful judicial tool to reduce the pressure on the judiciary.

Keywords: alternate dispute resolution, development of law, expeditious justice, Pakistan

Procedia PDF Downloads 199
597 Stroma-Providing Activity of Adipose Derived Mesenchymal Stromal Cells in Tissue-Related O2 Microenvironment

Authors: P. I. Bobyleva, E. R. Andreeva, I. V. Andrianova, E. V. Maslova, L. B. Buravkova

Abstract:

This work studied the ability of adipose tissue-derived mesenchymal stromal cells (MSCs) to form stroma for the expansion of cord blood hematopoietic cells. We showed that 72-hour interaction of MSCs with cord blood mononuclear cells (MNCs) in vitro at atmospheric (20%) and low (5%) O2 conditions increased the expression of ICAM-1 and HCAM (at the beginning of interaction) on MSCs. The viability of MSCs and MNCs was maintained at a high level. Adhesion of MNCs to MSCs was faster at 20% O2. MSCs promoted the proliferation of adhered MNCs to form a suspension containing a great number of hematopoietic colony-forming units, and this effect was more pronounced at 5% O2. Thus, adipose-derived MSCs supplied sufficient stromal support to cord blood MNCs both at 20% and 5% O2, providing their adhesion with further expansion of new generations of different hematopoietic lineages.

Keywords: hematopoietic stem and progenitor cells, mesenchymal stromal cells, tissue-related oxygen, adipose tissue

Procedia PDF Downloads 403
596 Genodata: The Human Genome Variation Using BigData

Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta

Abstract:

Since the completion of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. That project was the first major leap in the field of medical research, especially in genomics, and won accolades by using a concept called Big Data, which had earlier been used extensively to create business value. Big Data makes use of data sets in files of terabytes, petabytes, or exabytes, which were traditionally managed using spreadsheets and RDBMSs. The sheer volume of data made the process tedious and time-consuming, and hence a stronger framework called Hadoop was introduced in the field of genetic science to make data processing faster and more efficient. This paper focuses on using SPARK, which is gaining momentum with the advancement of Big Data technologies. Cloud storage is an effective medium for storing the large data sets generated by genetic research and the result sets produced by SPARK analysis.
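The Hadoop/SPARK processing described above follows the map-reduce pattern; a single-machine sketch of that pattern using only the standard library makes the idea concrete. The records and variant ids below are made up for illustration, and a real SPARK job would distribute the same two stages across a cluster.

```python
from collections import Counter
from functools import reduce

def mapper(record):
    """Emit a (variant_id -> 1) partial count, as a map stage would."""
    return Counter({record["variant"]: 1})

def reducer(a, b):
    """Merge partial counts, as a shuffle/reduce stage would."""
    return a + b

records = [{"variant": "rs123"}, {"variant": "rs456"}, {"variant": "rs123"}]
counts = reduce(reducer, map(mapper, records), Counter())
```

Because both stages are associative and stateless, the framework is free to run them in parallel over arbitrarily partitioned genomic data, which is the source of the speedup.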

Keywords: human genome project, Bigdata, genomic data, SPARK, cloud storage, Hadoop

Procedia PDF Downloads 232
595 Rheolaser: Light Scattering Characterization of Viscoelastic Properties of Hair Cosmetics That Are Related to Performance and Stability of the Respective Colloidal Soft Materials

Authors: Heitor Oliveira, Gabriele De-Waal, Juergen Schmenger, Lynsey Godfrey, Tibor Kovacs

Abstract:

Rheolaser MASTER™ makes use of multiple scattering of light, caused by scattering objects in a continuous medium (such as droplets and particles in colloids), to characterize the viscoelasticity of soft materials. It offers an alternative to conventional rheometers for characterizing the viscoelasticity of products such as hair cosmetics. Up to six measurements at controlled temperature can be carried out simultaneously (10-15 min), and the method requires only minor sample preparation. Unlike conventional rheometer-based methods, no mechanical stress is applied to the material during the measurements; therefore, the properties of the exact same sample can be monitored over time, as in aging and stability studies. We determined the elastic index (EI) of water/emulsion mixtures (1 ≤ fat alcohols (FA) ≤ 5 wt%) and emulsion/gel-network mixtures (8 ≤ FA ≤ 17 wt%) and compared it with the elastic/storage modulus (G') of the respective samples measured on a TA conventional rheometer with flat-plate geometry. As expected, log(EI) vs. log(G') was found to be linear. Moreover, log(EI) increased linearly with solids level over the entire range of compositions (1 ≤ FA ≤ 17 wt%), while rheometer measurements were limited to samples down to 4 wt% solids; for more dilute samples a concentric-cylinder geometry would be required, and rheometer results from different sample-holder geometries are not comparable. Plots of the Rheolaser output parameter solid-liquid balance (SLB) vs. EI were suitable for monitoring product aging processes; these data could quantitatively describe observations such as the formation of lumps over aging time. Moreover, this method allowed us to identify that different specifications of a key raw material (RM < 0.4 wt%) in the respective gel-network (GN) product have minor impact on product viscoelastic properties and are not consumer-perceivable after a short aging time. Broadening an RM spec range typically has a positive impact on cost savings. Last but not least, the photon path length (λ*), proportional to droplet size and inversely proportional to the volume fraction of scattering objects according to Mie theory, together with the EI, was suitable for characterizing product destabilization processes (e.g., coalescence and creaming) and for predicting product stability about eight times faster than our standard methods. Using these parameters we could successfully identify formulation and process parameters that resulted in unstable products. In conclusion, Rheolaser allows quick and reliable characterization of the viscoelastic properties of hair cosmetics that are related to their performance and stability. It operates over a broad range of product compositions and has applications spanning from the formulation of our hair cosmetics to fast release criteria at our production sites. Last but not least, this powerful tool has a positive impact on R&D development time, enabling faster delivery of new products to the market, and consequently on cost savings.

Keywords: colloids, hair cosmetics, light scattering, performance and stability, soft materials, viscoelastic properties

Procedia PDF Downloads 153
594 Photocatalytic Degradation of Bisphenol A Using ZnO Nanoparticles as Catalyst under UV/Solar Light: Effect of Different Parameters and Kinetic Studies

Authors: Farida Kaouah, Chahida Oussalah, Wassila Hachi, Salim Boumaza, Mohamed Trari

Abstract:

ZnO nanoparticles were used as the catalyst in a photocatalytic treatment process for the degradation of bisphenol A (BPA) in aqueous solution. To this end, the effects of parameters such as the catalyst dose, the initial BPA concentration, and the pH on the photocatalytic degradation of BPA were studied. The results reveal that the maximum degradation of BPA (more than 93%) occurred with the ZnO catalyst within 120 min of stirring at natural pH (7.1) under solar light irradiation. Chemical oxygen demand (COD) reduction was found to proceed at a faster rate under solar light than under UV light. Kinetic studies revealed that the photocatalytic degradation process obeys a Langmuir-Hinshelwood model and follows a pseudo-first-order rate expression. This work demonstrates the great potential of sunlight-mediated photocatalysis for the removal of bisphenol A from wastewater.
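The pseudo-first-order analysis mentioned above rests on C(t) = C0·exp(-kt), so a plot of ln(C0/C) against t is a straight line through the origin with slope k. A small fitting sketch with synthetic data (not the paper's measurements):

```python
import math

def fit_pseudo_first_order(times, concs, c0):
    """Least-squares slope of ln(c0/C) vs t, constrained through the origin."""
    ys = [math.log(c0 / c) for c in concs]
    return sum(t * y for t, y in zip(times, ys)) / sum(t * t for t in times)
```

The same fit applied to the solar-light and UV-light runs gives the two rate constants whose ratio quantifies how much faster the sunlight-driven degradation proceeds.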

Keywords: bisphenol A, photocatalytic degradation, sunlight, zinc oxide, Langmuir–Hinshelwood model, chemical oxygen demand

Procedia PDF Downloads 133
593 Data Rate Based Grouping Scheme for Cooperative Communications in Wireless LANs

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards provide multiple transmission rates, which can be changed dynamically according to the channel condition. Cooperative communications were introduced to improve the overall performance of wireless LANs with the help of relay nodes with higher transmission rates. Cooperative communications are based on the fact that transmission is much faster when data packets are sent to a destination node through a relay node with a higher transmission rate, rather than directly to the destination node at a low transmission rate. To apply cooperative communications in wireless LANs, several MAC protocols have been proposed; some of them can result in collisions among relay nodes in a dense network. To solve this problem, we propose a new protocol: relay nodes are grouped based on their transmission rates, and only the relay nodes in the highest group attempt channel access. Performance evaluation, conducted by simulation, shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and collision probability.
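The relaying trade-off and the rate-based grouping can be sketched in a few lines: a packet is relayed when the two-hop airtime beats the slow direct link, and only the fastest group of candidate relays contends for the channel. Rates in Mb/s and packet lengths in bits are illustrative, not the paper's simulation parameters.

```python
def airtime_direct(bits, rate_sd):
    """Airtime of a direct source->destination transmission (rate in Mb/s)."""
    return bits / (rate_sd * 1e6)

def airtime_relayed(bits, rate_sr, rate_rd):
    """Airtime of a two-hop source->relay->destination transmission."""
    return bits / (rate_sr * 1e6) + bits / (rate_rd * 1e6)

def should_relay(bits, rate_sd, rate_sr, rate_rd):
    """Relay only when the two hops together are faster than the direct link."""
    return airtime_relayed(bits, rate_sr, rate_rd) < airtime_direct(bits, rate_sd)

def highest_rate_group(relays):
    """Keep only the fastest rate group of candidate relays, mirroring the
    proposed contention-reduction scheme; relays are (name, rate) pairs."""
    top = max(rate for _, rate in relays)
    return [name for name, rate in relays if rate == top]
```

Restricting contention to `highest_rate_group` is what reduces relay collisions in dense networks while keeping the rate gain of cooperation.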

Keywords: cooperative communications, MAC protocol, relay node, WLAN

Procedia PDF Downloads 449
592 Crystalline Structure of Starch Based Nano Composites

Authors: Farid Amidi Fazli, Afshin Babazadeh, Farnaz Amidi Fazli

Abstract:

In contrast with the literal meaning of 'nano', researchers have been making mega advances in this area, and every day more nanomaterials are introduced to the market. After the long use of fossil-based plastics, the accumulation of their waste now poses a big problem for the environment. At the same time, people pay ever more attention to safety and the living environment. Replacing common plastic packaging materials with degradable ones that break down faster and convert into harmless components like water and carbon dioxide is therefore increasingly attractive; these new materials are based on the renewable and inexpensive sources starch and cellulose. However, their functional properties alone are not suitable for packaging. At this point, nanotechnology has an important role: utilizing nanomaterials in a polymer structure improves its mechanical and physical properties, and nanocrystalline cellulose (NCC) has this ability. This work employed a chemical method to produce NCC and a starch bio-nanocomposite containing NCC. The obtained materials were characterized by X-ray diffraction. The results showed that the applied method is both suitable and applicable for NCC production.

Keywords: biofilm, cellulose, nanocomposite, starch

Procedia PDF Downloads 388
591 The Use of Regional Blocks Versus IV Opioid Analgesics for Acute Traumatic Pain Management in the Emergency Department

Authors: Lajeesh Jabbar, Shibu T. Varghese

Abstract:

Pain is highly distressing: it prolongs the healing of any kind of trauma and contributes to the post-traumatic stress state. Alleviating the pain of acute traumatic conditions such as fractures and degloving injuries helps achieve faster recovery and also decreases the incidence of post-traumatic stress disorder. Most emergency departments in India use IV opioid analgesics to relieve pain in cases of acute traumatic injury; none of the emergency departments in the country practice regional blocks. In this study, we compare the efficacy of regional blocks with that of IV analgesics for relieving the pain of lower limb fractures in the emergency department. The study site is the Malabar Institute of Medical Sciences in Calicut, Kerala, India, which receives approximately 10-20 traumatic fracture cases per day. The fracture sites used for the study are femur fractures and phalangeal fractures.

Keywords: regional blocks, IV analgesia, acute traumatic pain, femur fractures, phalanx fractures

Procedia PDF Downloads 400
590 Physical Verification Flow on Multiple Foundries

Authors: Rohaya Abdul Wahab, Raja Mohd Fuad Tengku Aziz, Nazaliza Othman, Sharifah Saleh, Nabihah Razali, Muhammad Al Baqir Zinal Abidin, Md Hanif Md Nasir

Abstract:

This paper discusses how we optimize the physical verification flow in our IC Design Department, which works with rule decks from multiple foundries. Our ultimate goal is to achieve faster time to tape-out and avoid schedule delays. Physical verification runtimes and memory usage have increased drastically with the growing number of design rules, the design complexity, and the size of the chips to be verified. To manage design violations, we use a number of solutions to reduce the number of violations that physical verification engineers need to check. The most important functions in physical verification are DRC (design rule check), LVS (layout vs. schematic) and XRC (parasitic extraction). Since we tape out designs to multiple foundries, we need a flow that improves the overall turnaround time and the ease of use of the physical verification process. The demand for fast turnaround is all the more critical because physical design is the last stage before the layout is sent to the foundries.
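What a DRC spacing rule checks can be illustrated with a toy sketch: every pair of shapes on a layer must be at least a minimum distance apart. Real foundry runsets are vastly richer; rectangles below are hypothetical (x1, y1, x2, y2) tuples for illustration only.

```python
def spacing(r1, r2):
    """Euclidean gap between two axis-aligned rectangles (0 if they touch)."""
    dx = max(r1[0] - r2[2], r2[0] - r1[2], 0)
    dy = max(r1[1] - r2[3], r2[1] - r1[3], 0)
    return (dx * dx + dy * dy) ** 0.5

def drc_spacing(shapes, min_space):
    """Return index pairs of disjoint shapes closer than `min_space`."""
    return [(i, j)
            for i in range(len(shapes))
            for j in range(i + 1, len(shapes))
            if 0 < spacing(shapes[i], shapes[j]) < min_space]
```

Production DRC engines avoid this quadratic pair loop with spatial indexing, which is one reason runtimes balloon as chip sizes and rule counts grow.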

Keywords: physical verification, DRC, LVS, XRC, flow, foundry, runset

Procedia PDF Downloads 636
589 Disentangling Biological Noise in Cellular Images with a Focus on Explainability

Authors: Manik Sharma, Ganapathy Krishnamurthi

Abstract:

The cost of some drugs and medical treatments has risen so much in recent years that many patients are having to go without. One of the more surprising reasons behind the cost is how long it takes to bring new treatments to market: despite improvements in technology and science, research and development continue to lag. In fact, finding a new treatment takes, on average, more than 10 years and costs hundreds of millions of dollars. A classification model could make researchers more efficient: if successful, it could dramatically improve the industry's ability to model cellular images according to their relevant biology, in turn greatly decreasing the cost of treatments and ensuring these treatments get to patients faster. This work addresses part of this problem by creating a cellular image classification model that can decipher genetic perturbations in cells (occurring naturally or induced artificially). Another interesting question addressed is what makes the deep-learning model decide in a particular fashion, which can further help demystify the mechanism of action of certain perturbations and paves a way towards the explainability of the deep-learning model.

Keywords: cellular images, genetic perturbations, deep-learning, explainability

Procedia PDF Downloads 90
588 Low Cost Real Time Robust Identification of Impulsive Signals

Authors: R. Biondi, G. Dys, G. Ferone, T. Renard, M. Zysman

Abstract:

This paper describes an automated, implementable system for impulsive signal detection and recognition. The system uses a digital signal processing device for the detection and identification process, analysing signals in real time in order to produce a specific output if needed. Detection is achieved by normalizing the inputs and comparing the incoming signals to a dynamic threshold, thus avoiding detections caused by loud or fluctuating environmental noise. Identification is done through neural network algorithms. As a setup, our system can receive signals to 'learn' certain patterns; through learning, the system can recognize signals faster, gaining flexibility for new patterns similar to those it already knows. Sound is captured through a simple jack input, which could be replaced by an enhanced recording device such as a wide-area recorder. Furthermore, a communication module can be added to the apparatus to send alerts to another interface if needed.
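The dynamic-threshold detection stage can be sketched as flagging any sample that exceeds the running noise floor (mean plus a multiple of the standard deviation over a trailing window), so a loud but steady background raises the threshold instead of triggering detections. Window length and multiplier below are illustrative, not the paper's tuning.

```python
import math

def detect_impulses(signal, window=8, k=4.0):
    """Indices of samples exceeding mean + k*std of the trailing window."""
    hits = []
    for n in range(window, len(signal)):
        past = signal[n - window:n]
        mean = sum(past) / window
        var = sum((s - mean) ** 2 for s in past) / window
        # small epsilon keeps a perfectly flat background from firing
        if abs(signal[n] - mean) > k * math.sqrt(var) + 1e-9:
            hits.append(n)
    return hits
```

Because the threshold tracks the recent window, an isolated spike is flagged once and then absorbed into the statistics, rather than re-triggering on every subsequent sample.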

Keywords: sound detection, impulsive signal, background noise, neural network

Procedia PDF Downloads 298
587 Adsorption of Iodine from Aqueous Solution on Modified Silica Gel with Cyclodextrin Derivatives

Authors: Raied Badr Al-Fulaiti, E. I. El-Shafey

Abstract:

Cyclodextrin (CD) derivatives (αCD, βCD, ϒCD and hp-βCD) were successfully immobilized on a silica gel surface via epichlorohydrin as a cross-linker. The ratio of silica to CD was optimized in preliminary experiments based on the best iodine adsorption capacity. The adsorbents selected for this study, with their ratios of silica to CD derivative, are Si-αCD (3:2), Si-βCD (4:1), Si-ϒCD (4:1) and Si-hp-βCD (4:1). The adsorption of iodine (I2/KI) solution was investigated in terms of initial pH, contact time, iodine concentration and temperature. No significant variation was noticed in iodine adsorption at different pH values; thus, an initial pH of 6 was selected for further studies. Equilibrium adsorption was reached faster on Si-hp-βCD than on the other adsorbents, with the kinetic adsorption data fitting the pseudo-second-order model well. The activation energy (Ea) was found to be in the range of 12.7 - 23.4 kJ/mol. Equilibrium adsorption data were found to fit the Langmuir adsorption model well, with lower uptake as temperature rises. Iodine uptake follows the order: Si-hp-βCD (714 mg/g) > Si-αCD (625 mg/g) > Si-βCD (555.6 mg/g) > Si-ϒCD (435 mg/g). A thermodynamic study showed that iodine adsorption is exothermic and spontaneous. Adsorbent reuse exhibited excellent performance for iodine adsorption, with a decrease in iodine uptake of only ~2-4% in the third adsorption cycle.
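For reference, the kinetic and equilibrium models named in the abstract take their standard textbook forms; this is a general restatement with conventional symbols, not the paper's fitted constants.

```latex
% Pseudo-second-order kinetics (linearized), with q_t the uptake at time t,
% q_e the equilibrium uptake, and k_2 the rate constant:
\frac{t}{q_t} = \frac{1}{k_2 q_e^2} + \frac{t}{q_e}

% Langmuir isotherm, with C_e the equilibrium concentration, q_m the
% monolayer capacity, and K_L the Langmuir constant:
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
```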

Keywords: adsorption, iodine, silica, cyclodextrin, functionalization, epichlorohydrin

Procedia PDF Downloads 113
586 Literature Review: Microalgae as Functional Foods with Solvent Free Extraction

Authors: Angela Justina Kumalaputri

Abstract:

Indonesia, as a maritime country, has abundant marine living resources that have not yet been optimally utilized; so far, the focus has been on fisheries. On the other hand, Indonesia, as the country with the fourth longest coastline, is a very good cultivation place for microalgae. Microalgae can be diversified into many important products, such as food, fuel, pharmaceutical products, functional food, and cosmetics. This research focuses on a literature study of the types of microalgae that can serve as sources of functional foods (such as antioxidants), including their contents and the separation methods. Our research methods are: (1) a literature study of various microalgae; (2) a literature study of extraction using supercritical CO₂, which is free from toxic organic solvents, environmentally friendly, and safe for food products. Supercritical fluid extraction using CO₂ (low critical point: temperature of 31.1 °C and pressure of 72.9 bar) can be done at low temperatures suitable for temperature-labile compounds, with low energy use and faster extraction times compared with conventional extraction methods.

Keywords: antioxidants, supercritical fluid extraction, solvent-free extraction, microalgae

Procedia PDF Downloads 55
585 Topic Modelling Using Latent Dirichlet Allocation and Latent Semantic Indexing on SA Telco Twitter Data

Authors: Phumelele Kubheka, Pius Owolawi, Gbolahan Aiyetoro

Abstract:

Twitter is one of the most popular social media platforms where users can share their opinions on different subjects. As of 2010, the Twitter platform generates more than 12 terabytes of data daily, ~4.3 petabytes in a single year. For this reason, Twitter is a great source for big data mining. Many industries, such as telecommunication companies, can leverage the availability of Twitter data to better understand their markets and make appropriate business decisions. This study performs topic modeling on Twitter data using Latent Dirichlet Allocation (LDA). The obtained results are benchmarked against another topic modeling technique, Latent Semantic Indexing (LSI). The study aims to retrieve topics from a Twitter dataset containing user tweets on South African Telcos. Results from this study show that LSI is much faster than LDA. However, LDA yields better results, with topic coherence higher by 8% for the best-performing model represented in Table 1. A higher topic coherence score indicates better performance of the model.

Keywords: big data, latent Dirichlet allocation, latent semantic indexing, telco, topic modeling, twitter

Procedia PDF Downloads 135
584 R Software for Parameter Estimation of Spatio-Temporal Model

Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

In this paper, we propose an application package to estimate the parameters of a spatiotemporal model based on multivariate time series analysis using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary by location. We use the method of Ordinary Least Squares (OLS) and the Mean Absolute Percentage Error (MAPE) to fit the model to real spatiotemporal phenomena. As case studies, we use oil production data from the volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. R is very user-friendly; it makes calculation easier and processes the data accurately and faster. A limitation is that the R scripts built for estimating the parameters of the spatiotemporal GSTAR model are still restricted to stationary time series models. Therefore, the R program under Windows can be developed further, both for theoretical studies and for applications.
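The fit criterion named in the abstract, mean absolute percentage error, can be sketched in a few lines. This is a generic illustration in Python rather than the authors' R package, and the data values below are invented.

```python
# Generic MAPE sketch: percentage deviation of model predictions from
# observations, averaged over all points (observed values assumed nonzero).

def mape(observed, predicted):
    """Mean absolute percentage error, in percent."""
    n = len(observed)
    return 100.0 / n * sum(abs((o - p) / o) for o, p in zip(observed, predicted))

obs = [100.0, 200.0, 400.0]    # invented "observed" series
pred = [110.0, 190.0, 400.0]   # invented model predictions
print(round(mape(obs, pred), 2))  # → 5.0
```

A lower MAPE indicates a closer fit of the GSTAR predictions to the observed series at each location.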

Keywords: GSTAR Model, MAPE, OLS method, oil production, R software

Procedia PDF Downloads 221
583 Application of Association Rule Using Apriori Algorithm for Analysis of Industrial Accidents in 2013-2014 in Indonesia

Authors: Triano Nurhikmat

Abstract:

Along with the progress of science and technology, the development of the industrialized world in Indonesia has taken place very rapidly. This has accelerated the industrialization of Indonesian society, with diverse companies and workplaces being established. The development of industry is tied to the activity of workers, and these work activities carry the possibility of an accident befalling either the workers or a construction project. Causes of industrial accidents include electrical damage, faulty work procedures, and technical errors. The association rule method is one of the main techniques in data mining and the most common form used in finding patterns in data collections. This research investigates the associations between the incidences of industrial accidents. Using association rule analysis, two iterations yielded 2-itemsets (large itemsets) pairing accident factors with location: industrial accidents in West Jakarta caused by electrical damage occur with support = 0.2 and confidence = 1, and the reverse pattern has support = 0.2 and confidence = 0.75.
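The support and confidence measures reported in the abstract can be computed as follows. This is a generic sketch with invented toy records, not the study's accident dataset, and the attribute names are made up for illustration.

```python
# Generic support/confidence computation underlying association rules.
# Each "transaction" (here, an accident record) is a set of attributes.

def support(transactions, itemset):
    """Fraction of transactions containing every item of itemset."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Estimated P(consequent | antecedent) over the transactions."""
    return support(transactions, antecedent | consequent) / support(transactions, antecedent)

# Invented toy records, one set of attributes per accident report.
records = [
    {"west_jakarta", "electrical_damage"},
    {"west_jakarta", "work_procedure"},
    {"east_jakarta", "electrical_damage"},
    {"west_jakarta", "technical_error"},
    {"east_jakarta", "work_procedure"},
]
rule = ({"electrical_damage"}, {"west_jakarta"})
print(support(records, rule[0] | rule[1]), confidence(records, *rule))  # → 0.2 0.5
```

The Apriori algorithm itself prunes the search by only extending itemsets whose subsets already meet the minimum support; the two functions above are the measures it evaluates at each step.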

Keywords: association rule, data mining, industrial accidents, rules

Procedia PDF Downloads 267
582 Preliminary Study on the Removal of Solid Uranium Compound in Nuclear Fuel Production System

Authors: Bai Zhiwei, Zhang Shuxia

Abstract:

Because of sealing constraints, a trace of air penetrates the nuclear fuel production system during its service. Water vapor in this air can react with material in the system and generate solid uranium compounds. These solid uranium compounds continue to accumulate and attach to the production equipment and pipelines of the system, which not only affects the operational reliability of the production equipment but also gives off a radiation hazard after the system is retired. Therefore, it is necessary to select a reasonable method to remove them. Through analysis of the physicochemical properties of the solid uranium compounds, halogen fluoride compounds were selected as the cleaning agent, as they can remove solid uranium compounds effectively. This paper studied the related chemical reactions under static test conditions, and the results show that the selected halogen fluoride compounds can remove the solid uranium compounds completely. A study of the influence of reaction pressure on the reaction rate revealed that the higher the pressure, the faster the reaction rate.

Keywords: fluoride halogen compound, remove, radiation, solid uranium compound

Procedia PDF Downloads 283
581 Golden Brain Theory (GBT) for Language Learning

Authors: Tapas Karmaker

Abstract:

Centuries ago, we came to know about the ‘Golden Ratio’, also known as the Golden Angle. The idea of this research is based on this theme. The researcher perceives ‘The Golden Ratio’ in terms of harmony, meaning that every single item in the universe follows a harmonic behavior. In the case of human beings, the brain responds easily and quickly to this harmony, which helps memorization. In this theory, harmony means a link. This study was carried out on a segment of school students and a segment of common people for a period of three years, from 2003 to 2006. The research intended to determine the impact of harmony on the brains of these people. It was found that students and common people can increase their memorization capacity by as much as 70 times by applying this method. The method works faster and better between the ages of 8 and 30 years. This result was achieved through tests assessing memorizing capacity using tools such as words, rhymes, texts, math and drawings. The research concludes that this harmonic method can be applied to improve the capacity for learning languages, to improve quality of life in other areas, and in professional activity.

Keywords: language, education, golden brain, learning, teaching

Procedia PDF Downloads 181
580 Design and Implementation of a Hardened Cryptographic Coprocessor with 128-bit RISC-V Core

Authors: Yashas Bedre Raghavendra, Pim Vullers

Abstract:

This study presents the design and implementation of an abstract cryptographic coprocessor, leveraging AMBA (Advanced Microcontroller Bus Architecture) protocols - APB (Advanced Peripheral Bus) and AHB (Advanced High-performance Bus) - to enable seamless integration with the main CPU (central processing unit) and enhance the coprocessor’s algorithm flexibility. The primary objective is to create a versatile coprocessor that can execute various cryptographic algorithms, including ECC (Elliptic-curve cryptography), RSA (Rivest–Shamir–Adleman), and AES (Advanced Encryption Standard), while providing a robust and secure solution for modern secure embedded systems. To achieve this goal, the coprocessor is equipped with a tightly coupled memory (TCM) for rapid data access during cryptographic operations. The TCM is placed within the coprocessor, ensuring quick retrieval of critical data and optimizing overall performance. Additionally, the program memory is positioned outside the coprocessor, allowing for easy updates and reconfiguration, which enhances adaptability to future algorithm implementations. Direct links are employed instead of DMA (direct memory access) for data transfer, ensuring faster communication and reducing complexity. The AMBA-based communication architecture facilitates seamless interaction between the coprocessor and the main CPU, streamlining data flow and ensuring efficient utilization of system resources. The abstract nature of the coprocessor allows for easy integration of new cryptographic algorithms in the future. As the security landscape continues to evolve, the coprocessor can adapt and incorporate emerging algorithms, making it a future-proof solution for cryptographic processing. Furthermore, this study explores the addition of custom instructions into RISC-V ISE (Instruction Set Extension) to enhance cryptographic operations.
By incorporating custom instructions specifically tailored for cryptographic algorithms, the coprocessor achieves higher efficiency and reduced cycles per instruction (CPI) compared to traditional instruction sets. The adoption of RISC-V 128-bit architecture significantly reduces the total number of instructions required for complex cryptographic tasks, leading to faster execution times and improved overall performance. Comparisons are made with 32-bit and 64-bit architectures, highlighting the advantages of the 128-bit architecture in terms of reduced instruction count and CPI. In conclusion, the abstract cryptographic coprocessor presented in this study offers significant advantages in terms of algorithm flexibility, security, and integration with the main CPU. By leveraging AMBA protocols and employing direct links for data transfer, the coprocessor achieves high-performance cryptographic operations without compromising system efficiency. With its TCM and external program memory, the coprocessor is capable of securely executing a wide range of cryptographic algorithms. This versatility and adaptability, coupled with the benefits of custom instructions and the 128-bit architecture, make it an invaluable asset for secure embedded systems, meeting the demands of modern cryptographic applications.

Keywords: abstract cryptographic coprocessor, AMBA protocols, ECC, RSA, AES, tightly coupled memory, secure embedded systems, RISC-V ISE, custom instructions, instruction count, cycles per instruction

Procedia PDF Downloads 53