Search results for: analog signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5056

2086 Design and Implementation of a Hardened Cryptographic Coprocessor with 128-bit RISC-V Core

Authors: Yashas Bedre Raghavendra, Pim Vullers

Abstract:

This study presents the design and implementation of an abstract cryptographic coprocessor, leveraging AMBA (Advanced Microcontroller Bus Architecture) protocols - APB (Advanced Peripheral Bus) and AHB (Advanced High-performance Bus) - to enable seamless integration with the main CPU (Central Processing Unit) and enhance the coprocessor’s algorithm flexibility. The primary objective is to create a versatile coprocessor that can execute various cryptographic algorithms, including ECC (Elliptic-Curve Cryptography), RSA (Rivest–Shamir–Adleman), and AES (Advanced Encryption Standard), while providing a robust and secure solution for modern secure embedded systems. To achieve this goal, the coprocessor is equipped with a tightly coupled memory (TCM) for rapid data access during cryptographic operations. The TCM is placed within the coprocessor, ensuring quick retrieval of critical data and optimizing overall performance. Additionally, the program memory is positioned outside the coprocessor, allowing for easy updates and reconfiguration, which enhances adaptability to future algorithm implementations. Direct links are employed instead of DMA (Direct Memory Access) for data transfer, ensuring faster communication and reducing complexity. The AMBA-based communication architecture facilitates seamless interaction between the coprocessor and the main CPU, streamlining data flow and ensuring efficient utilization of system resources. The abstract nature of the coprocessor allows for easy integration of new cryptographic algorithms in the future. As the security landscape continues to evolve, the coprocessor can adapt and incorporate emerging algorithms, making it a future-proof solution for cryptographic processing. Furthermore, this study explores the addition of custom instructions to the RISC-V ISE (Instruction Set Extension) to enhance cryptographic operations. By incorporating custom instructions specifically tailored for cryptographic algorithms, the coprocessor achieves higher efficiency and reduced cycles per instruction (CPI) compared to traditional instruction sets. The adoption of the RISC-V 128-bit architecture significantly reduces the total number of instructions required for complex cryptographic tasks, leading to faster execution times and improved overall performance. Comparisons are made with 32-bit and 64-bit architectures, highlighting the advantages of the 128-bit architecture in terms of reduced instruction count and CPI. In conclusion, the abstract cryptographic coprocessor presented in this study offers significant advantages in terms of algorithm flexibility, security, and integration with the main CPU. By leveraging AMBA protocols and employing direct links for data transfer, the coprocessor achieves high-performance cryptographic operations without compromising system efficiency. With its TCM and external program memory, the coprocessor is capable of securely executing a wide range of cryptographic algorithms. This versatility and adaptability, coupled with the benefits of custom instructions and the 128-bit architecture, make it an invaluable asset for secure embedded systems, meeting the demands of modern cryptographic applications.
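
As a rough illustration of the instruction-count argument (a back-of-envelope sketch, not figures from the study), the snippet below counts how many register-wide operations are needed to sweep operands of common cryptographic sizes on 32-, 64-, and 128-bit datapaths; the operand sizes listed are hypothetical examples.

```python
# Illustrative back-of-envelope comparison (not data from the paper): how many
# register-wide operations are needed to touch a fixed-size operand on 32-,
# 64- and 128-bit datapaths. Operand sizes are hypothetical examples.
from math import ceil

OPERANDS = {
    "AES state (128 bit)": 128,
    "P-256 field element (256 bit)": 256,
    "SHA-256 message block (512 bit)": 512,
    "RSA-2048 operand pass (2048 bit)": 2048,
}

for name, bits in OPERANDS.items():
    ops = {w: ceil(bits / w) for w in (32, 64, 128)}
    print(f"{name:32s} 32-bit: {ops[32]:3d}  64-bit: {ops[64]:3d}  128-bit: {ops[128]:3d}")
```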

Keywords: abstract cryptographic coprocessor, AMBA protocols, ECC, RSA, AES, tightly coupled memory, secure embedded systems, RISC-V ISE, custom instructions, instruction count, cycles per instruction

Procedia PDF Downloads 55
2085 Integration GIS–SCADA Power Systems to Enclosure Air Dispersion Model

Authors: Ibrahim Shaker, Amr El Hossany, Moustafa Osman, Mohamed El Raey

Abstract:

This paper explores an integration model between a GIS–SCADA system and an enclosure quantification model to assess the impact of a failure-safe event. There are real demands to identify spatial objects and improve control system performance. The employed methodology predicts electro-mechanical operations and the corresponding time to environmental incident variations. Open processing, as an object systems technology, is presented for integrating the enclosure database with minimal memory size and computation time via connectivity drivers such as ODBC and JDBC during the main stages of the GIS–SCADA connection. The function of the Geographic Information System is to manage power distribution in the face of developing issues. In other words, GIS–SCADA systems integration requires numerical process objects to enable system model calibration and estimation, analysis of past events, and prediction of emergency situations for response training.
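
A minimal sketch of the kind of connectivity-driver integration described (not the authors' implementation) is shown below: SCADA telemetry is pulled over ODBC and joined to GIS feature identifiers. The DSN, table, and column names are hypothetical placeholders.

```python
# A minimal sketch (not the authors' implementation) of pulling SCADA telemetry
# over an ODBC connectivity driver and joining it to GIS feature identifiers.
# The DSN, table and column names below are hypothetical placeholders.
import pyodbc

def fetch_breaker_events(dsn="DSN=scada_hist;UID=reader;PWD=secret"):
    with pyodbc.connect(dsn) as conn:
        cur = conn.cursor()
        cur.execute(
            "SELECT tag_id, event_time, status FROM breaker_events "
            "WHERE event_time > DATEADD(hour, -1, GETDATE())"
        )
        return cur.fetchall()

def join_to_gis(events, gis_lookup):
    # gis_lookup maps a SCADA tag_id -> GIS feature id (e.g. a substation polygon)
    return [(gis_lookup.get(tag), t, s) for tag, t, s in events if tag in gis_lookup]
```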

Keywords: air dispersion model, environmental management, SCADA systems, GIS system, integration power system

Procedia PDF Downloads 350
2084 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms

Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang

Abstract:

Bioassay is the measurement of the potency of a chemical substance by its effect on living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from and housed in multiple databases. Bioassay predictions are then calculated to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is partitioned into training, testing, and validation sets. The second step is discretization, which partitions the data in consideration of accuracy vs. precision. The third step is normalization, where data are scaled between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for the prediction of effectiveness by various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
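
For illustration, the four preprocessing steps can be strung together as in the following sketch, here using scikit-learn rather than the tools named in the abstract; the data shapes, bin counts, and feature counts are hypothetical choices.

```python
# A compact sketch of the four preprocessing steps described above, using
# scikit-learn for illustration (the paper itself used Pipeline Pilot, R,
# Weka and Excel). Data shapes and parameter choices are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import KBinsDiscretizer, MinMaxScaler
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))            # chemical descriptors (hypothetical)
y = rng.integers(0, 2, size=500)          # bioassay outcome: active / inactive

# Step 1: instance selection -> training / validation / testing splits
X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Step 2: discretization (accuracy vs. precision trade-off via number of bins)
disc = KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="quantile")
X_tr_d = disc.fit_transform(X_tr)

# Step 3: normalization to [0, 1]
scaler = MinMaxScaler()
X_tr_n = scaler.fit_transform(X_tr_d)

# Step 4: feature selection of the key chemical attributes
selector = SelectKBest(f_classif, k=10)
X_tr_f = selector.fit_transform(X_tr_n, y_tr)
print(X_tr_f.shape)
```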

Keywords: bioassay, machine learning, preprocessing, virtual screen

Procedia PDF Downloads 260
2083 Precious and Rare Metals in Overburden Carbonaceous Rocks: Methods of Extraction

Authors: Tatyana Alexandrova, Alexandr Alexandrov, Nadezhda Nikolaeva

Abstract:

The development of complex mineral resources is an urgent priority aimed at realizing processes for their ecologically safe development; one of its components is revealing the influence of the forms of element compounds in the raw materials and in the processing products. In view of the depletion of precious metal reserves at traditional deposits in the XXI century, large open-cast deposits localized in black shale strata are beginning to play the leading role. Carbonaceous (black) shales carry a heightened metallogenic potential. Black shales with a high carbon content are widely distributed within the Bureinsky massif. According to academician Hanchuk's data, black shales of the Sutirskaya series generally contain PGEs in native form. The presence of gold and PGE compounds with high affinity towards carbonaceous matter in the crude ore results in a decrease in the extraction of valuable components because of their sorption onto the disseminated carbonaceous matter.

Keywords: carbonaceous rocks, bitumens, precious metals, concentration, extraction

Procedia PDF Downloads 231
2082 A Resource Optimization Strategy for CPU (Central Processing Unit) Intensive Applications

Authors: Junjie Peng, Jinbao Chen, Shuai Kong, Danxu Liu

Abstract:

Under traditional resource allocation strategies, the usage of resources on physical servers in a cloud data center is highly uncertain: too few assigned tasks waste resources, while too many cause overload. This is especially obvious when the applications are of the same type because of their similar resource preferences. Considering that CPU-intensive applications are one of the most common types of application in the cloud, we studied the optimization strategy for CPU-intensive applications on the same server. We used resource preferences to analyze the case in which multiple CPU-intensive applications run simultaneously and put forward a model that can predict their execution time. Based on the prediction model, we proposed a method to select the appropriate number of applications for a machine. Experiments show that the model can predict the execution time of CPU-intensive applications accurately. To improve the execution efficiency of applications, we further propose a priority-based scheduling model for CPU-intensive applications. Extensive experiments verify the validity of the scheduling model.
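
The co-location idea can be sketched as below with a deliberately simple interference model; the linear slowdown factor and core count are assumptions for illustration, not the prediction model identified in the paper.

```python
# A toy sketch of the idea: predict the execution time of CPU-intensive jobs
# that run together and pick the largest co-location count that still meets a
# deadline. The linear-interference model and its coefficients are assumptions
# for illustration, not the model identified in the paper.
def predicted_runtime(solo_seconds, n_apps, cores=8, interference=0.06):
    """Runtime when n_apps identical CPU-bound jobs share one server."""
    contention = max(1.0, n_apps / cores)          # time-slicing once cores are saturated
    return solo_seconds * contention * (1.0 + interference * (n_apps - 1))

def max_apps_within(solo_seconds, deadline, cores=8):
    n = 1
    while predicted_runtime(solo_seconds, n + 1, cores) <= deadline:
        n += 1
    return n

print(predicted_runtime(120, 10))      # predicted seconds for 10 co-located jobs
print(max_apps_within(120, 200))       # how many jobs still finish within 200 s
```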

Keywords: cloud computing, CPU intensive applications, resource optimization, strategy

Procedia PDF Downloads 265
2081 Simulation of the Collimator Plug Design for Prompt-Gamma Activation Analysis in the IEA-R1 Nuclear Reactor

Authors: Carlos G. Santos, Frederico A. Genezini, A. P. Dos Santos, H. Yorivaz, P. T. D. Siqueira

Abstract:

Prompt-Gamma Activation Analysis (PGAA) is a valuable technique for investigating the elemental composition of various samples. However, the installation of a PGAA system entails specific conditions, such as filtering the neutron beam according to the target and providing adequate shielding for both users and detectors. These requirements incur substantial costs, exceeding $100,000, including manpower. Nevertheless, a cost-effective approach involves leveraging an existing neutron beam facility to create a hybrid system integrating PGAA and Neutron Tomography (NT). The IEA-R1 nuclear reactor at IPEN/USP possesses an NT facility with suitable conditions for adapting and implementing a PGAA device. The NT facility offers a slightly colder thermal neutron flux and provides shielding for user protection. The key additional requirement involves designing detector shielding to mitigate the high gamma-ray background and safeguard the HPGe detector from neutron-induced damage. This study employs Monte Carlo simulations with the MCNP6 code to optimize the collimator plug for PGAA within the IEA-R1 NT facility. Three collimator models are proposed and simulated to assess their effectiveness in shielding the gamma and neutron radiation arising from fission. The aim is to achieve a focused prompt-gamma signal while shielding ambient gamma radiation. The simulation results indicate that one of the proposed designs is particularly suitable for the PGAA-NT hybrid system.
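
In grossly simplified form, the quantity such shielding simulations evaluate can be illustrated with a one-dimensional Monte Carlo transmission estimate; the macroscopic cross-section below is a hypothetical value, not PGAA shielding data.

```python
# A toy 1-D Monte Carlo transmission estimate, illustrating (in grossly
# simplified form) what the MCNP6 collimator-plug simulations evaluate: the
# fraction of particles that cross a shield of given thickness. The
# macroscopic cross-section value is hypothetical, not PGAA data.
import random

def transmitted_fraction(thickness_cm, sigma_total_per_cm, n=200_000, seed=1):
    random.seed(seed)
    passed = 0
    for _ in range(n):
        # distance to first interaction, sampled from an exponential distribution
        free_path = random.expovariate(sigma_total_per_cm)
        if free_path > thickness_cm:     # particle crosses without interacting
            passed += 1
    return passed / n

for t in (2.0, 5.0, 10.0):
    print(f"{t:4.1f} cm shield -> transmitted fraction ~ {transmitted_fraction(t, 0.35):.4f}")
```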

Keywords: MCNP6.1, neutron, prompt-gamma ray, prompt-gamma activation analysis

Procedia PDF Downloads 49
2080 Simulation of the Reactive Rotational Molding Using Smoothed Particle Hydrodynamics

Authors: A. Hamidi, S. Khelladi, L. Illoul, A. Tcharkhtchi

Abstract:

Reactive rotational molding (RRM) is a process for manufacturing hollow plastic parts from a reactive material. It has several advantages compared to conventional rotomolding of thermoplastic powders: the process cycle time is shorter, the raw material is less expensive because polymerization occurs during processing, and high-performance polymers such as thermosets, thermoplastics, or blends may be used. However, several phenomena occur during this process which make its optimization quite complex. In this study, we have used a mixture of isocyanate and polyol as a reactive system. The chemical transformation of this system to polyurethane has been studied by thermal analysis and rheology tests. From these curing and rheological measurements, the kinetics and rheokinetics of the polyurethane were identified. Smoothed Particle Hydrodynamics, a Lagrangian meshless method, was chosen to simulate the reactive fluid flow of the polyurethane in 2D and 3D configurations of the process, taking into account the chemical and chemorheological results obtained experimentally in this study.
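
Cure kinetics of this kind are often described with an autocatalytic (Kamal-type) model, da/dt = (k1 + k2*a^m)(1 - a)^n. The sketch below integrates such a model with placeholder rate constants and exponents, which are assumptions for illustration rather than the values identified in this study.

```python
# A minimal sketch of integrating an autocatalytic (Kamal-type) cure model,
# da/dt = (k1 + k2 * a**m) * (1 - a)**n, of the kind often used to describe
# polyurethane curing. The rate constants and exponents are hypothetical
# placeholders, not the values identified in this study.
def cure_profile(k1=1e-3, k2=5e-3, m=0.8, n=1.5, dt=1.0, t_end=3600.0):
    alpha, t, history = 0.0, 0.0, []
    while t <= t_end:
        history.append((t, alpha))
        dadt = (k1 + k2 * alpha**m) * (1.0 - alpha)**n
        alpha = min(1.0, alpha + dadt * dt)      # explicit Euler step
        t += dt
    return history

profile = cure_profile()
for t, a in profile[::600]:                      # print every 10 minutes
    print(f"t = {t:6.0f} s   conversion = {a:.3f}")
```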

Keywords: reactive rotational molding, simulation, smoothed particle hydrodynamics, surface tension, rheology, free surface flows, viscoelastic, interpolation

Procedia PDF Downloads 274
2079 Architectural Engineering and Executive Design: Modelling Procedures, Scientific Tools, Simulation Processing

Authors: Massimiliano Nastri

Abstract:

The study is part of the scientific literature on executive design in engineering and architecture, understood as an interdisciplinary field aimed at anticipating and simulating, planning and managing, and guiding and instructing construction operations on site. On this basis, the study provides an analysis of a theoretical, methodological, and guiding character aimed at constituting the disciplinary sphere of executive design, which often lacks supporting methodological and procedural guidelines in engineering and architecture. The basic methodologies of the study investigate the theories and references that can contribute to framing executive design as the practice of modelling, visualization, and simulation of the construction phases, through the projection of the pragmatic issues of the building. It does so by proposing a series of references, interrelations, and openings intended to support, for intellectual, procedural, and applicative purposes, the executive definition of the project, aimed at activating practices of cognitive acquisition and realization within reality.

Keywords: modelling and simulation technology, executive design, discretization of the construction, engineering design for building

Procedia PDF Downloads 63
2078 Natural Dyeing of Textile Cotton Fabric and Its Characterization

Authors: Rabia Almas

Abstract:

Today's world demands natural and biological colorants on a priority basis as an alternative to toxic and unsustainable synthetic dyes. Sustainable natural colors from plants and/or living organisms such as bacteria and fungi have recently attracted research scholars and textile industries worldwide due to the opportunities they offer. So, in the present study, natural colors from food waste, such as orange peels and peanuts, were extracted and applied to cotton fabric. The dyeing recipes were optimized in terms of dye concentration, processing temperature, and time for higher color strength. The dyes and fabric were characterized by Fourier transform infrared spectroscopy, scanning electron microscopy, and fastness testing to identify the chemical groups involved and to better understand the dyeing behavior. The results revealed that proper mordanting and dye concentration on cotton fabric could give high color strength and good fastness to washing and light, and that these natural dyes can be used as an alternative to toxic synthetic colorants.
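
Color strength is conventionally quantified from fabric reflectance via the Kubelka-Munk relation K/S = (1 - R)^2 / (2R). The short sketch below compares hypothetical dyeing recipes on that basis; the reflectance readings are illustrative, not data from this study.

```python
# Color strength via the Kubelka-Munk relation, K/S = (1 - R)^2 / (2R).
# The reflectance readings below are hypothetical, not measurements from
# this study; they simply show how recipes are ranked by K/S.
def k_over_s(reflectance):
    """Kubelka-Munk color strength; reflectance given as a fraction (0-1]."""
    if not 0.0 < reflectance <= 1.0:
        raise ValueError("reflectance must be in (0, 1]")
    return (1.0 - reflectance) ** 2 / (2.0 * reflectance)

recipes = {"2% owf, 60 C, 45 min": 0.18, "4% owf, 80 C, 60 min": 0.09}   # hypothetical
for recipe, r in recipes.items():
    print(f"{recipe:24s} K/S = {k_over_s(r):.2f}")
```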

Keywords: textile, textile dyes, natural dyes, bio colors

Procedia PDF Downloads 70
2077 Artificial Intelligent Methodology for Liquid Propellant Engine Design Optimization

Authors: Hassan Naseh, Javad Roozgard

Abstract:

This paper presents a methodology based on Artificial Intelligence (AI) applied to Liquid Propellant Engine (LPE) optimization. The AI methodology is based on the Adaptive Neuro-Fuzzy Inference System (ANFIS). In this methodology, the objective is to achieve maximum performance (specific impulse). The independent design variables in the ANFIS modeling are the combustion chamber pressure and temperature and the oxidizer-to-fuel ratio, and the output of the modeling is the specific impulse, which can be combined with other objective functions in LPE design optimization. To this end, the LPE's parameters have been modeled with the ANFIS methodology by generating the fuzzy inference system structure using grid partitioning, subtractive clustering, and Fuzzy C-Means (FCM) clustering for both inference types (Mamdani and Sugeno) and various types of membership functions. The final comparison of the optimization results shows the accuracy and processing run time of the Gaussian ANFIS methodology relative to all the other methods.
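
To make the modeling step concrete, the following is a stripped-down zero-order Sugeno (TSK) forward pass with Gaussian memberships over the three design variables; the membership centres, widths, and rule consequents are hypothetical placeholders that ANFIS would instead learn from data.

```python
# A stripped-down zero-order Sugeno (TSK) forward pass with Gaussian
# memberships over the three design variables (chamber pressure [Pa],
# temperature [K], O/F ratio), predicting specific impulse. All centres,
# widths and consequents are made-up placeholders, not fitted values.
import math

def gauss(x, c, s):
    return math.exp(-0.5 * ((x - c) / s) ** 2)

# Each rule: (centres, widths, consequent Isp in seconds) -- all hypothetical.
RULES = [
    ((7.0e6, 3400.0, 2.3), (2.0e6, 300.0, 0.4), 305.0),
    ((9.0e6, 3600.0, 2.7), (2.0e6, 300.0, 0.4), 318.0),
    ((1.1e7, 3700.0, 3.1), (2.0e6, 300.0, 0.4), 327.0),
]

def predict_isp(p_chamber, t_chamber, of_ratio):
    x = (p_chamber, t_chamber, of_ratio)
    # firing strength of each rule = min of its three membership degrees
    weights = [min(gauss(xi, ci, si) for xi, ci, si in zip(x, c, s)) for c, s, _ in RULES]
    total = sum(weights) or 1e-12
    return sum(w * isp for w, (_, _, isp) in zip(weights, RULES)) / total

print(f"Predicted Isp ~ {predict_isp(9.5e6, 3600.0, 2.8):.1f} s")
```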

Keywords: ANFIS methodology, artificial intelligent, liquid propellant engine, optimization

Procedia PDF Downloads 562
2076 Characterization of Kopff Crater Using Remote Sensing Data

Authors: Shreekumari Patel, Prabhjot Kaur, Paras Solanki

Abstract:

Moon Mineralogy Mapper (M3), Miniature Radio Frequency (Mini-RF), Kaguya Terrain Camera (TC) images, the Lunar Orbiter Laser Altimeter (LOLA) digital elevation model (DEM), and Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images were used to study the mineralogy, surface physical properties, and age of the 42 km diameter Kopff crater. M3 indicates the low-albedo crater floor to be dominated by high-Ca pyroxene associated with floor fractures, suggesting igneous activity of the gabbroic material. A signature of anorthositic material is sampled on the eastern edge, where target material is excavated from a ~3 km diameter impact crater, providing access to the crustal composition. Several occurrences of spinel were detected in the northwestern rugged terrain. Our observation can be explained by the exposure of spinel by this crater, which impacted onto the inner rings of the Orientale basin; the spinel was part of the pre-impact target, an intrinsic unit of the basin ring. The crater floor was dated by crater counts performed on Kaguya TC images. The nature of the surface was studied in detail with LROC NAC and Mini-RF. Freshly exposed surfaces and boulders or debris seen in LROC NAC images have an enhanced radar signal in comparison to the mature terrain of Kopff crater. This multidisciplinary analysis of remote sensing data helps to assess the lunar surface in detail.

Keywords: crater, mineralogy, moon, radar observations

Procedia PDF Downloads 147
2075 Analyzing the Significance of Religion in Economic Development in East and Southeast Asia: Case Study of the City of Wenzhou in China

Authors: Wenting Pan, Fang Chen

Abstract:

The aim is to increase understanding of the potential effects of religion on economic development in East and Southeast Asia. Religion in East and Southeast Asia is intensively connected with the community, especially through activities by women, and it can facilitate spiritual awakening in the community as well as economic empowerment. The theories were assessed using survey information for Wenzhou, the legendary city of Chinese economic development, measuring attendance at formal religious services, religious beliefs, and self-identification as religious. Wenzhou's chambers of commerce are spread all over the world, and apart from large and small processing factories, Wenzhou is dotted with temples and Taoist temples. In the survey, four of the control variables (size of temples, profitability, multiple densities, type of industry, and so on) proved significant in finding a relationship between local people and the local religious culture. Moreover, the role of women should be taken seriously into account. This study has socio-economic implications for Wenzhou as well as a number of other countries in East and Southeast Asia.

Keywords: East and Southeast Asia, economic development, religion, Wenzhou

Procedia PDF Downloads 303
2074 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation

Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez

Abstract:

With the growing demand for a new blend of applications, users' dependency on the internet is increasing day by day. Mobile internet users are paying more attention to their own experience, especially in terms of communication reliability, high data rates, and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches; Carrier Aggregation (CA) is one of the newest innovations, which seems to fulfill the demands of the future spectrum, and CA is also one of the most important features of Long Term Evolution - Advanced (LTE-Advanced). To meet the upcoming International Mobile Telecommunication Advanced (IMT-Advanced) mobile requirements (1 Gb/s peak data rate), the CA scheme was presented by 3GPP to sustain high data rates using widespread frequency bandwidth of up to 100 MHz. Technical issues such as the aggregation structure, its implementations, deployment scenarios, control signal techniques, and challenges for the CA technique in LTE-Advanced, with consideration of backward compatibility, are highlighted in this paper. A performance evaluation in macro-cellular scenarios through a simulation approach is also presented, which shows the benefits of applying CA and low-complexity multi-band schedulers to service quality and system capacity enhancement, and concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for cell radii longer than 1800 m (with a PLR threshold of 2%).
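
As a toy illustration of low-complexity multi-band scheduling with carrier aggregation (not the schedulers evaluated in the paper), the sketch below places each user on the component carrier where it reports the best channel quality, subject to per-carrier resource limits; the CQI values and capacities are hypothetical.

```python
# A toy multi-band scheduler for carrier aggregation: each user is placed on
# the component carrier (CC) where it reports the best channel quality,
# subject to per-CC resource-block limits. CQI values and capacities are
# hypothetical, not simulation data from the paper.
def schedule(users_cqi, rb_per_cc):
    """users_cqi: {user: {cc: cqi}}, rb_per_cc: {cc: free resource blocks}."""
    allocation, free = {}, dict(rb_per_cc)
    # serve users in order of their best achievable CQI (greedy)
    for user in sorted(users_cqi, key=lambda u: -max(users_cqi[u].values())):
        for cc, _ in sorted(users_cqi[user].items(), key=lambda kv: -kv[1]):
            if free[cc] > 0:
                allocation[user] = cc
                free[cc] -= 1
                break
    return allocation

users = {"ue1": {"800MHz": 9, "2.6GHz": 13},
         "ue2": {"800MHz": 11, "2.6GHz": 7},
         "ue3": {"800MHz": 10, "2.6GHz": 12}}
print(schedule(users, {"800MHz": 1, "2.6GHz": 2}))
```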

Keywords: component carrier, carrier aggregation, LTE-advanced, scheduling

Procedia PDF Downloads 176
2073 Analysis of Sediment Distribution around Karang Sela Coral Reef Using Multibeam Backscatter

Authors: Razak Zakariya, Fazliana Mustajap, Lenny Sharinee Sakai

Abstract:

A sediment map is quite important in the marine environment, as the sediment itself contains a wealth of information that can be used for other research. This study was conducted using a Reson T20 multibeam echo sounder on 15 August 2020 at Karang Sela (a coral reef area) at Pulau Bidong. The study aims to identify the sediment types around the coral reef by using bathymetry and backscatter data. Sediment in the study area was collected as ground-truthing data to verify the classification of the seabed, and a dry sieving method with a sieve shaker was used to analyze the sediment samples. PDS 2000 software was used for data acquisition, Qimera QPS version 2.4.5 was used for processing the bathymetry data, and FMGT QPS version 7.10 was used to process the backscatter data. The backscatter data were then analyzed using the maximum likelihood classification tool in ArcGIS version 10.8. The result identified three types of sediment around the coral reef: very coarse sand, coarse sand, and medium sand.
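
The decision rule behind maximum likelihood classification can be illustrated for a single backscatter feature as below; the per-class means and standard deviations would come from the sieved ground-truth samples, and the numbers used here are hypothetical placeholders rather than survey values.

```python
# The Gaussian maximum-likelihood rule that underlies maximum likelihood
# classification, shown for a single backscatter feature. Class means and
# standard deviations are hypothetical placeholders, not values derived from
# the sieved ground-truth samples of this survey.
import math

CLASS_STATS = {          # mean and std of backscatter intensity (dB), per class
    "very coarse sand": (-14.0, 2.0),
    "coarse sand":      (-19.0, 2.5),
    "medium sand":      (-24.0, 3.0),
}

def log_likelihood(x, mean, std):
    return -0.5 * math.log(2 * math.pi * std**2) - (x - mean) ** 2 / (2 * std**2)

def classify(backscatter_db):
    return max(CLASS_STATS, key=lambda c: log_likelihood(backscatter_db, *CLASS_STATS[c]))

for sample in (-13.2, -20.5, -26.0):
    print(f"{sample:6.1f} dB -> {classify(sample)}")
```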

Keywords: sediment type, MBES echo sounder, backscatter, ArcGIS

Procedia PDF Downloads 65
2072 Salicylic Acid Signalling in Relation to Root Colonization in Rice

Authors: Seema Garcha, Sheetal Chopra, Navraj Sarao

Abstract:

Plant hormones play a role in internal colonization by beneficial microbes and also in systemic acquired resistance. They define the qualitative and quantitative nature of the root microbiome and also influence the dynamics of the root rhizospheric soil. The present study is an attempt to relate salicylic acid (signal molecule) content to the qualitative nature of root endophytes at various stages in the growth of rice varieties of commercial value - Parmal 121 and Basmati 1121. Seedlings of these varieties were raised using tissue culture techniques and then transplanted in the fields. Cultivation was done using conventional agricultural methods. The field soil contained 0.39% N, 75.12 kg/hectare of phosphorus, and 163.0 kg/hectare of potassium. Microfloral profiling of the root tissue was done using selective microbiological media. The salicylic acid content was estimated using an Agilent 1100 Series HPLC. The salicylic acid level of Basmati 1121 remained relatively low at the time of transplant and 90 days after transplant, increasing marginally at 60 days. A similar trend was observed with Parmal 121 as well; however, the Parmal variety recorded 0.935 µg/g of salicylic acid at 60 days after transplant. Salicylic acid content decreased after 90 days as both rice varieties remained disease free. The endophytic root microflora was established by 60 days after transplant in both varieties, after which their populations became constant. Rhizobium spp. dominated over Azotobacter spp. Genetic profiling of the endophytes for nitrogen-fixing ability is underway.

Keywords: plant-microbe interaction, rice, root microbiome, salicylic acid

Procedia PDF Downloads 186
2071 An Authentication Protocol for Quantum Enabled Mobile Devices

Authors: Natarajan Venkatachalam, Subrahmanya V. R. K. Rao, Vijay Karthikeyan Dhandapani, Swaminathan Saravanavel

Abstract:

The quantum communication technology is an evolving design which connects multiple quantum enabled devices to internet for secret communication or sensitive information exchange. In future, the number of these compact quantum enabled devices will increase immensely making them an integral part of present communication systems. Therefore, safety and security of such devices is also a major concern for us. To ensure the customer sensitive information will not be eavesdropped or deciphered, we need a strong authentications and encryption mechanism. In this paper, we propose a mutual authentication scheme between these smart quantum devices and server based on the secure exchange of information through quantum channel which gives better solutions for symmetric key exchange issues. An important part of this work is to propose a secure mutual authentication protocol over the quantum channel. We show that our approach offers robust authentication protocol and further our solution is lightweight, scalable, cost-effective with optimized computational processing overheads.
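
For orientation only, the sketch below shows a symmetric challenge-response mutual authentication keyed by a shared secret assumed to have been established over the quantum channel (for example, by QKD). It is an illustrative pattern, not the protocol proposed in the paper, and it omits identities, replay counters, and key rotation.

```python
# A sketch of symmetric challenge-response mutual authentication keyed by a
# shared secret assumed to have been established over the quantum channel
# (e.g., by QKD). Illustrative only; not the paper's protocol.
import hmac, hashlib, secrets

def tag(key: bytes, *parts: bytes) -> bytes:
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

def mutual_auth(shared_key: bytes) -> bool:
    # Device -> Server: nonce_d ; Server -> Device: nonce_s, proof_s
    nonce_d = secrets.token_bytes(16)
    nonce_s = secrets.token_bytes(16)
    proof_s = tag(shared_key, b"server", nonce_d, nonce_s)

    # Device verifies the server, then answers with its own proof
    if not hmac.compare_digest(proof_s, tag(shared_key, b"server", nonce_d, nonce_s)):
        return False
    proof_d = tag(shared_key, b"device", nonce_s, nonce_d)

    # Server verifies the device
    return hmac.compare_digest(proof_d, tag(shared_key, b"device", nonce_s, nonce_d))

print(mutual_auth(secrets.token_bytes(32)))   # True when both sides hold the same key
```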

Keywords: quantum cryptography, quantum key distribution, wireless quantum communication, authentication protocol, quantum enabled device, trusted third party

Procedia PDF Downloads 158
2070 Design and Performance Improvement of Three-Dimensional Optical Code Division Multiple Access Networks with NAND Detection Technique

Authors: Satyasen Panda, Urmila Bhanja

Abstract:

In this paper, we present and analyze three-dimensional (3-D) wavelength/time/space code matrices for optical code division multiple access (OCDMA) networks with a NAND subtraction detection technique. The 3-D codes are constructed by integrating a two-dimensional modified quadratic congruence (MQC) code with a one-dimensional modified prime (MP) code. The respective encoders and decoders were designed using fiber Bragg gratings and optical delay lines to minimize the bit error rate (BER). The performance analysis of the 3-D OCDMA system is based on measurements of the signal-to-noise ratio (SNR), BER, and eye diagram for different numbers of simultaneous users. In the analysis, various types of noise and multiple access interference (MAI) effects were also considered. The results obtained with the NAND detection technique were compared with those obtained with OR and AND subtraction techniques. The comparison proved that the NAND detection technique with the 3-D MQC/MP code can accommodate a larger number of simultaneous users over longer fiber distances with minimum BER compared to the OR and AND subtraction techniques. The received optical power is also measured at various levels of BER to analyze the effect of attenuation.
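
As a reference point for the BER-versus-SNR relationship used in such analyses, SAC-OCDMA studies commonly apply the Gaussian approximation BER = 0.5*erfc(sqrt(SNR/8)). The sketch below evaluates it for a few hypothetical SNR values; it is an illustration of the metric, not a reproduction of the paper's 3-D MQC/MP results.

```python
# BER from SNR via the Gaussian approximation commonly used in SAC-OCDMA
# performance analyses: BER = 0.5 * erfc(sqrt(SNR / 8)). SNR values are
# hypothetical, for illustration only.
import math

def ber_from_snr(snr_linear):
    return 0.5 * math.erfc(math.sqrt(snr_linear / 8.0))

for snr_db in (14, 18, 22, 26):
    snr = 10 ** (snr_db / 10.0)
    print(f"SNR = {snr_db:2d} dB -> BER ~ {ber_from_snr(snr):.3e}")
```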

Keywords: Cross Correlation (CC), Three dimensional Optical Code Division Multiple Access (3-D OCDMA), Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA), Multiple Access Interference (MAI), Phase Induced Intensity Noise (PIIN), Three Dimensional Modified Quadratic Congruence/Modified Prime (3-D MQC/MP) code

Procedia PDF Downloads 401
2069 Integrated Microsystem for Multiplexed Genosensor Detection of Biowarfare Agents

Authors: Samuel B. Dulay, Sandra Julich, Herbert Tomaso, Ciara K. O'Sullivan

Abstract:

Early, rapid, and definite detection of the presence of biowarfare agents, pathogens, viruses, and toxins is required in different situations, including civil rescue and security units, homeland security, military operations, and public transportation security at airports, metro, and railway stations, due to their harmful effect on the human population. In this work, an electrochemical genosensor array that allows simultaneous detection of different biowarfare agents within an integrated microsystem has been developed and optimised. The microsystem provides easy handling of the technology by combining a microfluidics setup with a multiplexing genosensor array and targets the following agents: Bacillus anthracis, Brucella abortus and melitensis, Bacteriophage lambda, Francisella tularensis, Burkholderia mallei and pseudomallei, Coxiella burnetii, Yersinia pestis, and Bacillus thuringiensis. The electrode array was modified via co-immobilisation of a 1:100 (mol/mol) mixture of a thiolated probe and an oligoethyleneglycol-terminated monopodal thiol. PCR products from these relevant biowarfare agents were detected reproducibly through a sandwich assay format, with the target hybridised between a surface-immobilised probe on the electrode and a horseradish peroxidase-labelled secondary reporter probe, which provided an enzyme-based electrochemical signal. The designed microsystem demonstrated high selectivity and high throughput for multiplexed genosensor detection, including cross-reactivity studies over potentially interfering DNA sequences.

Keywords: biowarfare agents, genosensors, multiplexed detection, microsystem

Procedia PDF Downloads 256
2068 Memory and Narratives Rereading before and after One Week

Authors: Abigail M. Csik, Gabriel A. Radvansky

Abstract:

As people read through event-based narratives, they construct an event model that captures information about the characters, goals, location, time, and causality. Memory for such narratives is represented at different levels, namely the surface form, textbase, and event model levels. Rereading has been shown to decrease surface form memory while, at the same time, increasing textbase and event model memories. More generally, distributed practice has consistently shown memory benefits over massed practice for different types of materials, including texts. However, little research has investigated distributed practice of narratives at different inter-study intervals and its effects on these three levels of memory. Recent work in our lab has indicated that there may be dramatic changes in patterns of forgetting around one week, which may affect the three levels of memory. The present experiment aimed to determine the effects of rereading on the three levels of memory as a function of whether the texts were reread before versus after one week. Participants (N = 42) read a set of stories, re-read them either before or after one week (with an inter-study interval of three days, seven days, or fourteen days), and then took a recognition test, from which the three levels of representation were derived. Signal detection results from this study reveal differential patterns at the three levels as a function of whether the narratives were re-read prior to one week or after one week. In particular, an ANOVA revealed that surface form memory was lower (p = .08) while textbase (p = .02) and event model memory (p = .04) were greater when narratives were re-read 14 days later compared to when narratives were re-read 3 days later. These results have implications for which types of memory benefit from distributed practice at various inter-study intervals.

Keywords: memory, event cognition, distributed practice, consolidation

Procedia PDF Downloads 204
2067 A Multi Sensor Monochrome Video Fusion Using Image Quality Assessment

Authors: M. Prema Kumar, P. Rajesh Kumar

Abstract:

The increasing interest in image fusion (combining images of two or more modalities, such as infrared and visible light radiation) has led to a need for accurate and reliable image assessment methods. This paper gives a novel approach for merging the information content from several videos taken of the same scene in order to build a combined video that contains the finest information coming from the different source videos. This process is known as video fusion, and it helps provide an image of superior quality (where 'quality' connotes a measurement specific to the particular application) compared to the source images. In this technique, different sensors, whose redundant information can be reduced, are used with the various cameras that are imperative for capturing the required images. In this paper, an image fusion technique based on multi-resolution singular value decomposition (MSVD) has been used. Image fusion by MSVD is almost similar to that by wavelets: the idea behind MSVD is to replace the FIR filters in the wavelet transform with singular value decomposition (SVD). It is computationally very simple and is well suited for real-time applications such as remote sensing and astronomy.
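
A simplified, block-wise illustration of SVD-guided fusion is given below: for each 8x8 block, the source frame whose block carries more detail energy in its non-dominant singular values is kept. This is a toy stand-in for intuition only, not a reimplementation of the paper's multi-resolution SVD (MSVD) scheme.

```python
# A simplified, block-wise illustration of SVD-guided fusion: for each 8x8
# block, keep the block from whichever source frame carries more "detail
# energy" in its non-dominant singular values. A toy stand-in for the
# multi-resolution SVD (MSVD) scheme, not a reimplementation of it.
import numpy as np

def detail_energy(block):
    s = np.linalg.svd(block, compute_uv=False)
    return float(np.sum(s[1:] ** 2))        # energy beyond the dominant component

def fuse(frame_a, frame_b, bs=8):
    assert frame_a.shape == frame_b.shape
    fused = frame_a.copy()
    h, w = frame_a.shape
    for i in range(0, h - h % bs, bs):
        for j in range(0, w - w % bs, bs):
            a = frame_a[i:i+bs, j:j+bs]
            b = frame_b[i:i+bs, j:j+bs]
            if detail_energy(b) > detail_energy(a):
                fused[i:i+bs, j:j+bs] = b
    return fused

rng = np.random.default_rng(0)
ir, visible = rng.random((64, 64)), rng.random((64, 64))   # stand-in monochrome frames
print(fuse(ir, visible).shape)
```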

Keywords: multi sensor image fusion, MSVD, image processing, monochrome video

Procedia PDF Downloads 554
2066 Enabling Oral Communication and Accelerating Recovery: The Creation of a Novel Low-Cost Electroencephalography-Based Brain-Computer Interface for the Differently Abled

Authors: Rishabh Ambavanekar

Abstract:

Expressive Aphasia (EA) is an oral disability, common among stroke victims, in which Broca's area of the brain is damaged, interfering with verbal communication abilities. EA currently has no dedicated technological solution, and its existing viable options are inefficient or available only to the affluent. This prompts the need for an affordable, innovative solution to facilitate recovery and assist in speech generation. This project proposes a novel concept: using a wearable, low-cost electroencephalography (EEG) device-based brain-computer interface (BCI) to translate a user's inner dialogue into words. A low-cost EEG device was developed and found to be 10 to 100 times less expensive than any current EEG device on the market. As part of the BCI, a machine learning (ML) model was developed and trained using the EEG data. Two stages of testing were conducted to analyze the effectiveness of the device: a proof-of-concept test and a final solution test. The proof-of-concept test demonstrated an average accuracy above 90%, and the final solution test demonstrated an average accuracy above 75%. These two successful tests were used as a basis to demonstrate the viability of BCI research in developing lower-cost verbal communication devices. Additionally, the device proved not only to enable users to communicate verbally but also to have the potential to assist in accelerated recovery from the disorder.
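
For readers unfamiliar with such pipelines, a generic outline of EEG classification is sketched below (band-pass filtering, band-power features, a lightweight classifier); the synthetic data, sampling rate, and model choice are stand-ins, not the project's actual model or electrode montage.

```python
# A generic outline of an EEG classification pipeline: band-pass filter each
# epoch, extract simple band-power features, train a lightweight classifier.
# Synthetic data and parameter choices are stand-ins; this is not the
# project's actual model.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

FS = 250                                             # sampling rate (Hz), assumed

def bandpower_features(epochs, low=8.0, high=30.0):
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    return np.log(np.mean(filtered ** 2, axis=-1))   # log band power per channel

rng = np.random.default_rng(0)
epochs = rng.normal(size=(120, 8, 2 * FS))           # 120 epochs, 8 channels, 2 s each
labels = rng.integers(0, 2, size=120)                # two imagined-word classes (toy)

X = bandpower_features(epochs)
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, X, labels, cv=5).mean())  # chance-level on random data
```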

Keywords: neurotechnology, brain-computer interface, neuroscience, human-machine interface, BCI, HMI, aphasia, verbal disability, stroke, low-cost, machine learning, ML, image recognition, EEG, signal analysis

Procedia PDF Downloads 106
2065 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection

Authors: S. Delgado, C. Cerrada, R. S. Gómez

Abstract:

This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges in voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times. These repeated voxels incur costly memory operations with no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing the triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability. Written as a single compute shader in the OpenGL Shading Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency, but also its ability to produce accurate, 26-tunnel-free voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line-based voxelization methods. It also aids in understanding how the Gap Detection technique effectively improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods. The Gap Detection technique fills a critical gap in the voxelization process. By addressing these gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.
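
To convey the core idea in two dimensions, the sketch below marches equidistant scan-lines through a triangle at cell-centre heights and fills the cells between the edge intersections, touching each cell once; the real method operates in 3-D as a GLSL compute shader and adds the Gap Detection pass, so this is only a didactic analogue.

```python
# A 2-D analogue of the equidistant scan-line idea: march horizontal scan-lines
# through a triangle at cell-centre heights and fill the cells between the edge
# intersections. The actual method runs in 3-D as a GLSL compute shader and
# adds gap detection; this is only a didactic sketch.
import math

def scanline_cells(tri, cell=1.0):
    (x0, y0), (x1, y1), (x2, y2) = tri
    cells = set()
    ymin, ymax = min(y0, y1, y2), max(y0, y1, y2)
    row = math.floor(ymin / cell)
    while (row + 0.5) * cell <= ymax:
        y = (row + 0.5) * cell                      # equidistant scan-line height
        xs = []
        for (ax, ay), (bx, by) in ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])):
            if (ay <= y < by) or (by <= y < ay):    # edge crosses this scan-line
                t = (y - ay) / (by - ay)
                xs.append(ax + t * (bx - ax))
        if len(xs) == 2:
            lo, hi = sorted(xs)
            for col in range(math.floor(lo / cell), math.floor(hi / cell) + 1):
                cells.add((col, row))               # each cell is touched exactly once
        row += 1
    return cells

print(len(scanline_cells([(0.2, 0.1), (7.8, 1.3), (3.5, 6.9)])))
```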

Keywords: voxelization, GPU acceleration, computer graphics, compute shaders

Procedia PDF Downloads 52
2064 Research on the Aero-Heating Prediction Based on Hybrid Meshes and Hybrid Schemes

Authors: Qiming Zhang, Youda Ye, Qinxue Jiang

Abstract:

Accurate prediction of the external flowfield and the aero-heating at the wall of a hypersonic vehicle is crucial for aircraft design. Unstructured/hybrid meshes have stronger advantages than structured meshes in terms of pre-processing, parallel computing, and mesh adaptation, so it is imperative to develop high-resolution numerical methods for calculating the aerothermal environment on unstructured/hybrid meshes. The inviscid flux scheme is one of the most important factors affecting the accuracy of heat flux calculation on unstructured/hybrid meshes. Here, a new hybrid flux scheme is developed and an approach for interface type selection is proposed: 1) the exact Riemann solution is used to calculate the flux on the faces parallel to the wall; 2) the Steger-Warming (S-W) scheme is employed to improve the stability of the numerical scheme on the other interfaces. The computed heat flux fits the experimental observations and shows little dependence on the grid, which indicates great application prospects on unstructured/hybrid meshes.
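
The interface-type selection rule can be written schematically as below: faces whose normals align with the wall normal (i.e., faces parallel to the wall) are routed to the exact Riemann flux, and all others to the Steger-Warming flux. The two flux routines are left as stubs, so only the dispatch logic is illustrated, and the alignment tolerance is an assumed value.

```python
# A schematic of the interface-type selection rule: faces parallel to the wall
# (face normal aligned with the wall normal) get the exact Riemann flux, all
# other faces get the Steger-Warming flux. The flux routines are stubs; only
# the dispatch logic is illustrated, and the tolerance is an assumption.
import numpy as np

def exact_riemann_flux(left_state, right_state, normal):
    raise NotImplementedError("exact Riemann solver goes here")

def steger_warming_flux(left_state, right_state, normal):
    raise NotImplementedError("Steger-Warming flux vector splitting goes here")

def interface_flux(left_state, right_state, face_normal, wall_normal, tol=0.05):
    n_f = np.asarray(face_normal, dtype=float)
    n_f /= np.linalg.norm(n_f)
    n_w = np.asarray(wall_normal, dtype=float)
    n_w /= np.linalg.norm(n_w)
    if abs(float(np.dot(n_f, n_w))) > 1.0 - tol:      # face parallel to the wall
        return exact_riemann_flux(left_state, right_state, n_f)
    return steger_warming_flux(left_state, right_state, n_f)
```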

Keywords: aero-heating prediction, computational fluid dynamics, hybrid meshes, hybrid schemes

Procedia PDF Downloads 219
2063 Quantifying Product Impacts on Biodiversity: The Product Biodiversity Footprint

Authors: Leveque Benjamin, Rabaud Suzanne, Anest Hugo, Catalan Caroline, Neveux Guillaume

Abstract:

Human consumption of products is one of the main drivers of biodiversity loss. However, few pertinent ecological indicators regarding product life cycle impacts on species and ecosystems have been built. Life cycle assessment (LCA) methodologies are well under way to conceive standardized methods to assess this impact, already taking partially into account three of the Millennium Ecosystem Assessment pressures (land use, pollution, climate change). Coupling LCA with ecological data and methods is an emerging challenge in developing a product biodiversity footprint. This approach was tested on three case studies from the food processing, textile, and cosmetic industries. It allowed us, first, to improve the environmental relevance of the Potentially Disappeared Fraction (PDF) of species, the end-point indicator typically used in life cycle analysis methods, and second, to introduce new indicators on overexploitation and invasive species. This type of footprint is a major step in helping companies to identify their impacts on biodiversity and to propose potential improvements.

Keywords: biodiversity, companies, footprint, life cycle assessment, products

Procedia PDF Downloads 310
2062 Organization of the Olfactory System and the Mushroom Body of the Weaver Ant, Oecophylla smaragdina

Authors: Rajashekhar K. Patil, Martin J. Babu

Abstract:

Weaver ants (Oecophylla smaragdina) live in colonies that have polymorphic castes. The females, which include the queen and the major and minor workers, are haploid. Individuals of each caste depend on olfactory cues for carrying out caste-specific behaviour. In an effort to understand whether organizational differences exist to support these behavioural differences, we studied the olfactory system at the level of the sensilla on the antennae, the olfactory glomeruli, and the Kenyon cells in the mushroom bodies (MB). The MB of major and minor workers differ in size, with the major workers having relatively larger calyces and peduncles. The morphology of the different types of Kenyon cells, as revealed by Golgi-rapid staining, was studied, and the major workers had more dendritic arbors than minor workers. This suggests a greater degree of olfactory processing in major workers. Differences in the caste-specific arrangement of sensilla and olfactory glomeruli and in the cellular architecture of the MB indicate a developmental programme that forms the basis of differential behaviour.

Keywords: ant, oecophylla, caste, mushroom body

Procedia PDF Downloads 460
2061 Conceptualizing IoT Based Framework for Enhancing Environmental Accounting By ERP Systems

Authors: Amin Ebrahimi Ghadi, Morteza Moalagh

Abstract:

This research is carried out to find out how a perfect combination of IoT (Internet of Things) architecture and an ERP system can strengthen environmental accounting to incorporate both economic and environmental information. IoT components (e.g., sensors, software, and other technologies) can be used across the company's value chain, from raw material extraction through materials processing, manufacturing, distribution, use, repair, maintenance, and disposal or recycling of products (the cradle-to-grave model). The ERP software will then have the capability to track both midpoint and endpoint environmental impacts in a green supply chain system for the whole life cycle of a product. All of this enables environmental accounting to calculate and analyze operational environmental impacts in real time, control costs, prepare for environmental legislation, and enhance the decision-making process. In this study, we have developed a model of how to use IoT devices in life cycle assessment (LCA) to gather emissions, energy consumption, hazard, and waste information to be processed in the different modules of an ERP system in an integrated way, for use in environmental accounting to achieve sustainability.
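
A minimal sketch of the data flow just described is shown below: an IoT meter reading tagged with its life-cycle stage is converted into an emissions record that an ERP environmental-accounting module could consume. The field names and emission factor are hypothetical placeholders.

```python
# A minimal sketch of the described data flow: an IoT meter reading, tagged
# with its life-cycle stage, becomes an emissions record that an ERP
# environmental-accounting module could consume. Field names and the emission
# factor are hypothetical placeholders.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    asset_id: str
    lifecycle_stage: str          # e.g. "manufacturing", "distribution", "use"
    kwh: float
    timestamp: datetime

@dataclass
class EmissionRecord:
    asset_id: str
    lifecycle_stage: str
    kg_co2e: float
    timestamp: datetime

GRID_FACTOR_KG_PER_KWH = 0.4      # assumed grid emission factor

def to_emission_record(reading: SensorReading) -> EmissionRecord:
    return EmissionRecord(reading.asset_id, reading.lifecycle_stage,
                          reading.kwh * GRID_FACTOR_KG_PER_KWH, reading.timestamp)

reading = SensorReading("press-07", "manufacturing", 125.0, datetime.now(timezone.utc))
print(to_emission_record(reading))
```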

Keywords: ERP, environmental accounting, green supply chain, IOT, life cycle assessment, sustainability

Procedia PDF Downloads 159
2060 Recovery of Value-Added Whey Proteins from Dairy Effluent Using Aqueous Two-Phase System

Authors: Perumalsamy Muthiah, Murugesan Thanapalan

Abstract:

The remains of cheese production (whey) contain nutritionally valuable proteins, viz. α-lactalbumin and β-lactoglobulin, and represent 80-90% of the total volume of milk entering the process. Although several possibilities for cheese-whey exploitation have been assayed, approximately half of world cheese-whey production is not treated but is discarded as effluent. It is therefore necessary to develop an effective and environmentally benign extraction process for the recovery of these value-added cheese whey proteins. Recently, aqueous two-phase systems (ATPS) have emerged as a potential separation process, particularly in the field of biotechnology, due to the mild conditions of the process, short processing time, and ease of scale-up. In order to design an ATPS process for the recovery of cheese whey proteins, the development of the phase diagram and the effect of system parameters such as pH, the types and concentrations of the phase-forming components, temperature, etc., on the partitioning of the proteins were addressed in order to maximize protein recovery. Some of the practical problems encountered in the application of aqueous two-phase systems for the recovery of cheese whey proteins are also discussed.
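
As a worked illustration of how partitioning data feed into recovery estimates, the top-phase yield is often written as Y_top = K*R/(1 + K*R), with K the partition coefficient (C_top/C_bottom) and R the phase volume ratio (V_top/V_bottom); the K and R values in the sketch below are hypothetical, not measurements from this work.

```python
# Top-phase recovery in an ATPS estimated from the partition coefficient
# K = C_top / C_bottom and the phase volume ratio R = V_top / V_bottom via
# Y_top = K*R / (1 + K*R). The K and R values are hypothetical illustrations.
def top_phase_yield(partition_coefficient, volume_ratio):
    kr = partition_coefficient * volume_ratio
    return kr / (1.0 + kr)

for k, r in ((2.5, 1.0), (6.0, 1.5), (6.0, 0.5)):
    print(f"K = {k:3.1f}, R = {r:3.1f} -> top-phase protein yield = {top_phase_yield(k, r):.1%}")
```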

Keywords: aqueous two-phase system, phase diagram, extraction, cheese whey

Procedia PDF Downloads 396
2059 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data is the fundamental tool utilized by exploration companies to identify potential hydrocarbons. However, the value of seismic trace data will be undermined unless the geo-spatial component of the data is understood. Deriving a proposed well to be drilled from data that has positional ambiguity will jeopardize business decisions and millions of dollars of investment, an outcome every oil and gas company would like to avoid. A spatial integrity QC workflow has been introduced in PETRONAS to ensure that positional errors within the seismic data are recognized throughout the exploration lifecycle, from acquisition and processing to seismic interpretation. This includes, amongst other tests, verifying that the data is referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying the geometry loading. The direct outcome of the workflow implementation helps improve the reliability and integrity of the sub-surface geological models produced by geoscientists and provides important input to potential hazard assessment, where positional accuracy is crucial. This workflow development initiative is part of a bigger geospatial integrity management effort, given that nearly eighty percent of oil and gas data are location-dependent.
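
One of the checks named above, confirming that data are referenced to the intended coordinate reference system, can be sketched with pyproj as below: navigation coordinates are projected and compared against trusted surveyed positions, and points that land beyond a tolerance are flagged. The EPSG codes, tolerance, and sample points are hypothetical.

```python
# A sketch of one spatial-integrity check: project navigation coordinates and
# flag points that land farther than a tolerance from trusted surveyed
# positions. EPSG codes, tolerance and sample points are hypothetical.
from math import hypot
from pyproj import Transformer

def flag_positional_errors(points_lonlat, trusted_xy, transformer, tol_m=10.0):
    flagged = []
    for (lon, lat), (x_ref, y_ref) in zip(points_lonlat, trusted_xy):
        x, y = transformer.transform(lon, lat)
        shift = hypot(x - x_ref, y - y_ref)
        if shift > tol_m:
            flagged.append(((lon, lat), round(shift, 1)))
    return flagged

transformer = Transformer.from_crs("EPSG:4326", "EPSG:32648", always_xy=True)  # WGS84 -> UTM 48N
nav = [(103.25, 5.60), (103.26, 5.61)]                      # shot-point coordinates (toy)
trusted = [transformer.transform(lon, lat) for lon, lat in nav]
trusted[1] = (trusted[1][0] + 25.0, trusted[1][1])          # simulate a 25 m positional bust
print(flag_positional_errors(nav, trusted, transformer))    # only the busted point is flagged
```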

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 203
2058 Optimization of Multiplier Extraction Digital Filter On FPGA

Authors: Shiksha Jain, Ramesh Mishra

Abstract:

One of the most widely used signal processing operations is filtering. FIR (Finite Impulse Response) digital filters are widely used in DSP to alter the spectrum according to given specifications. Power consumption and area complexity in FIR filter implementations are mainly caused by the multipliers, so we present a multiplier-less technique, the Distributed Arithmetic (DA) technique. In this technique, precomputed values of the inner product are stored in a LUT and are then added and shifted, with the number of iterations equal to the precision of the input samples. However, the exponential growth of the LUT with the order of the FIR filter makes this basic structure prohibitive for many applications. A significant area and power reduction over the traditional Distributed Arithmetic (DA) structure is presented in this paper through slicing of the LUT to the desired length. An architecture of a 16-tap FIR filter is presented with different LUT slice lengths. Implementation results for the FIR filter obtained with the Xilinx ISE synthesis tool (XST) on a Virtex-4 FPGA using the proposed method show an increase in the maximum frequency, a decrease in resource usage (area savings with a larger number of slices), and a reduction in dynamic power.
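
The DA idea with LUT slicing can be mirrored in software as in the sketch below: the 16 taps are split into slices of four, each slice gets its own 16-entry LUT of precomputed coefficient sums, and the per-bit partial sums are shifted and accumulated over the input word length. The integer coefficients, sample values, and 8-bit word length are illustrative choices, and the direct convolution at the end serves as a check.

```python
# Bit-serial distributed arithmetic (DA) for a 16-tap FIR with LUT slicing,
# mirrored in software: each slice of 4 taps has its own 2^4-entry LUT of
# precomputed coefficient sums; per-bit partial sums are shifted and
# accumulated over the input word length (two's complement handled at the MSB).
def build_luts(coeffs, slice_len=4):
    luts = []
    for start in range(0, len(coeffs), slice_len):
        chunk = coeffs[start:start + slice_len]
        lut = [sum(c for bit, c in enumerate(chunk) if addr >> bit & 1)
               for addr in range(1 << len(chunk))]
        luts.append(lut)
    return luts

def da_fir_output(samples, luts, slice_len=4, word_bits=8):
    """samples: the N most recent inputs (two's complement, word_bits wide)."""
    acc = 0
    for b in range(word_bits):
        bit_plane = 0
        for s, lut in enumerate(luts):
            addr = 0
            for k in range(slice_len):
                idx = s * slice_len + k
                if idx < len(samples) and (samples[idx] >> b) & 1:
                    addr |= 1 << k
            bit_plane += lut[addr]
        acc += -(bit_plane << b) if b == word_bits - 1 else (bit_plane << b)
    return acc

coeffs = [1, 2, 3, 4, 4, 3, 2, 1, 1, 2, 3, 4, 4, 3, 2, 1]      # 16 symmetric taps
samples = [5, -3, 7, 2, 0, 1, -8, 4, 6, -1, 3, 2, -5, 7, 0, 1]
luts = build_luts(coeffs)
print(da_fir_output(samples, luts))
print(sum(c * x for c, x in zip(coeffs, samples)))             # direct convolution check
```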

Keywords: multiplier less technique, linear phase symmetric FIR filter, FPGA tool, look up table

Procedia PDF Downloads 377
2057 LGBTQ+ Visibility: An Analysis of the Mechanisms for Safeguarding Sexual Minorities within the Common European Asylum System

Authors: Alessandra Tosi, Teia M. Rogers

Abstract:

The Common European Asylum System (CEAS) is the framework that standardises the treatment of applicants for international protection and harmonises asylum systems throughout the European Union. This paper interrogates the rules applied within the CEAS, specifically Directive 2013/33/EU of the European Parliament and of the Council of 26 June 2013, which puts forth the standards for the reception of vulnerable people applying for asylum. Absent from the definition of ‘vulnerable people’ are sexual minorities, who routinely experience discrimination in reception centres and emergency accommodations. This paper undertakes an analysis of the policies and legislation governing reception centres within the European Union. In confronting the flaws inherent to the system of processing asylum applications, this paper argues for the reform of the CEAS with emphasis on the inclusion of LGBTQ+ asylum seekers as vulnerable people, following standards set by international human rights law.

Keywords: accommodation, asylum seekers, CEAS, Common European Asylum System, European Union, LGBTQ+, reception conditions, vulnerable people

Procedia PDF Downloads 123