Search results for: parallel processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4646

2846 A Corpus-Based Study of Subtitling Religious Words into Arabic

Authors: Yousef Sahari, Eisa Asiri

Abstract:

Hollywood films are produced in an open and liberal context, and when subtitling for a more conservative and closed society such as an Arabic one, religious words can pose a thorny challenge for subtitlers. Using a corpus of 90 Hollywood films released between 2000 and 2018 and applying insights from Descriptive Translation Studies (Toury, 1995, 2012) and the dichotomy of domestication and foreignization, this paper investigates three main research questions: (1) What are the dominant religious terms and functions in the English subtitles? (2) What are the dominant translation strategies used in the translation of religious words? (3) Do these strategies tend to be SL-oriented or TL-oriented (domesticating or foreignizing)? To answer these research questions, a quantitative and qualitative analysis of the corpus is conducted, in which the researcher adopts a self-designed, parallel, aligned corpus of the ninety films and their Arabic subtitles. A quantitative analysis is performed to compare the frequencies and distribution of religious words, their functions, and the translation strategies employed by the subtitlers of the ninety films, with the aim of identifying similarities or differences, as well as the impact of the functions of religious terms on the use of subtitling strategies. Based on the quantitative analysis, a qualitative analysis is performed to identify any translational patterns in the Arabic translations of religious words and the possible reasons for the subtitlers' choices. The results show that the function of religious words has a strong influence on the choice of subtitling strategies. It is also found that foreignization strategies are applied in about two-thirds of the total occurrences of religious words.

Keywords: religious terms, subtitling, audiovisual translation, Modern Standard Arabic, subtitling strategies, English-Arabic subtitling

Procedia PDF Downloads 138
2845 A Sub-Scalar Approach to the MIPS Architecture

Authors: Kumar Sambhav Pandey, Anamika Singh

Abstract:

Continuous research in the field of computer architecture basically aims at accelerating computational speed and gaining enhanced performance. In this era of superscalar designs, the sub-scalar concept has not gained enough attention for improving computational performance. In this paper, we present a sub-scalar approach that utilizes the parallelism present within the data being processed. The main idea is to split the data into individual smaller entities, which are then processed with a defined, known set of instructions. This sub-scalar approach to the MIPS architecture can bring significant improvement in computational speedup. MIPS-I is the base design taken into consideration for the development of a sub-scalar MIPS64 that increases instruction-level parallelism (ILP) and resource utilization.
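To make the sub-scalar idea concrete, here is a minimal Python sketch (illustrative only, not the authors' MIPS64 design): a 64-bit dataword is split into byte-sized entities, each processed with the same known operation, then reassembled.

```python
# Illustrative sketch (not from the paper): split a 64-bit dataword into
# byte-sized entities, process each independently, and merge the results.

def split_dataword(word64: int, width: int = 8) -> list[int]:
    """Split a 64-bit word into 64/width independent entities."""
    mask = (1 << width) - 1
    return [(word64 >> shift) & mask for shift in range(0, 64, width)]

def merge_dataword(parts: list[int], width: int = 8) -> int:
    """Reassemble the entities into a 64-bit word."""
    word = 0
    for i, p in enumerate(parts):
        word |= (p & ((1 << width) - 1)) << (i * width)
    return word

def subscalar_saturating_inc(word64: int, width: int = 8) -> int:
    # Each entity gets the same instruction (here: +1, saturating at the
    # entity's maximum) instead of one monolithic 64-bit operation.
    top = (1 << width) - 1
    parts = [min(p + 1, top) for p in split_dataword(word64, width)]
    return merge_dataword(parts, width)

if __name__ == "__main__":
    w = 0x00FF10FF_2003FFAA
    print(hex(subscalar_saturating_inc(w)))  # each byte +1, 0xFF saturates
```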

Keywords: dataword, MIPS, processor, sub-scalar

Procedia PDF Downloads 529
2844 An Efficient Clustering Technique for Copy-Paste Attack Detection

Authors: N. Chaitawittanun, M. Munlin

Abstract:

Due to the rapid advancement of powerful image processing software, digital images are easy for ordinary people to manipulate and modify. Many digital images are edited for a specific purpose, making them more difficult to distinguish from their originals. We propose a clustering method to detect copy-move image forgery in JPEG, BMP, TIFF, and PNG files. The process starts with reducing the color of the photos. Then, we use a clustering technique that groups the measured data by Hausdorff distance. The results show that the proposed method is capable of inspecting the image file and correctly identifying the forgery.
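As an illustration of the block-clustering idea (the paper's exact parameters are not given, so the block size and threshold below are assumptions), here is a Python sketch using the Hausdorff distance from SciPy: near-identical blocks at different positions are candidate copy-move regions.

```python
# Toy sketch: color reduction, block decomposition, and pairing of blocks
# whose symmetric Hausdorff distance is small. The quadratic loop is for
# clarity; practical detectors sort or index the blocks instead.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def color_reduce(img: np.ndarray, levels: int = 8) -> np.ndarray:
    step = 256 // levels
    return (img // step) * step

def block_point_set(block: np.ndarray) -> np.ndarray:
    # Represent a block by the coordinates of its darker pixels.
    ys, xs = np.nonzero(block < block.mean())
    return np.column_stack([ys, xs]) if len(ys) else np.zeros((1, 2))

def find_copy_move(img: np.ndarray, bs: int = 16, thresh: float = 1.0):
    img = color_reduce(img)
    h, w = img.shape
    blocks, origins = [], []
    for y in range(0, h - bs + 1, bs):
        for x in range(0, w - bs + 1, bs):
            blocks.append(block_point_set(img[y:y + bs, x:x + bs]))
            origins.append((y, x))
    pairs = []
    for i in range(len(blocks)):
        for j in range(i + 1, len(blocks)):
            d = max(directed_hausdorff(blocks[i], blocks[j])[0],
                    directed_hausdorff(blocks[j], blocks[i])[0])
            if d <= thresh:                 # near-duplicate distant blocks
                pairs.append((origins[i], origins[j]))
    return pairs
```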

Keywords: image detection, forgery image, copy-paste, attack detection

Procedia PDF Downloads 326
2843 Development of the Religious Out-Group Aggression Scale

Authors: Rylle Evan Gabriel Zamora, Micah Dennise Malia, Abygail Deniese Villabona

Abstract:

When examining studies on aggression, studies about individual aggression vastly outnumber those on group aggression. Given that aggression is violent and cyclical in nature, and given the number of violent events that have occurred in the recent past, the study of group aggression is relevant now more than ever. This discrepancy is mirrored in the small number of valid and reliable psychological tests that measure group aggression. Throughout history, one of the biggest causes of group-based violence and aggression has been religion. This is particularly true within the context of the Philippines, as there are a large number of religious groups. Thus, this study aimed to develop a standardized test that measures an individual’s tendency to be aggressive toward those who are in conflict with his or her religious beliefs. The study uses a test development design that includes a qualitative phase to ensure the validity of the scale, and was divided into three phases. First is a pilot test, wherein an instrument was designed from existing literature and then administered to 173 respondents from the four largest religious groups in the Philippines. After extensive factor analysis and reliability testing, new items were formed from qualitative data collected from eight participants, consisting of two individuals per religious group. The final testing integrates all statistically significant items from the first phase and the newly formed items from the second phase, which were administered to 200 respondents. The results were then tested further for reliability using Cronbach’s alpha and for validity through factor analysis. The items that proved significant were combined to create a final instrument that may be used in future studies.
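For reference, the reliability statistic named above is straightforward to compute; here is a short Python illustration of Cronbach's alpha on made-up response data (not the study's):

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/var(total)).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                           # 200 respondents
responses = latent + rng.normal(scale=0.8, size=(200, 10))   # 10 items
print(round(cronbach_alpha(responses), 2))                   # ~0.9 here
```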

Keywords: religious aggression, group aggression, test development, psychological assessment, social psychology

Procedia PDF Downloads 281
2842 3D Geomechanical Model the Best Solution of the 21st Century for Perforation's Problems

Authors: Luis Guiliana, Andrea Osorio

Abstract:

The lack of comprehension of reservoir geomechanics conditions may cause operational problems that cost the industry billions of dollars per year. Drilling operations at the Ceuta Field, Area 2 South, Maracaibo Lake, have been very expensive due to problems associated with drilling. The principal objective of this investigation is to develop a 3D geomechanical model of this area in order to optimize future drilling in the field. For this purpose, a 1D geomechanical model was built in the first instance, following the MEM (Mechanical Earth Model) workflow, which consists of the following steps: 1) data auditing, 2) analysis of drilling events and the structural model, 3) mechanical stratigraphy, 4) overburden stress, 5) pore pressure, 6) rock mechanical properties, 7) horizontal stresses, 8) direction of the horizontal stresses, and 9) wellbore stability. The 3D MEM was developed from the geostatistical model of the Eocene C-SUP VLG-3676 reservoir and the 1D MEM, and the geomechanical grid was embedded with these data. The analysis of the results showed that the problems in the wells examined were mainly due to wellbore stability issues. It was determined that the stress field changes as the stratigraphic column deepens: it is normal to strike-slip at the Middle Miocene and Lower Miocene, and strike-slip to reverse at the Eocene. In agreement with this, at the level of the Eocene, the most advantageous drilling direction is parallel to the maximum horizontal stress (157º). The 3D MEM allowed a three-dimensional visualization of the variations in rock mechanical properties, stresses, and operational windows (mud weight and pressures). This will facilitate the optimization of future drilling in the area, including zones without any geomechanical information.
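As a pointer to how step 4 of the MEM workflow is computed, here is a minimal Python sketch (with an invented density trend, not field data) of overburden stress as the depth integral of the density log, S_v(z) = ∫ρ·g dz:

```python
# Overburden stress from a density log by cumulative trapezoidal integration.
import numpy as np
from scipy.integrate import cumulative_trapezoid

g = 9.81                                   # m/s^2
depth = np.linspace(0.0, 3000.0, 301)      # m
rho = 1900.0 + 0.25 * depth                # kg/m^3, toy compaction trend

sv_pa = cumulative_trapezoid(rho * g, depth, initial=0.0)
print(f"Overburden at {depth[-1]:.0f} m: {sv_pa[-1] / 1e6:.1f} MPa")
```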

Keywords: geomechanics, MEM, drilling, stress

Procedia PDF Downloads 259
2841 Thin Films of Glassy Carbon Prepared by Cluster Deposition

Authors: Hatem Diaf, Patrice Melinon, Antonio Pereira, Bernard Moine, Nicholas Blanchard, Florent Bourquard, Florence Garrelie, Christophe Donnet

Abstract:

Glassy carbon exhibits excellent biological compatibility with live tissues, meaning it has high potential for applications in the life sciences. Moreover, glassy carbon has interesting properties, including high-temperature resistance, hardness, low density, low electrical resistance, low friction, and low thermal resistance. The structure of glassy carbon has long been a subject of debate. It is now accepted that glassy carbon is 100% sp2. This term is a little confusing, as sp2 hybridization defined from quantum chemistry relates to both properties: threefold configuration and pi bonding (parallel pz orbitals). Using plasma laser deposition of carbon clusters combined with pulsed nano/femtosecond laser annealing, we are able to synthesize thin films of glassy carbon of good quality (probed by the G band/D disorder band ratio in Raman spectroscopy) without thermal post-annealing. A careful inspection of the Raman signal, the plasmon losses, and the structure observed by HRTEM (high-resolution transmission electron microscopy) reveals that both properties (threefold configuration and pi orbitals) cannot coexist together. The structure of the films is compared to models including schwarzites, based on negatively curved surfaces, as opposed to onion- or fullerene-like structures with positively curved surfaces. This study shows that a huge collection of porous carbons named vitreous carbon, with different structures, can coexist.

Keywords: glassy carbon, cluster deposition, coating, electronic structure

Procedia PDF Downloads 302
2840 Path-Spin to Spin-Spin Hybrid Quantum Entanglement: A Conversion Protocol

Authors: Indranil Bayal, Pradipta Panchadhyayee

Abstract:

Path-spin hybrid entanglement generated and confined in a single spin-1/2 particle is converted to spin-spin hybrid interparticle entanglement, which finds important applications in quantum information processing. The protocol uses a beam splitter, a spin flipper, spin measurement, a classical channel, unitary transformations, etc., and requires no collective operation on the pair of particles, whose spin variables share complete entanglement after the accomplishment of the protocol. The specialty of the protocol lies in the fact that path-spin entanglement is transferred to the spin degrees of freedom of two separate particles, initially possessed by a single party.
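As a toy illustration of this kind of conversion (a simplified stand-in, not the authors' exact protocol), here is a numpy sketch in which a path-conditioned spin flip followed by a path measurement leaves the two spins in a Bell state:

```python
# Toy state-vector simulation: path-spin entanglement of one particle is
# converted into spin-spin entanglement between two particles.
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def kron(*vs):
    out = vs[0]
    for v in vs[1:]:
        out = np.kron(out, v)
    return out

# Qubit order: path p, spin s1 (same particle), spin s2 (second particle).
# Initial state: (|0>_p |0>_s1 + |1>_p |1>_s1)/sqrt(2) tensor |0>_s2.
psi = (kron(ket0, ket0, ket0) + kron(ket1, ket1, ket0)) / np.sqrt(2)

# Spin flip on s2 conditioned on the path (a CNOT with p as control).
cnot = np.zeros((8, 8))
for p in (0, 1):
    for s1 in (0, 1):
        for s2 in (0, 1):
            cnot[4 * p + 2 * s1 + (s2 ^ p), 4 * p + 2 * s1 + s2] = 1.0
psi = cnot @ psi   # GHZ state (|000> + |111>)/sqrt(2)

# Measure the path in the X basis; the "+" outcome is shown here, and the
# "-" outcome would need a Z correction on s1 sent over a classical channel.
plus = (ket0 + ket1) / np.sqrt(2)
psi = np.kron(np.outer(plus, plus), np.eye(4)) @ psi
psi /= np.linalg.norm(psi)

spins = psi.reshape(2, 4)[0]
spins /= np.linalg.norm(spins)
print(np.round(spins, 3))  # [0.707 0. 0. 0.707]: s1, s2 share a Bell state
```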

Keywords: entanglement, path-spin entanglement, spin-spin entanglement, CNOT operation

Procedia PDF Downloads 180
2839 Detection Characteristics of the Random and Deterministic Signals in Antenna Arrays

Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev

Abstract:

In this paper, an approach to incoherent signal detection in a multi-element antenna array is researched and modeled. Two types of useful signals with unknown wavefronts were considered: the first deterministic (a Barker code), the second random (Gaussian distributed). The derivation of the sufficient statistics took into account the linearity of the antenna array. The performance characteristics and detection curves are modeled and compared for different useful-signal parameters and for different numbers of antenna-array elements. Under some additional conditions, the results of this research can be applied to digital communication systems.
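To show how detection curves of this kind are typically produced, here is a hedged Monte Carlo sketch in Python (element count, SNR, and trial count are invented) for the deterministic Barker-13 case with incoherent combining across elements:

```python
# Correlate a known Barker-13 code against noisy array snapshots and
# report Pd at thresholds set for target Pfa values.
import numpy as np

rng = np.random.default_rng(1)
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
n_elem, snr_db, trials = 4, -6.0, 20000
amp = 10 ** (snr_db / 20)

def statistic(signal_present: bool) -> np.ndarray:
    noise = rng.normal(size=(trials, n_elem, len(barker13)))
    x = noise + (amp * barker13 if signal_present else 0.0)
    # Incoherent combining: correlate per element, sum the magnitudes.
    return np.abs(x @ barker13).sum(axis=1)

t0, t1 = statistic(False), statistic(True)
for thr in np.quantile(t0, [0.9, 0.99, 0.999]):  # Pfa = 0.1, 0.01, 0.001
    print(f"Pfa={np.mean(t0 > thr):.3f}  Pd={np.mean(t1 > thr):.3f}")
```

Sweeping `n_elem` in this sketch reproduces the qualitative effect studied in the paper: more array elements pull the detection curve toward higher Pd at a fixed Pfa.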

Keywords: antenna array, detection curves, performance characteristics, quadrature processing, signal detection

Procedia PDF Downloads 395
2838 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria

Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe

Abstract:

Data has gone from just rows and columns to being an infrastructure itself. The traditional medium of data infrastructure has been managed by individuals in different industries and saved on personal work tools, one such tool being the laptop. This hinders data sharing and Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. However, there has been a constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. The paper employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal which is a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to have access to datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal encompasses open-source technologies such as a Postgres database, GeoServer, GeoNetwork, and CKAN. These tools made the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, a wide cross-section of 8,192 user accounts had been created, 2,262 datasets had been downloaded, and 817 maps had been created on the platform. This paper shows the use of rapid development and adoption of technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit on new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, this paper reveals the importance of cross-sectional data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
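As an example of how such a CKAN-based portal can be queried programmatically (the portal URL below is a placeholder, not eHealth Africa's actual endpoint), here is a small sketch against CKAN's standard action API:

```python
# Query a CKAN instance's package_search action and list matching datasets.
import requests

PORTAL = "https://data.example.org"  # hypothetical CKAN instance

def search_datasets(query: str, rows: int = 5) -> list[dict]:
    resp = requests.get(
        f"{PORTAL}/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    result = resp.json()["result"]
    return [{"name": d["name"], "resources": len(d.get("resources", []))}
            for d in result["results"]]

for ds in search_datasets("settlements"):
    print(ds)
```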

Keywords: data portal, data infrastructure, open source, sustainability

Procedia PDF Downloads 78
2837 Grid Pattern Recognition and Suppression in Computed Radiographic Images

Authors: Igor Belykh

Abstract:

Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. Those artifacts may be visible or may cause a Moiré effect when a digital image is resized on a diagnostic monitor. In this paper, we propose an automated grid-artifact detection and suppression algorithm for what is still an open problem. Grid-artifact detection is based on a statistical approach in the spatial domain. Grid-artifact suppression is based on designing a Kaiser band-stop filter transfer function and applying it in a way that avoids ringing artifacts. Experimental results are discussed, and conclusions are drawn with a description of the advantages over existing approaches.
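For orientation, here is a minimal SciPy sketch of the suppression step (the grid frequency, transition width, and attenuation below are assumptions, not the paper's values): a Kaiser-window band-stop FIR applied along the image axis perpendicular to the grid lines.

```python
# Design a Kaiser band-stop FIR around the grid-line frequency and apply
# it column-wise; filtfilt gives zero phase, which avoids shifting edges.
import numpy as np
from scipy.signal import kaiserord, firwin, filtfilt

fs = 1.0                       # cycles/pixel sampling rate
grid_freq = 0.35               # normalized grid-line frequency (assumed)
width, ripple_db = 0.04, 60.0  # transition width and stop-band attenuation

ntaps, beta = kaiserord(ripple_db, width / (0.5 * fs))
if ntaps % 2 == 0:
    ntaps += 1                 # a band-stop FIR needs an odd tap count
taps = firwin(ntaps,
              [grid_freq - 0.03, grid_freq + 0.03],
              window=("kaiser", beta), fs=fs)  # pass_zero=True -> band-stop

def suppress_grid(image: np.ndarray) -> np.ndarray:
    return filtfilt(taps, [1.0], image, axis=0)
```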

Keywords: grid, computed radiography, pattern recognition, image processing, filtering

Procedia PDF Downloads 262
2836 Fermentation of Tolypocladium inflatum to Produce Cyclosporin in Dairy Waste Culture Medium

Authors: Fereshteh Falah, Alireza Vasiee, Farideh Tabatabaei-Yazdi

Abstract:

In this research, we investigated the use of dairy sludge as a culture medium in the fermentation process for cyclosporin production. This bioactive compound is a metabolite produced by Tolypocladium inflatum. Results showed that about 200 ppm of cyclosporin can be produced in this fermentation. In order to have a proper and specific function, CyA must be free of any impurities, so purification is required. In this downstream processing, we used chromatographic extraction and evaluated the pharmacological activities of CyA. Results showed that the obtained metabolite has very high activity against Aspergillus niger (25 mm clear zone). This cyclosporin was isolated for use as an antibiotic. The current research shows that this drug is vital and commercially very important.

Keywords: fermentation, cyclosporin A, Tolypocladium inflatum, TLC

Procedia PDF Downloads 106
2835 The Composition and Activity of Germinated Broccoli Seeds and Their Extract

Authors: Boris Nemzer, Tania Reyes-Izquierdo, Zbigniew Pietrzkowski

Abstract:

Glucosinolates are a family of glucosides found in the brassica family of vegetables. Upon damage to the plant, glucosinolates are broken down by an internal enzyme, myrosinase (thioglucosidase; EC 3.2.3.1), into isothiocyanates such as sulforaphane. Sulforaphane is formed when myrosinase cleaves the sugar off glucoraphanin and the product rearranges. Sulforaphane nitrile is formed in the same reaction as sulforaphane when the epithiospecifier protein (ESP) is active. Most common food processing procedures break the plant and mix glucoraphanin and myrosinase together, and the sulforaphane formed is further degraded. The purpose of this study is to understand the glucoraphanin/sulforaphane content and myrosinase activity of broccoli seeds germinated for different times, and the technological processing conditions that keep the enzyme active to form sulforaphane. Broccoli seeds were germinated in house. Myrosinase activity was tested as glucose content using a glucose assay kit and measured with a UV-Vis spectrophotometer. Glucosinolates were measured by HPLC/DAD. Sulforaphane was measured using HPLC-DAD and GC/MS. The 6-hour germinated sprouts have a myrosinase activity of 32.2 mg glucose/g, which is comparable with the 12- and 24-hour germinated seeds and higher than the dry seeds. The glucoraphanin content in the 6-hour germinated sprouts is 13,935 µg/g, which is comparable to the 24-hour germinated seeds and lower than the dry seeds. GC/MS results show that the amount of sulforaphane is higher than the amount of sulforaphane nitrile in the seeds and in the 6-hour and 24-hour germinated seeds. The ratio of sulforaphane to sulforaphane nitrile is high in the 6-hour germinated seeds, which indicates inactivated ESP in the reaction. After evaluating the results, short-time germinated seeds can be used as the source of glucoraphanin and myrosinase to form a potentially higher sulforaphane content. Broccoli contains the glucosinolate glucoraphanin (4-methylsulfinylbutyl glucosinolate), an important metabolite with health-promoting effects. In a pilot clinical study, we observed the effects of a glucosinolate/glucoraphanin-rich extract from short-time germinated broccoli seeds on blood adenosine triphosphate (ATP), reactive oxygen species (ROS), and lactate levels. A single dose of 50 mg of broccoli sprout extract increased blood levels of ATP by up to 61% (p=0.0092) during the first 2 hours after ingestion. Interestingly, this effect was not associated with an increase in blood ROS or lactate. When compared to the placebo group, levels of lactate were reduced by 10% (p=0.006). These results indicate that germinated broccoli seed extract may positively affect the generation of ATP in humans. Due to the preliminary nature of this work and the promising results, larger clinical trials are justified.

Keywords: broccoli glucosinolates, glucoraphanin, germinated seeds, myrosinase, adenosine triphosphate

Procedia PDF Downloads 277
2834 Structured-Ness and Contextual Retrieval Underlie Language Comprehension

Authors: Yao-Ying Lai, Maria Pinango, Ashwini Deo

Abstract:

While grammatical devices are essential to language processing, how comprehension utilizes cognitive mechanisms is less emphasized. This study addresses the issue by probing the complement coercion phenomenon: an entity-denoting complement following verbs like begin and finish receives an eventive interpretation. For example, (1) “The queen began the book” receives an agentive reading like (2) “The queen began [reading/writing/etc.…] the book.” Such sentences engender additional processing cost in real-time comprehension. The traditional account attributes this cost to an operation that coerces the entity-denoting complement to an event, assuming that these verbs require eventive complements. However, on closer examination, examples like “Chapter 1 began the book” undermine this assumption. An alternative, the Structured Individual (SI) hypothesis, proposes that the complement following aspectual verbs (AspV; e.g. begin, finish) is conceptualized as a structured individual, construed as an axis along various dimensions (e.g. spatial, eventive, temporal, informational). The composition of an animate subject and an AspV, as in (1), engenders an ambiguity between an agentive reading along the eventive dimension like (2), and a constitutive reading along the informational/spatial dimension like (3) “[The story of the queen] began the book,” in which the subject is interpreted as a subpart of the complement denotation. Comprehenders need to resolve the ambiguity by searching contextual information, resulting in additional cost. To evaluate the SI hypothesis, a questionnaire was employed. Method: Target AspV sentences such as “Shakespeare began the volume.” were preceded by one of the following types of context sentence: (A) Agentive-biasing, in which an event was mentioned (…writers often read…), (C) Constitutive-biasing, in which a constitutive meaning was hinted at (Larry owns collections of Renaissance literature.), (N) Neutral context, which allowed both interpretations. Thirty-nine native speakers of English were asked to (i) rate each context-target sentence pair on a 1~5 scale (5 = fully understandable), and (ii) choose possible interpretations for the target sentence given the context. The SI hypothesis predicts that comprehension is harder in the Neutral condition than in the biasing conditions, because no contextual information is provided to resolve the ambiguity. It also predicts that comprehenders obtain the specific interpretation corresponding to the context type. Results: The (A) Agentive-biasing and (C) Constitutive-biasing conditions were rated higher than the (N) Neutral condition (p< .001), while all conditions were within the acceptable range (> 3.5 on the 1~5 scale). This suggests that when relevant contextual information is lacking, semantic ambiguity decreases comprehensibility. The interpretation task shows that the participants selected the biased agentive/constitutive reading for conditions (A) and (C), respectively. For the Neutral condition, the agentive and constitutive readings were chosen equally often. Conclusion: These findings support the SI hypothesis: the meaning of AspV sentences is conceptualized as a parthood relation involving structured individuals. We argue that semantic representation makes reference to spatial structured-ness (an abstracted axis). To obtain an appropriate interpretation, comprehenders utilize contextual information to enrich the conceptual representation of the sentence in question. This study connects semantic structure to humans' conceptual structure, and provides a processing model that incorporates contextual retrieval.

Keywords: ambiguity resolution, contextual retrieval, spatial structured-ness, structured individual

Procedia PDF Downloads 315
2833 Comparison of Processing Conditions for Plasticized PVC and PVB

Authors: Michael Tupý, Jaroslav Císař, Pavel Mokrejš, Dagmar Měřínská, Alice Tesaříková-Svobodová

Abstract:

A worldwide problem is that recycled PVB is widely stored in landfills. However, PVB has chemical properties very similar to those of PVC. Moreover, both of them are used in plasticized form. Thus, the thermal properties of plasticized PVC obtained from primary production and of PVB obtained by recycling windshields are compared, in order to find the degradation conditions and to decide whether a PVB/PVC blend can be processed. The tested PVC contained 38% of the plasticizer diisononyl phthalate (DINP), and the PVB was plasticized with 28% of triethylene glycol bis(2-ethylhexanoate) (3GO). The thermal and thermo-oxidative decomposition of both vinyl polymers are compared by DSC and OOT analyses. A tensile strength analysis is added.

Keywords: polyvinyl chloride, polyvinyl butyral, recycling, reprocessing, thermal analysis, decomposition

Procedia PDF Downloads 494
2832 Critically Sampled Hybrid Trigonometry Generalized Discrete Fourier Transform for Multistandard Receiver Platform

Authors: Temidayo Otunniyi

Abstract:

This paper presents a low-computation channelization algorithm for multi-standard platforms using a polyphase implementation of a critically sampled hybrid trigonometry generalized discrete Fourier transform (HGDFT). The HGDFT channelization algorithm exploits the orthogonality of two trigonometric Fourier functions, together with the properties of the quadrature mirror filter bank (QMFB) and the exponentially modulated filter bank (EMFB), respectively. HGDFT shows improvement in its implementation in terms of high reconfigurability, lower filter length, parallelism, and moderate computational load. Type-I and type-III polyphase structures are derived for real-valued HGDFT modulation. The design specifications are critically decimated and over-sampled for both single- and multi-standard receiver platforms. Evaluating the performance of over-sampled single-standard receiver channels, the HGDFT algorithm achieved a 40% complexity reduction, compared to 34% and 38% reductions for the discrete Fourier transform (DFT) and tree quadrature mirror filter (TQMF) algorithms. The parallel generalized discrete Fourier transform (PGDFT) and recombined generalized discrete Fourier transform (RGDFT) had a 41% complexity reduction, and HGDFT had a 46% reduction, in over-sampled multi-standard mode. In the critically sampled multi-standard receiver channels, HGDFT had a complexity reduction of 70%, while both PGDFT and RGDFT had a 34% reduction.
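For orientation, here is a sketch of the standard critically sampled polyphase DFT channelizer that transforms like HGDFT build on (this is the plain DFT filter bank, not the hybrid transform itself; the prototype length and channel count are arbitrary):

```python
# Polyphase DFT channelizer: split a lowpass prototype into polyphase
# branches, filter the commutated input, and take a DFT across branches.
import numpy as np
from scipy.signal import firwin

def polyphase_channelize(x: np.ndarray, n_ch: int, taps_per_branch: int = 8):
    proto = firwin(n_ch * taps_per_branch, 1.0 / n_ch)   # lowpass prototype
    branches = proto.reshape(taps_per_branch, n_ch)      # column m = proto[m::n_ch]
    n_blocks = len(x) // n_ch
    xb = x[:n_blocks * n_ch].reshape(n_blocks, n_ch)[:, ::-1]  # input commutator
    y = np.zeros((n_blocks, n_ch))
    for m in range(n_ch):                                # filter each branch
        y[:, m] = np.convolve(xb[:, m], branches[:, m])[:n_blocks]
    return np.fft.fft(y, axis=1)   # one column per channel, at rate fs/n_ch

# A real tone centered on channel 3 of 8 concentrates energy in bins 3 and 5
# (the +/- frequency pair of a real signal).
fs, n_ch = 8000, 8
t = np.arange(4096) / fs
x = np.cos(2 * np.pi * (3 * fs / n_ch) * t)
mags = np.abs(polyphase_channelize(x, n_ch)).mean(axis=0)
print(np.round(mags / mags.max(), 2))
```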

Keywords: software defined radio, channelization, critical sample rate, over-sample rate

Procedia PDF Downloads 113
2831 Implementation of Iterative Algorithm for Earthquake Location

Authors: Hussain K. Chaiel

Abstract:

Developments in the field of digital signal processing (DSP) and microelectronics technology reduce the complexity of iterative algorithms that need a large number of arithmetic operations. Virtex Field Programmable Gate Arrays (FPGAs) are programmable silicon platforms which offer an important solution for addressing the needs of high-performance DSP designers. In this work, Virtex-7 FPGA technology is used to implement an iterative algorithm to estimate the earthquake location. Simulation results show that an implementation based on the RAMB36E1 block RAM and DSP48E1 slices of the Virtex-7 reduces the number of clock cycles required. This enables the algorithm to be used for earthquake prediction.
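The abstract does not spell out the iterative algorithm, so as a hedged illustration, here is the classic Geiger-style Gauss-Newton update for locating an epicenter from arrival times at known stations (2D, constant velocity, synthetic data) — the kind of arithmetic-heavy loop an FPGA implementation accelerates:

```python
# Iterative least-squares earthquake location (Geiger's method, 2D toy).
import numpy as np

v = 6.0                                      # km/s, assumed P-wave velocity
stations = np.array([[0, 0], [50, 0], [0, 50], [40, 45]], float)
true_src, true_t0 = np.array([21.0, 13.0]), 2.0
t_obs = true_t0 + np.linalg.norm(stations - true_src, axis=1) / v

x, t0 = np.array([10.0, 10.0]), 0.0          # initial guess
for _ in range(10):                          # Gauss-Newton iterations
    d = np.linalg.norm(stations - x, axis=1)
    resid = t_obs - (t0 + d / v)
    # Jacobian of predicted travel time w.r.t. (x, y, t0).
    J = np.column_stack([(x - stations) / (v * d[:, None]), np.ones(len(d))])
    dx, dy, dt0 = np.linalg.lstsq(J, resid, rcond=None)[0]
    x += [dx, dy]
    t0 += dt0
print(np.round(x, 3), round(t0, 3))          # converges to (21, 13), 2.0
```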

Keywords: DSP, earthquake, FPGA, iterative algorithm

Procedia PDF Downloads 372
2830 Self-Efficacy Perceptions of Pre-Service Art and Music Teachers towards the Use of Information and Communication Technologies

Authors: Agah Tugrul Korucu

Abstract:

Information and communication technologies have become an important part of our daily lives, with significant investments in technology in the 21st century. Individuals are more willing to design and implement computer-related activities, and computer self-efficacy is a main determinant of how successfully they carry out such activities as information technology spreads. The self-efficacy level is a significant factor which determines how individuals act in events, situations, and difficult processes. It is observed that individuals with a higher self-efficacy perception of computers overcome the problems they encounter in computer use more easily. Therefore, this study aimed to examine the self-efficacy perceptions of pre-service art and music teachers towards the use of information and communication technologies in terms of different variables. The research group consists of 60 pre-service teachers who are studying at Necmettin Erbakan University, Ahmet Keleşoğlu Faculty of Education, in the Art and Music department. As data collection tools of the study, a “personal information form” developed by the researcher and used to collect demographic data, and “the perception scale related to self-efficacy of informational technology,” are used. The scale is a 5-point Likert-type scale consisting of 27 items. The Kaiser-Meyer-Olkin (KMO) sampling adequacy value is found to be 0.959. The Cronbach alpha reliability coefficient of the scale is found to be 0.97. A computer-based statistical software package (SPSS 21.0) is used in order to analyze the data collected; descriptive statistics, the t-test, and analysis of variance are used as statistical techniques.

Keywords: self-efficacy perceptions, teacher candidate, information and communication technologies, art teacher

Procedia PDF Downloads 311
2829 Oleic Acid Enhances Hippocampal Synaptic Efficacy

Authors: Rema Vazhappilly, Tapas Das

Abstract:

Oleic acid is a cis unsaturated fatty acid and is known to be a partially essential fatty acid due to its limited endogenous synthesis during pregnancy and lactation. Previous studies have demonstrated the role of oleic acid in neuronal differentiation and brain phospholipid synthesis. This evidence indicates a major role for oleic acid in learning and memory. Interestingly, oleic acid has been shown to enhance hippocampal long-term potentiation (LTP), the physiological correlate of long-term synaptic plasticity. However, the effect of oleic acid on short-term synaptic plasticity has not been investigated. Short-term potentiation (STP) is the physiological correlate of short-term synaptic plasticity, which is the key underlying molecular mechanism of short-term memory and neuronal information processing. STP in the hippocampal CA1 region has been known to require the activation of N-methyl-D-aspartate receptors (NMDARs). NMDAR-dependent hippocampal STP as a potential mechanism for short-term memory has been a subject of intense interest over the past few years. Therefore, in the present study, the effect of oleic acid on NMDAR-dependent hippocampal STP was determined in mouse hippocampal slices (in vitro) using a multi-electrode array system. STP was induced by weak tetanic stimulation (one train of 100 Hz stimulation for 0.1 s) of the Schaffer collaterals of the CA1 region of the hippocampus, in slices treated with different concentrations of oleic acid in the presence or absence of the NMDAR antagonist D-AP5 (30 µM). Oleic acid at 20 µM (mean increase in fEPSP amplitude = ~135% vs. control = 100%; P<0.001) and 30 µM (mean increase in fEPSP amplitude = ~280% vs. control = 100%; P<0.001) significantly enhanced the STP following weak tetanic stimulation. Lower oleic acid concentrations (10 µM) did not modify the hippocampal STP induced by weak tetanic stimulation. The hippocampal STP induced by weak tetanic stimulation was completely blocked by the NMDA receptor antagonist D-AP5 (30 µM) in both oleic acid-treated and control hippocampal slices. This leads to the conclusion that the hippocampal STP elicited by weak tetanic stimulation, and enhanced by oleic acid, is NMDAR dependent. Together, these findings suggest that oleic acid may enhance short-term memory and neuronal information processing through the modulation of NMDAR-dependent hippocampal short-term synaptic plasticity. In conclusion, this study suggests a possible role for oleic acid in preventing short-term memory loss and impaired neuronal function throughout development.

Keywords: oleic acid, short-term potentiation, memory, field excitatory post synaptic potentials, NMDA receptor

Procedia PDF Downloads 319
2828 Human Rights to Environment: The Constitutional and Judicial Perspective in India

Authors: Varinder Singh

Abstract:

Primitive man knew nothing of human rights. In the later centuries of human progress, with the development of scientific and technological knowledge, the growth of population, and the tremendous changes in the human environment, the laws of nature that maintained the eco-balance crumbled. The race for a better and more comfortable life landed mankind in a vicious circle. It created environmental imbalance, unplanned and uneven development, breakdown of the self-sustaining village economy, mushrooming of shanty towns and slums, a widening chasm between the rich and the poor, over-exploitation of natural resources, desertification of arable lands, pollution of different kinds, heating up of the earth, and depletion of the ozone layer. Modern international life has been deeply marked and transformed by current endeavors to meet the needs and fulfill the requirements of the protection of the human person and of the environment. Such endeavors have been encouraged by the widespread recognition that the protection of human beings and the environment reflects common superior values and constitutes a common concern of mankind. The parallel evolutions of human rights protection and environmental protection disclose some close affinities. Both human rights protection and environmental protection underwent a process of internationalization, the former beginning with the 1948 Universal Declaration of Human Rights, the latter with the 1972 Stockholm Declaration on the Human Environment. It is now well established that it is the basic human right of every individual to live in a pollution-free environment with full human dignity. The judiciary has so far pronounced a number of judgments in this regard. The Supreme Court, in view of the various laws relating to environmental protection and the constitutional provisions, has held that the right to a pollution-free environment is part of the fundamental rights. Article 21 is the heart of the fundamental rights and has received an expanded meaning from time to time.

Keywords: human rights, law, environment, polluter

Procedia PDF Downloads 210
2827 Aseismic Stiffening of Architectural Buildings as Preventive Restoration Using Unconventional Materials

Authors: Jefto Terzovic, Ana Kontic, Isidora Ilic

Abstract:

In the proposed design concept, laminated glass and laminated plexiglass, as “unconventional materials,” are considered as a filling in a steel frame, on which they overlap via an intermediate rubber layer, thereby forming a composite assembly. In this way, vertical stiffening elements are formed, capable of receiving seismic force and integrated into the structural system of the building. The applicability of such a system was verified by experiments in laboratory conditions, where experimental models based on laminated glass and laminated plexiglass were exposed to cyclic loads that simulate the seismic force. In this way, the load capacity of the composite assemblies was tested for the effects of dynamic load parallel to the assembly plane. Thus, the stress intensity to which the composite systems might be exposed was determined, as well as the range of structural stiffening with respect to the observed deformation, along with the advantages of one type of filling compared to the other. Using specialized software based on the finite element method, a computer model of the structure was created and processed in the case study; the same computer model was used for analyzing the problem in the first phase of the design process. The stiffening system based on the composite assemblies tested in laboratories is implemented in the computer model. The results of the modal analysis and the seismic calculation from the computer model with stiffeners applied showed the efficacy of such a solution, thus rounding out the design procedure for aseismic stiffening with unconventional materials.

Keywords: laminated glass, laminated plexiglass, aseismic stiffening, experiment, laboratory testing, computer model, finite element method

Procedia PDF Downloads 65
2826 Regulatory and Economic Challenges of AI Integration in Cyber Insurance

Authors: Shreyas Kumar, Mili Shangari

Abstract:

Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.

Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware

Procedia PDF Downloads 14
2825 Incidence of Fungal Infections and Mycotoxicosis in Pork Meat and Pork By-Products in Egyptian Markets

Authors: Ashraf Samir Hakim, Randa Mohamed Alarousy

Abstract:

The consumption of food contaminated with molds (microscopic filamentous fungi) and their toxic metabolites results in the development of food-borne mycotoxicosis. The spores of molds are ubiquitously spread in the environment and can be detected everywhere. Ochratoxin A is a potentially carcinogenic fungal toxin found in a variety of food commodities; it is not only the most abundant, and hence the most commonly detected, member of its group, but also the most toxic one. Very limited research concerning foods of porcine origin in Egypt is available, in spite of the presence of a considerable swine population and consumer base. In this study, the quality of various ready-to-eat local and imported pork meat and meat by-products sold in Egyptian markets, as well as edible organs such as liver and kidney, was assessed for the presence of various molds and their toxins as raw materials. Mycological analysis was conducted on n=110 samples, which included pig livers (n=10) and kidneys (n=10) from the Basateen slaughterhouse, and local (n=70) and imported (n=20) processed pork meat by-products. The isolates were identified using traditional mycological and biochemical tests, while ochratoxin A levels were quantitatively analyzed using high-performance liquid chromatography. The results of the conventional mycological tests for detecting the presence of fungal growth (yeasts or molds) were negative, while the mycotoxin concentrations were greatly above the permissible limits, or “tolerable weekly intake” (TWI), of ochratoxin A established by EFSA in 2006 in the local pork and pork by-products, whereas the imported samples showed only a slight increase. Since ochratoxin A is stable and generally resistant to heat and processing, control of ochratoxin A contamination lies in the control of the growth of the toxin-producing fungi. Effective prevention of ochratoxin A contamination therefore depends on good farming and agricultural practices. Good Agricultural Practices (GAP), including methods to reduce fungal infection and growth during harvest, storage, transport, and processing, provide the primary line of defense against contamination with ochratoxin A. To the best of our knowledge, this is the first report of a mycological assessment, especially of mycotoxins, in pork by-products in Egypt.

Keywords: Egyptian markets, mycotoxicosis, ochratoxin A, pork meat, pork by-products

Procedia PDF Downloads 451
2824 The Image Redefinition of Urban Destinations: The Case of Madrid and Barcelona

Authors: Montserrat Crespi Vallbona, Marta Domínguez Pérez

Abstract:

Globalization impacts cities and especially their centers, above all those spaces that are most visible and coveted. The changes involve processes such as touristification, gentrification, and studentification, in addition to shop trendiness. The city becomes a good of interchange rather than a communal good for its inhabitants, and consequently its value is monetized. Thus, these different tendencies are analyzed: first, the presence of tourists, the increase in home rentals, and the explosion of businesses related to tourism; second, the return of the middle classes or gentries to the center in a socio-spatial model that has changed, highlighting the centers for their culture and their opportunities as well as for the value of public space and centrality; third, the interest of students (national and international) in being part of these city centers as dynamic groups and emerging classes with higher purchasing power and better cultural capital than in the past; and finally, the conversion of old stores into modern ones, where the vintage trend and the renewal of antiquity are the essence. All these transforming processes impact European cities and redefine their image. These tendencies reinforce the impression and brand of the urban center as an attractive space for investment, and they have been spreading correlatively, impacting the centers and transforming them, involving the displacement of the former residents of these spaces and revitalizing a center that is financed and commercialized in parallel. The cases of Madrid and Barcelona, the spaces in Spain where these tendencies are most evident, serve to illustrate these processes and represent their spearhead. Useful recommendations are presented for urban planners on reconciling communal and commercialized spaces.

Keywords: gentrification, shop trendiness, studentification, touristification

Procedia PDF Downloads 151
2823 Impact of a Novel Technique of S-Shaped Tracheostoma in Pediatric Tracheostomy in Intensive Care Unit on Success and Procedure Related Complications

Authors: Devendra Gupta, Sushilk K. Agarwal, Amit Kesari, P. K. Singh

Abstract:

Objectives: Pediatric patients often experience persistent respiratory failure that requires tracheostomy placement in the pediatric ICU. We have designed a tracheostomy technique for pediatric patients with an S-shaped incision on the tracheal wall, with a higher success rate and a lower complication rate. Technique: Following general anesthesia and positioning of the patient, the trachea was exposed in the midline by a vertical skin incision. In order to make the S-shaped tracheostoma, the second tracheal ring was identified. The conventional vertical incision was made in the second tracheal ring and then extended at both of its ends laterally, in the inter-cartilaginous space parallel to the tracheal cartilage, in opposite directions to make the incision S-shaped. The trachea was dilated with a tracheal dilator, and a tracheostomy tube of appropriate size was then placed into the trachea. Results: S-shaped tracheostomy was performed in 20 children requiring tracheostomy placement, with a mean age of 6.25 years (age range 2-7). The tracheostomy tubes were successfully placed in all patients on a single attempt. There was no incidence of significant intra-operative bleeding, subcutaneous emphysema, vocal cord palsy, or pneumothorax. Two patients developed pneumonia and expired within a year. However, there was no incidence of tracheo-esophageal fistula, suprastomal collapse, or difficulty in decannulation at one year of follow-up related to our technique. One patient developed late tracheitis, which was managed conservatively. Conclusion: S-shaped tracheoplasty was associated with a high success rate and a reduced risk of early and late complications in pediatric patients requiring tracheostomy.

Keywords: pediatrics, tracheostomy, ICU, tracheostoma

Procedia PDF Downloads 256
2822 Denoising of Magnetotelluric Signals by Filtering

Authors: Rodrigo Montufar-Chaveznava, Fernando Brambila-Paz, Ivette Caldelas

Abstract:

In this paper, we present advances in the denoising of magnetotelluric signals using several filters. In particular, we use the most common spatial-domain filters, such as the median and mean filters, but we also use the Fourier and wavelet transforms for frequency-domain filtering. We employ three datasets obtained at different sampling rates (128, 4096, and 8192 bps) and evaluate the mean square error, signal-to-noise ratio, and peak signal-to-noise ratio to compare the kernels and determine the most suitable one for each case. The magnetotelluric signals correspond to earth exploration surveys for water. The objective is to find a denoising strategy different from the one included in the commercial equipment employed for this task.
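A compact sketch (synthetic signal; kernel size and threshold choices are invented) of the kind of comparison described, scoring a median filter against wavelet soft-thresholding by mean square error:

```python
# Compare spatial-domain (median) and wavelet-domain (soft-threshold)
# denoising on a noisy synthetic trace.
import numpy as np
import pywt
from scipy.signal import medfilt

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 4096)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 23 * t)
noisy = clean + rng.normal(scale=0.3, size=t.size)

median_out = medfilt(noisy, kernel_size=9)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))         # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                        for c in coeffs[1:]]
wavelet_out = pywt.waverec(coeffs, "db4")[: noisy.size]

for name, y in [("median", median_out), ("wavelet", wavelet_out)]:
    print(name, "MSE =", round(float(np.mean((y - clean) ** 2)), 5))
```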

Keywords: denoising, filtering, magnetotelluric signals, wavelet transform

Procedia PDF Downloads 349
2821 DNA Damage and Apoptosis Induced in Drosophila melanogaster Exposed to Different Duration of 2400 MHz Radio Frequency-Electromagnetic Fields Radiation

Authors: Neha Singh, Anuj Ranjan, Tanu Jindal

Abstract:

Over the last decade, the exponential growth of mobile communication has been accompanied by a parallel increase in the density of electromagnetic fields (EMF). The continued expansion of mobile phone usage raises important questions, as EMF, especially radio frequency (RF), have long been suspected of having biological effects. In the present experiments, we studied the effects of RF-EMF on cell death (apoptosis) and DNA damage in a well-tested biological model, Drosophila melanogaster, exposed to a 2400 MHz frequency for different durations (2, 4, 6, 8, 10, and 12 hours each day) for five continuous days under ambient temperature and humidity conditions inside an exposure chamber. The flies were grouped into control, sham-exposed, and exposed groups, with 100 flies in each group. In this study, well-known techniques like the Comet assay and the TUNEL (Terminal deoxynucleotidyl transferase dUTP Nick End Labeling) assay were used to detect DNA damage and apoptosis, respectively. The experimental results showed DNA damage in the brain cells of Drosophila that increases with the duration of exposure when the results of the control, sham-exposed, and exposed groups are compared, which indicates that EMF radiation induced stress in the organism that leads to DNA damage and cell death. The processes of apoptosis and mutation follow similar pathways in all eukaryotic cells; therefore, studying apoptosis and genotoxicity in Drosophila has similar relevance for human beings as well.

Keywords: cell death, apoptosis, Comet Assay, DNA damage, Drosophila, electromagnetic fields, EMF, radio frequency, RF, TUNEL assay

Procedia PDF Downloads 143
2820 Central African Republic Government Recruitment Agency Based on Identity Management and Public Key Encryption

Authors: Koyangbo Guere Monguia Michel Alex Emmanuel

Abstract:

In e-government, and especially in recruitment, much research has been conducted to build a trustworthy and reliable online or application system capable of processing users' or job applicants' files. In this research (Government Recruitment Agency), cloud computing, identity management, and public key encryption have been used to manage domains, implement the access control authorization mechanism, and secure data exchange between entities for a reliable file-processing procedure.
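A minimal sketch of the public-key step using Python's cryptography package (the payload and key handling are illustrative, not the system's actual design; real systems typically encrypt a symmetric key this way rather than the file itself):

```python
# The agency publishes an RSA public key; an applicant's payload is
# encrypted so only the agency's private key can decrypt it.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Agency side: generate a key pair; the public key is distributed.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Applicant side: encrypt a short payload with the public key.
ciphertext = public_key.encrypt(b"applicant-file-digest", oaep)

# Agency side: only the private key holder can decrypt.
print(private_key.decrypt(ciphertext, oaep))  # b'applicant-file-digest'
```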

Keywords: cloud computing network, identity management systems, public key encryption, access control and authorization

Procedia PDF Downloads 343
2819 Design and Implementation of an Image Based System to Enhance the Security of ATM

Authors: Seyed Nima Tayarani Bathaie

Abstract:

In this paper, an image-receiving system was designed and implemented through the optimization of object detection algorithms using Haar features. The optimized algorithm performed face detection and eye detection separately; cascading them then yielded a clear image of the user. Utilizing this feature brought about higher security by preventing fraud, since services are given to the user only on condition that a clear image of his face has already been captured, which excludes inappropriate persons. In order to expedite processing and eliminate unnecessary computation, the input image was compressed, a motion detection function was included in the program, and the detection window size was confined.
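A short OpenCV sketch of the cascaded detection described (the Haar cascade files ship with OpenCV; the size and neighbor thresholds below are assumptions): a face is found first, then eyes are required inside it before the frame is accepted.

```python
# Cascade face detection, then eye detection restricted to the face region.
import cv2

face_cc = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cc = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def clear_user_image(frame) -> bool:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Confining minSize bounds the detection window and speeds things up.
    faces = face_cc.detectMultiScale(gray, scaleFactor=1.1,
                                     minNeighbors=5, minSize=(80, 80))
    for (x, y, w, h) in faces:
        eyes = eye_cc.detectMultiScale(gray[y:y + h, x:x + w])
        if len(eyes) >= 2:        # both eyes visible -> usable image
            return True
    return False
```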

Keywords: face detection algorithm, Haar features, security of ATM

Procedia PDF Downloads 399
2818 3D-printing for Ablation Planning in Patients Undergoing Atrial Fibrillation Ablation: 3D-GALA Trial

Authors: Terentes Printzios Dimitrios, Loanna Gourgouli, Vlachopoulos Charalambos

Abstract:

Aims: Atrial fibrillation (AF) remains one of the major causes of stroke, heart failure, sudden death, and cardiovascular morbidity. Ablation techniques are becoming more appealing after the latest results of randomized trials showing an overall clinical benefit. On the other hand, imaging techniques and the frontier application of 3D printing are emerging as a valuable ally for cardiac procedures. However, no randomized trial has directly assessed the impact of preprocedural imaging, and especially 3D printing guidance, on AF ablation. The present study is designed to investigate for the first time the effect of 3D printing of the heart on the safety and effectiveness of the ablation procedure. Methods and design: The 3D-GALA trial is a randomized, open-label, controlled, multicentre clinical trial of 2 parallel groups, designed to enroll a total of 100 patients undergoing cryo-balloon ablation for paroxysmal and persistent AF. Patients will be randomized with an allocation ratio of 1:1 to a preprocedural MRI scan of the heart with 3D printing of the left atrium and pulmonary veins followed by cryoablation, versus standard cryoablation without imaging. Patients will be followed up to 6 months after the index procedure. The primary outcome measure is the reduction of radiation dose and contrast amount during pulmonary vein isolation. Secondary endpoints will include the percentage of atrial fibrillation relapse on 24-hour Holter electrocardiogram monitoring at 6 months after initial treatment. Discussion: To our knowledge, the 3D-GALA trial will be the first study to provide evidence about the clinical impact of preprocedural imaging and 3D printing before cryoablation.

Keywords: atrial fibrillation, cardiac MRI, cryoablation, 3D printing

Procedia PDF Downloads 160
2817 Grey Prediction of Atmospheric Pollutants in Shanghai Based on GM(1,1) Model Group

Authors: Diqin Qi, Jiaming Li, Siman Li

Abstract:

Based on the use of the three-point smoothing method to selectively preprocess the original data columns, this paper establishes a group of grey GM(1,1) models to predict the concentration ranges of four major air pollutants in Shanghai from 2023 to 2024. The results indicate that PM₁₀, SO₂, and NO₂ remain within the national Grade I standards, while the concentration of PM₂.₅ has decreased but still remains within the national Grade II standards. Based on the forecast results, recommendations are provided for the Shanghai municipal government's air pollution prevention and control efforts.
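A compact numpy sketch (with invented annual data, not the paper's series) of the pipeline described: three-point smoothing of the raw series, then a GM(1,1) fit of x⁰(k) = -a·z¹(k) + b by least squares, then extrapolation.

```python
# Three-point smoothing followed by a grey GM(1,1) forecast.
import numpy as np

def three_point_smooth(x: np.ndarray) -> np.ndarray:
    s = x.astype(float).copy()
    s[1:-1] = (x[:-2] + 2 * x[1:-1] + x[2:]) / 4.0   # interior points
    s[0] = (3 * x[0] + x[1]) / 4.0                   # standard endpoint rules
    s[-1] = (x[-2] + 3 * x[-1]) / 4.0
    return s

def gm11_forecast(x0: np.ndarray, steps: int) -> np.ndarray:
    x1 = np.cumsum(x0)                                # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # grey parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat)[-steps:]                   # back to yearly values

pm25 = np.array([42.0, 39.0, 36.0, 33.0, 31.0, 30.0])  # toy annual means
print(np.round(gm11_forecast(three_point_smooth(pm25), 2), 1))
```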

Keywords: atmospheric pollutant prediction, grey GM(1,1) model group, three-point smoothing method

Procedia PDF Downloads 23