Search results for: data block
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24920

24680 Measuring the Effectiveness of Response Inhibition regarding to Motor Complexity: Evidence from the Stroop Effect

Authors: Germán Gálvez-García, Marta Lavin, Javiera Peña, Javier Albayay, Claudio Bascour, Jesus Fernandez-Gomez, Alicia Pérez-Gálvez

Abstract:

We studied the effectiveness of response inhibition in movements with different degrees of motor complexity when they were executed in isolation and alternately. Sixteen participants performed the Stroop task, which was used as a measure of response inhibition. Participants responded by lifting the index finger and by reaching the screen with the same finger. Both actions were performed separately and alternately in different experimental blocks. Repeated-measures ANOVAs were used to compare reaction time, movement time, kinematic errors and movement errors across conditions (experimental block, movement, and congruency). Delta plots were constructed to perform distributional analyses of response inhibition and accuracy rate. The effectiveness of response inhibition did not differ when the movements were performed in separate blocks. Nevertheless, it did differ when they were performed alternately within the same experimental block, being more effective for the lifting action. This could be due to competition for the available resources in a more complex scenario, which also demands adopting a strategy to avoid errors.
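Delta plots of the kind described can be computed from raw reaction times by binning each condition into quantiles and plotting the congruency effect against the mean RT per bin. A minimal Python sketch with synthetic RT data follows; the function name, bin count and distributions are illustrative assumptions, not taken from the study:

```python
import numpy as np

def delta_plot_points(rt_congruent, rt_incongruent, n_bins=5):
    """Bin each condition's RTs into quantiles and return, per bin,
    the mean RT across conditions (x) and the congruency effect (y)."""
    points = []
    for q in range(n_bins):
        lo, hi = q / n_bins, (q + 1) / n_bins
        con = np.quantile(rt_congruent, [lo, hi])
        inc = np.quantile(rt_incongruent, [lo, hi])
        con_bin = rt_congruent[(rt_congruent >= con[0]) & (rt_congruent <= con[1])]
        inc_bin = rt_incongruent[(rt_incongruent >= inc[0]) & (rt_incongruent <= inc[1])]
        effect = inc_bin.mean() - con_bin.mean()  # Stroop effect in this bin
        points.append(((con_bin.mean() + inc_bin.mean()) / 2, effect))
    return points

# Synthetic RTs (ms): incongruent trials are slower on average.
rng = np.random.default_rng(0)
congruent = rng.normal(550, 80, 200)
incongruent = rng.normal(620, 100, 200)
for x, delta in delta_plot_points(congruent, incongruent):
    print(f"mean RT {x:6.1f} ms -> Stroop effect {delta:5.1f} ms")
```

A rising delta curve across bins is conventionally read as interference growing with response slowness; a flattening or falling curve suggests selective suppression building up over time.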

Keywords: response inhibition, motor complexity, Stroop task, delta plots

Procedia PDF Downloads 371
24679 Development of a Novel Antibacterial to Block Growth of Pseudomonas Aeruginosa and Prevent Biofilm Formation

Authors: Clara Franch de la Cal, Christopher J Morris, Michael McArthur

Abstract:

Cystic fibrosis (CF) is an autosomal recessive genetic disorder characterized by abnormal transport of chloride and sodium across the lung epithelium, leading to thick and viscous secretions. Within these secretions, CF patients suffer repeated bacterial pulmonary infections, with Pseudomonas aeruginosa (PA) eliciting the greatest inflammatory response and causing an irreversible loss of lung function that determines morbidity and mortality. The cell wall of PA is a permeability barrier to many antibacterials, and the rise of multi-drug resistant (MDR) strains is eroding the efficacy of the few remaining clinical options. In addition, once a PA infection becomes established, it forms an antibiotic-resistant biofilm, embedded in which are slow-growing cells that are refractory to drug treatment. This makes the development of new antibacterials a major challenge. This work describes the development of a new type of nanoparticulate oligonucleotide antibacterial capable of tackling PA infections, including MDR strains. It is being developed both to block growth and to prevent biofilm formation. These oligonucleotide therapeutics, transcription factor decoys (TFDs), act on novel genomic targets by capturing key regulatory proteins to block essential bacterial genes and defeat infection. They have been successfully transfected into a wide range of pathogenic bacteria, both in vitro and in vivo, using a proprietary delivery technology. The surfactant used self-assembles with TFD to form a nanoparticle that is stable in biological fluids, protects the TFD from degradation and preferentially transfects prokaryotic membranes. Key challenges are to adapt the nanoparticle so that it is active against PA in the context of biofilms and to formulate it for administration by inhalation. This would allow the drug to be delivered to the respiratory tract, thereby achieving drug concentrations sufficient to eradicate the pathogenic organisms at the site of infection.

Keywords: antibacterials, transcriptional factor decoys (TFDs), pseudomonas aeruginosa

Procedia PDF Downloads 260
24678 Wear Behavior of Commercial Aluminium Engine Block and Piston under Dry Sliding Condition

Authors: Md. Salim Kaiser

Abstract:

In the present work, the effect of load and sliding distance on the tribological performance of commercially used aluminium-silicon engine block and piston alloys was evaluated at ambient conditions with 80% humidity under dry sliding conditions, using a pin-on-disc apparatus with two different loads of 5 N and 20 N, yielding applied pressures of 0.30 MPa and 1.4 MPa, respectively, at a sliding velocity of 0.29 m/s and with sliding distances ranging from 260 m to 4200 m. Factors and conditions that had a significant effect were identified. The results showed that load and sliding distance affect the wear rate of the alloys, and the wear rate increased with increasing load for both alloys. The wear rate also increases almost linearly at low loads, rising to a maximum and then attaining a plateau with increasing sliding distance. For both applied loads, the piston alloy showed better performance due to its higher Ni and Mg content. The worn surfaces and wear debris were characterized by optical microscopy, SEM and an EDX analyzer. The worn surface showed shallow grooves at low loads, while the groove width and depth increased as the load increased. Oxidative wear was found to be the predominant mechanism in the dry sliding of Al-Si alloys at low loads.

Keywords: wear, friction, gravimetric analysis, aluminium-silicon alloys, SEM, EDX

Procedia PDF Downloads 231
24677 On Dynamic Chaotic S-BOX Based Advanced Encryption Standard Algorithm for Image Encryption

Authors: Ajish Sreedharan

Abstract:

Security in the transmission and storage of digital images is important in today’s image communications and confidential video conferencing. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. The Advanced Encryption Standard (AES) is a well-known block cipher that has several advantages in data encryption. However, it is not suitable for real-time applications. This paper presents modifications to the Advanced Encryption Standard to provide a high level of security and better image encryption. The modifications are made by adjusting the ShiftRow transformation and using a dynamic chaotic S-box. In AES, the SubBytes, ShiftRows and MixColumns steps by themselves would provide no security because they do not use the key. In the dynamic chaotic S-box based AES, the SubBytes step provides security because the S-box is constructed from the key. Experimental results verify that the proposed modification to the image cryptosystem is highly secure from the cryptographic viewpoint. The results also show that, compared to the original AES encryption algorithm, the modified algorithm gives better encryption results in terms of security against statistical attacks.
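One common way to build a key-dependent chaotic S-box is to seed a chaotic map from the key, iterate it, and rank the trajectory to obtain a permutation of 0..255. The abstract does not specify its construction, so the logistic map, its parameter, and the seeding scheme below are assumptions, sketched in Python:

```python
import hashlib

def chaotic_sbox(key: bytes) -> list:
    """Build a key-dependent 8-bit S-box: seed a logistic map from a
    hash of the key, iterate it, and rank the trajectory to obtain
    a permutation of 0..255 (hence an invertible substitution)."""
    digest = hashlib.sha256(key).digest()
    x = (int.from_bytes(digest[:8], "big") % 10**8) / 10**8  # seed in [0,1)
    x = min(max(x, 1e-6), 1 - 1e-6)
    r = 3.99                      # logistic parameter in the chaotic regime
    for _ in range(100):          # discard transient iterations
        x = r * x * (1 - x)
    traj = []
    for _ in range(256):
        x = r * x * (1 - x)
        traj.append(x)
    # Ranking the chaotic values yields a key-dependent permutation.
    order = sorted(range(256), key=lambda i: traj[i])
    sbox = [0] * 256
    for rank, idx in enumerate(order):
        sbox[idx] = rank
    return sbox

sbox = chaotic_sbox(b"secret key")
print(sbox[:8])
```

Because the output is a permutation, the substitution is invertible, which is the minimum requirement for a drop-in S-box replacement; cryptographic quality (nonlinearity, differential uniformity) would need separate analysis.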

Keywords: advanced encryption standard (AES), on dynamic chaotic S-BOX, image encryption, security analysis, ShiftRow transformation

Procedia PDF Downloads 407
24676 Evaluation of Uniformity for Gafchromic Sheets for Film Dosimetry

Authors: Fayzan Ahmed, Saad Bin Saeed, Abdul Qadir Jangda

Abstract:

Gafchromic™ sheets are extensively used for the QA of intensity-modulated radiation therapy and other in-vivo dosimetry. Intra-sheet non-uniformity of the scanner as well as of the film causes undesirable fluctuations which are reflected in dosimetry. The aim of this study is to define a systematic and robust method to investigate the intra-sheet uniformity of unexposed Gafchromic sheets and of the region of interest (ROI) of the scanner. Sheets of lot no. A05151201 were scanned before and after the expiry period with the EPSON™ XL10000 scanner in transmission mode, landscape orientation and 72 dpi resolution. An ROI of 8 × 10 inches, equal to the sheet dimensions, in the center of the scanner was used to acquire images with full transmission, with block transmission and with sheets in place. 500 virtual grids, created in MATLAB®, were imported as macros into ImageJ (1.49m, Wayne Rasband) to analyze the images. In order to remove edge effects, the outer 86 grids were excluded from the analysis. The standard deviations of the block transmission and full transmission were 0.38% and 0.66%, confirming the high uniformity of the scanner. Expired and non-expired sheets had standard deviations of 2.18% and 1.29%, showing that uniformity decreases after expiry. The results are promising and indicate a good potential of this method to be used as a uniformity check for scanners and unexposed Gafchromic sheets.
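The grid-based uniformity measure described (per-grid means, standard deviation expressed relative to the mean) can be sketched as below. The synthetic image, grid shape and function name are illustrative assumptions, and the study's exclusion of the outer edge grids is omitted for brevity:

```python
import numpy as np

def intra_sheet_uniformity(image: np.ndarray, grid: tuple = (20, 25)) -> float:
    """Divide a scanned sheet into a grid of patches (20 x 25 = 500 here),
    average the pixel values in each patch, and return the relative
    standard deviation (%) of the patch means; lower means more uniform."""
    rows, cols = grid
    h, w = image.shape[0] // rows, image.shape[1] // cols
    means = np.array([
        image[r * h:(r + 1) * h, c * w:(c + 1) * w].mean()
        for r in range(rows) for c in range(cols)
    ])
    return 100.0 * means.std() / means.mean()

# Synthetic "scan": a uniform sheet with mild pixel noise.
rng = np.random.default_rng(1)
sheet = rng.normal(40000, 200, size=(600, 750))
print(f"intra-sheet non-uniformity: {intra_sheet_uniformity(sheet):.2f}%")
```

Applied to the scanner-only images this quantifies scanner uniformity; applied to unexposed sheets it quantifies film uniformity, matching the comparison reported in the abstract.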

Keywords: IMRT, film dosimetry, virtual grids, uniformity

Procedia PDF Downloads 460
24675 Analysis of Big Data

Authors: Sandeep Sharma, Sarabjit Singh

Abstract:

As per user demand and the growth trend of large free data, storage solutions are becoming more challenging in terms of protecting, storing and retrieving data. The day is not far off when storage companies and organizations will start saying 'no' to storing our valuable data, or will start charging a huge amount for its storage and protection. On the other hand, environmental conditions make it challenging to maintain and establish new data warehouses and data centers in the face of global warming threats. The challenge of small data is over; the challenge now is how to manage the exponential growth of data. In this paper we have analyzed the growth trend of big data and its future implications. We have also focused on the impact of unstructured data on various concerns, and we have suggested some possible remedies to streamline big data.

Keywords: big data, unstructured data, volume, variety, velocity

Procedia PDF Downloads 516
24674 A Hebbian Neural Network Model of the Stroop Effect

Authors: Vadim Kulikov

Abstract:

The classical Stroop effect is the phenomenon that it takes more time to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, there have been many variations of the experiment revealing various mechanisms behind semantic, attentional, behavioral and perceptual processing. The Stroop task is known to exhibit asymmetry. Reading the words out loud is hardly dependent on the ink color, but naming the ink color is significantly influenced by incongruent words. This asymmetry is reversed if, instead of naming the color, one has to point at a corresponding color patch. Other debated aspects are the notion of automaticity and how much of the effect is due to semantic interference and how much to response-stage interference. Is automaticity a continuous or an all-or-none phenomenon? There are many models and theories in the literature tackling these questions, which will be discussed in the presentation. None of them, however, seems to capture all the findings at once. A computational model is proposed which is based on the philosophical idea, developed by the author, that the mind operates as a collection of different information-processing modalities, such as different sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory, where ‘framework’ attempts to generalize the concepts of modality, perspective and ‘point of view’. The architecture of this computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production and attention selection modalities. In experiments where button pressing or pointing is required, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero.
The network is trained using Hebbian learning to establish connections (corresponding to ‘coherence’ in framework theory) between these different modalities. The amount of data fed into the network is supposed to mimic the amount of practice a human encounters; in particular, it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors into spoken color names. After training, the network performs the Stroop task. The RTs are measured in a canonical way, as these are continuous-time recurrent neural networks (CTRNNs). The above-described aspects of the Stroop phenomenon, along with many others, are replicated. The model is similar to some existing connectionist models but, as will be discussed in the presentation, has many advantages: it predicts more data, and the architecture is simpler and biologically more plausible.
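The core mechanism, Hebbian co-activation learning with the reading pathway trained more heavily than the color-naming pathway, can be illustrated with a minimal sketch. This is a one-layer Hebbian associator, not the CTRNN architecture of the model; all sizes, learning rates and trial counts are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n_colors = 4
W_word = np.zeros((n_colors, n_colors))    # text-reading pathway
W_color = np.zeros((n_colors, n_colors))   # color-naming pathway

def hebbian_train(W, n_trials, lr=0.1):
    """Strengthen input->output connections by co-activation (Hebb's rule)."""
    for _ in range(n_trials):
        c = rng.integers(n_colors)
        pre = np.eye(n_colors)[c]          # input unit for this color
        post = np.eye(n_colors)[c]         # correct spoken color name
        W += lr * np.outer(post, pre)
    return W

# Reading is the more practiced skill, so it gets more training trials.
hebbian_train(W_word, 1000)
hebbian_train(W_color, 200)

def name_ink_color(word, ink):
    """Both pathways feed the speech output; the response is the argmax."""
    return W_word @ np.eye(n_colors)[word] + W_color @ np.eye(n_colors)[ink]

congruent = name_ink_color(0, 0)
incongruent = name_ink_color(1, 0)   # word says color 1, ink is color 0
print(congruent.argmax(), incongruent.argmax())
```

In this toy version the over-trained word pathway simply wins on incongruent trials; in the full CTRNN model an attention block modulates the pathways so the correct response is produced, but more slowly, yielding the RT-based Stroop effect.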

Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop

Procedia PDF Downloads 242
24673 NR/PEO Block Copolymer: A Chelating Exchanger for Metal Ions

Authors: M. S. Mrudula, M. R. Gopinathan Nair

Abstract:

In order to utilize natural rubber for developing new green polymeric materials for specialty applications, we have prepared natural rubber and polyethylene oxide based polymeric networks by a two-shot method. The polymeric networks thus formed have been used as chelating exchangers for metal ion binding. Chelating exchangers are, in general, coordinating copolymers containing one or more electron-donor atoms such as N, S, O and P that can form coordinate bonds with metals. Hydrogels are water-swollen networks of hydrophilic homopolymers or copolymers. They attract great interest because different chelating groups can readily be incorporated into the polymeric networks. Such polymeric hydrogels are promising materials in the field of hydrometallurgical applications and water purification due to their chemical stability. The current study discusses the swelling response of the polymeric networks as a function of time, temperature, pH and [NaCl], together with sorption studies. Equilibrium swelling has been observed to depend on both structural aspects of the polymers and environmental factors. Metal ion sorption shows that these polymeric networks can be used for the removal, separation and enrichment of metal ions from aqueous solutions and can play an important role in the environmental remediation of municipal and industrial wastewater.

Keywords: block copolymer, adsorption, chelating exchanger, swelling study, polymer, metal complexes

Procedia PDF Downloads 314
24672 Copolymers of Epsilon-Caprolactam Received via Anionic Polymerization in the Presence of Polypropylene Glycol Based Polymeric Activators

Authors: Krasimira N. Zhilkova, Mariya K. Kyulavska, Roza P. Mateva

Abstract:

The anionic polymerization of ε-caprolactam (CL) with bifunctional activators has been extensively studied as an effective and beneficial method of improving the chemical and impact resistance, elasticity and other mechanical properties of polyamide-6 (PA6). In the presence of activators or macroactivators (MAs), also called polymeric activators (PACs), the anionic polymerization of lactams proceeds rapidly in a temperature range of 130-180 °C, well below the melting point of PA6 (220 °C), thus permitting direct manufacturing of the copolymer product together with the desired modifications of polyamide properties. Copolymers of PA6 with an elastic polypropylene glycol (PPG) middle block in the main chain were successfully synthesized via activated anionic ring-opening polymerization (ROP) of CL. Using novel PACs based on PPG polyols of different molecular weights, the anionic ROP of CL was carried out and investigated in the presence of a basic initiator, the sodium salt of CL (NaCL). The PACs were synthesized as N-carbamoyllactam derivatives of hydroxyl-terminated PPG functionalized with isophorone diisocyanate [IPh, 5-isocyanato-1-(isocyanatomethyl)-1,3,3-trimethylcyclohexane] and then end-capped with CL units via an addition reaction. The block copolymers were characterized by ¹H-NMR and FT-IR spectroscopy. The influence of the CL/PACs ratio in the feed, the length of the PPG segments and the polymerization conditions on the kinetics of the anionic ROP, on the average molecular weight, and on the structure of the obtained block copolymers was investigated. The structure and phase behaviour of the copolymers were explored with differential scanning calorimetry, wide-angle X-ray diffraction, thermogravimetric analysis and dynamic mechanical thermal analysis. The dependence of crystallinity on the PPG content incorporated into the copolymer main backbone was estimated. Additionally, the mechanical properties of the obtained copolymers were studied by a notched impact test.
From the investigation performed in this study, it can be concluded that using PPG-based PACs under the chosen ROP conditions leads to well-defined PA6-b-PPG-b-PA6 copolymers with improved impact resistance.

Keywords: anionic ring opening polymerization, caprolactam, polyamide copolymers, polypropylene glycol

Procedia PDF Downloads 385
24671 Polymeric Micelles Based on Block Copolymer α-Tocopherol Succinate-g-Carboxymethyl Chitosan for Tamoxifen Delivery

Authors: Sunil K. Jena, Sanjaya K. Samal, Mahesh Chand, Abhay T. Sangamwar

Abstract:

Tamoxifen (TMX) and its analogues are approved as a first-line therapy for the treatment of estrogen receptor-positive tumors. However, the clinical development of TMX has been hampered by its low bioavailability and severe hepatotoxicity. Herein, we attempt to design a new drug delivery vehicle that could enhance the pharmacokinetic performance of TMX. Initially, high-molecular-weight carboxymethyl chitosan was hydrolyzed to low-molecular-weight carboxymethyl chitosan (LMW CMC) with hydrogen peroxide under the catalysis of phosphotungstic acid. Amphiphilic block copolymers of LMW CMC were synthesized via an amidation reaction between the carboxyl group of α-tocopherol succinate (TS) and an amine group of LMW CMC. These amphiphilic block copolymers self-assembled into nanosized core-shell micelles in aqueous medium. The critical micelle concentration (CMC) decreased with increasing substitution of TS on LMW CMC, ranging from 1.58 × 10⁻⁶ to 7.94 × 10⁻⁸ g/mL. A maximum TMX loading of 8.08 ± 0.98% was achieved with Cmc-TS4.5 (TMX/Cmc-TS4.5 at a 1:8 weight ratio). Both blank and TMX-loaded polymeric micelles (TMX-PM) of Cmc-TS4.5 exhibited a spherical shape with particle sizes below 200 nm. TMX-PM was found to be stable under gastrointestinal conditions, releasing only 44.5% of the total drug content within the first 72 h in simulated gastric fluid (SGF), pH 1.2. The presence of pepsin did not significantly increase TMX release in SGF, pH 1.2 (only about 46.2% within the first 72 h), suggesting its inability to cleave the peptide bond. In contrast, the release of TMX from TMX-PM4.5 in SIF, pH 6.8 (without pancreatin) was slow and sustained: only about 10.43% of the total drug content was released within the first 30 min and about 12.41% within the first 72 h. The presence of pancreatin in SIF, pH 6.8 improved drug release, with about 28.09% of the incorporated TMX released in 72 h.
A cytotoxicity study demonstrated that TMX-PM exhibited time-delayed cytotoxicity in human MCF-7 breast cancer cells. Pharmacokinetic studies in Sprague-Dawley rats revealed a remarkable increase in oral bioavailability (1.87-fold), with significant (p < 0.0001) enhancement in the AUC0-72 h, t1/2 and MRT of TMX-PM4.5 compared with a TMX suspension. Thus, the results suggest that CMC-TS micelles are a promising carrier for TMX delivery.
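The pharmacokinetic metrics compared here (AUC, t1/2, MRT) are standard non-compartmental quantities. A minimal sketch of how they are computed from a concentration-time profile follows; the sample data, time points and function name are hypothetical illustrations, not the study's measurements:

```python
import numpy as np

def nca_metrics(t, conc, terminal_points=3):
    """Non-compartmental estimates: AUC and AUMC by the trapezoidal rule,
    MRT = AUMC/AUC, and t1/2 from a log-linear fit to the terminal phase."""
    t, conc = np.asarray(t, float), np.asarray(conc, float)
    dt = np.diff(t)
    auc = float(np.sum((conc[1:] + conc[:-1]) / 2 * dt))
    moment = conc * t
    aumc = float(np.sum((moment[1:] + moment[:-1]) / 2 * dt))
    # terminal elimination rate from the last few log-concentration points
    slope, _ = np.polyfit(t[-terminal_points:], np.log(conc[-terminal_points:]), 1)
    return {"AUC": auc, "MRT": aumc / auc, "t_half": float(np.log(2) / -slope)}

# Hypothetical plasma samples over 72 h (concentration in ng/mL).
times = [0.5, 1, 2, 4, 8, 24, 48, 72]
concs = [12, 30, 55, 60, 48, 20, 8, 3]
m = nca_metrics(times, concs)
print({k: round(v, 1) for k, v in m.items()})
```

A 1.87-fold bioavailability gain, as reported, would show up directly as a 1.87-fold larger AUC for the micellar formulation at the same dose.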

Keywords: carboxymethyl chitosan, d-α-tocopherol succinate, pharmacokinetic, polymeric micelles, tamoxifen

Procedia PDF Downloads 310
24670 Joint Simulation and Estimation for Geometallurgical Modeling of Crushing Consumption Energy in the Mineral Processing Plants

Authors: Farzaneh Khorram, Xavier Emery

Abstract:

This paper aims to create a crushing consumption energy (CCE) block model and to determine the blocks with the potential for maximum grinding-process energy consumption in the study area. For this purpose, joint estimation (cokriging) and joint simulation (turning bands and plurigaussian methods) were used to predict the CCE based on its correlation with the SAG power index (SPI), A×B, and the ball mill Bond work index (BWI). The analysis shows that TBCOSIM and the plurigaussian method give more realistic results than cokriging. This seems logical given the geometallurgical nature of the data, the linearity of the kriging method and the smoothing effect of kriging.

Keywords: plurigaussian, turning band, cokriging, geometallurgy

Procedia PDF Downloads 28
24669 Underdiagnosis of Supraclavicular Brachial Plexus Metastasis in the Shadow of Cervical Disc Herniation: Insights from a Lung Cancer Case Study

Authors: Eunhwa Jun

Abstract:

This case report describes a patient presenting with right arm pain that was initially misdiagnosed as cervical disc herniation. The patient had several underlying conditions, including hypertension, diabetes mellitus, liver cirrhosis, and a history of lung cancer treated with left lower lobe lobectomy and adjuvant chemoradiotherapy. An external cervical spine MRI revealed central protruding discs at the C4-5-6-7 levels. Despite treatment with medication and epidural blocks, the patient's pain persisted. A C-RACZ procedure was planned, but the patient's pain had worsened before admission. A brachial plexus block was attempted under ultrasound guidance, but the brachial plexus eluded clear visualization, hinting at underlying neurological complexities. Chest CT revealed a new, large soft tissue mass in the right supraclavicular region with adjacent right axillary lymphadenopathy, leading to a diagnosis of metastatic squamous cell carcinoma. Palliative radiation therapy and chemotherapy were initiated as part of the treatment plan, and the patient's pain score decreased to 3 out of 10 on the Numeric Rating Scale (NRS), revealing that the pain was due to metastatic lung cancer.

Keywords: supraclavicular brachial plexus metastasis, cervical disc herniation, brachial plexus block, metastatic lung cancer

Procedia PDF Downloads 14
24668 A Simple Method for Evaluation of Uniformity for Gafchromic Sheets for Film Dosimetry

Authors: Fayzan Ahmed, Saad Bin Saeed, Abdul Qadir Jangda

Abstract:

Gafchromic™ sheets are extensively used for the QA of intensity-modulated radiation therapy and other in-vivo dosimetry. Intra-sheet non-uniformity of the scanner as well as of the film causes undesirable fluctuations which are reflected in dosimetry. The aim of this study is to define a systematic and robust method to investigate the intra-sheet uniformity of unexposed Gafchromic sheets and of the region of interest (ROI) of the scanner. Sheets of lot no. A05151201 were scanned before and after the expiry period with the EPSON™ XL10000 scanner in transmission mode, landscape orientation and 72 dpi resolution. An ROI of 8 × 10 inches, equal to the sheet dimensions, in the center of the scanner was used to acquire images with full transmission, with block transmission and with sheets in place. 500 virtual grids, created in MATLAB®, were imported as macros into ImageJ (1.49m, Wayne Rasband) to analyze the images. In order to remove edge effects, the outer 86 grids were excluded from the analysis. The standard deviations of the block transmission and full transmission were 0.38% and 0.66%, confirming the high uniformity of the scanner. Expired and non-expired sheets had standard deviations of 2.18% and 1.29%, showing that uniformity decreases after expiry. The results are promising and indicate a good potential of this method to be used as a uniformity check for scanners and unexposed Gafchromic sheets.

Keywords: IMRT, film dosimetry, virtual grids, uniformity

Procedia PDF Downloads 403
24667 Research of Data Cleaning Methods Based on Dependency Rules

Authors: Yang Bao, Shi Wei Deng, WangQun Lin

Abstract:

This paper introduces the concept and principles of data cleaning, analyzes the types and causes of dirty data, and proposes the key steps of a typical cleaning process. It puts forward a highly scalable and versatile data cleaning framework. For data with attribute dependency relations, it designs several violation-data discovery algorithms expressed as formal formulas, which can identify inconsistent data in all target columns with conditional attribute dependencies, whether the data is structured (SQL) or unstructured (NoSQL), and gives six data cleaning methods based on these algorithms.
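The basic violation-discovery step for an attribute dependency rule X → Y can be sketched as grouping rows by their X values and flagging groups whose Y values disagree. The function and sample records below are illustrative, not the paper's algorithms:

```python
from collections import defaultdict

def find_fd_violations(rows, lhs, rhs):
    """Find rows violating the functional dependency lhs -> rhs:
    group rows by their lhs values and flag any group whose rhs
    values disagree (classic dependency-rule dirty-data detection)."""
    groups = defaultdict(list)
    for row in rows:
        key = tuple(row[a] for a in lhs)
        groups[key].append(row)
    violations = []
    for members in groups.values():
        if len({tuple(r[a] for a in rhs) for r in members}) > 1:
            violations.extend(members)
    return violations

records = [
    {"zip": "10001", "city": "New York"},
    {"zip": "10001", "city": "Newyork"},   # inconsistent spelling: dirty
    {"zip": "60601", "city": "Chicago"},
]
bad = find_fd_violations(records, lhs=["zip"], rhs=["city"])
print(len(bad))
```

Repair methods then resolve each flagged group, for example by keeping the majority value or consulting a reference table; the same grouping logic applies to NoSQL documents once the relevant attributes are extracted.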

Keywords: data cleaning, dependency rules, violation data discovery, data repair

Procedia PDF Downloads 534
24666 Understanding Post-Displacement Earnings Losses: The Role of Wealth Inequality

Authors: M. Bartal

Abstract:

A large body of empirical evidence points to sizable lifetime earnings losses associated with the displacement of tenured workers. The causes of these losses are still not well understood. Existing explanations are heavily based on human capital depreciation during non-employment spells. In this paper, a new avenue is explored. Evidence on the role of household liquidity constraints in accounting for the persistence of post-displacement earnings losses is provided based on SIPP data. Then, a directed search and matching model with endogenous human capital and wealth accumulation is introduced. The model is computationally tractable thanks to its block-recursive structure and highlights a non-trivial, yet intuitive, interaction between wealth and human capital. Constrained workers tend to accept jobs with low firm-sponsored training because the latter are (endogenously) easier to find. This new channel provides a plausible explanation for why young (highly constrained) workers suffer persistent scars after displacement. Finally, the model is calibrated on US data to show that the interplay between wealth and human capital is crucial to replicating the observed lifecycle pattern of earnings losses. JEL: E21, E24, J24, J63.

Keywords: directed search, human capital accumulation, job displacement, wealth accumulation

Procedia PDF Downloads 170
24665 Anaesthetic Management of Congenitally Corrected Transposition of Great Arteries with Complete Heart Block in a Parturient for Emergency Caesarean Section

Authors: Lokvendra S. Budania, Yogesh K Gaude, Vamsidhar Chamala

Abstract:

Introduction: Congenitally corrected transposition of the great arteries (CCTGA) is a complex congenital heart disease in which there is both atrioventricular and ventriculoarterial discordance, usually accompanied by other cardiovascular malformations. Case Report: A 24-year-old primigravida, a known case of CCTGA at 37 weeks of gestation, was referred to our hospital for safe delivery. Her electrocardiogram showed a heart rate of 40 bpm, and echocardiography showed an ejection fraction of 65% and CCTGA. A temporary pacemaker was inserted by a cardiologist in the catheterization laboratory before giving a trial of labour, in view of the complete heart block. She was planned for normal delivery, but an emergency Caesarean section was decided upon due to non-reassuring foetal cardiotocography. Pre-operative vitals showed a pulse rate of 50 bpm with the temporary pacemaker, blood pressure of 110/70 mmHg and SpO2 of 99% on room air. The nil-per-oral period was inadequate. The patency of two peripheral IV cannulae was checked, and a left radial arterial line was secured. Epidural anaesthesia was planned, and a catheter was placed at L2-L3. After a test dose, anaesthesia was provided with 5 mL + 5 mL of 2% lignocaine with 25 mcg fentanyl, and a further 2.5 mL of 0.5% bupivacaine was given to achieve a sensory level of T6. The Caesarean section was performed and the baby delivered. Cautery was avoided during the procedure. IV oxytocin (15 U) was added to 500 mL of Ringer's lactate. Hypotension was treated with phenylephrine boluses. The patient was shifted to the post-operative care unit and later to the high-dependency unit for monitoring. Post-operative vitals remained stable. The temporary pacemaker was removed 24 hours after surgery. Her post-operative period was uneventful, and she was discharged from the hospital. Conclusion: Rare congenital cardiac disorders require detailed knowledge of the pathophysiology and the comorbidities associated with the disease. Meticulously planned and carefully titrated neuraxial techniques are beneficial in such cases.

Keywords: congenitally corrected transposition of great arteries, complete heart block, emergency LSCS, epidural anaesthesia

Procedia PDF Downloads 100
24664 Synthesis of Rare-Earth Pyrazolate Compounds

Authors: Nazli Eslamirad, Peter C. Junk, Jun Wang, Glen B. Deacon

Abstract:

Since the coordination behavior of pyrazoles and pyrazolate ions is highly versatile towards a great range of metals, such as d-block and f-block metals as well as main-group elements, they attract interest as ligands for preparing compounds. A variety of rare-earth pyrazolate complexes have previously been synthesized by redox transmetalation/protolysis (RTP); accordingly, a series of rare-earth pyrazolate complexes using two pyrazoles, 3,5-dimethylpyrazole (Me₂pzH) and 3,5-di-tert-butylpyrazole (tBu₂pzH), with structures spanning the whole La-Lu array besides Sc and Y, has been synthesized by the RTP reaction. There have been further developments in this study. The structure of [Tb(Me₂pz)₃(thf)]₂ has been obtained; it is isomorphous with the previously reported [Dy(Me₂pz)₃(thf)]₂ and [Lu(Me₂pz)₃(thf)]₂ analogues, which have two µ-1(N):2(Nʹ)-Me₂pz ligands (the most common pyrazolate ligation for non-rare-earth complexes). Most of the previously reported compounds using tBu₂pzH were monomeric; however, the lanthanum derivative [La(Me₂pz)₃thf₂], previously reported without a crystal structure, has now been structurally characterized, along with the cerium and lutetium analogues. A polymeric structure with samarium has also now been synthesized; the neodymium analogue was reported previously, and comparing these polymeric structures supports the idea that the geometry of Sm(tBu₂pz)₃ affects the coordination of the solvent. Also, by using 1,2-dimethoxyethane (DME) instead of tetrahydrofuran (THF), the new [Er(tBu₂pz)₃(dme)₂] has now been reported.

Keywords: lanthanoid complexes, pyrazolate, redox transmetalation/protolysis, x-ray crystal structures

Procedia PDF Downloads 189
24663 Assessment of Soil Quality Indicators in Rice Soil of Tamil Nadu

Authors: Kaleeswari R. K., Seevagan L .

Abstract:

Soil quality in an agroecosystem is influenced by the cropping system and by water and soil fertility management. A valid soil quality index would help to assess soil and crop management practices for desired productivity and soil health. Soil quality indices also provide an early indication of soil degradation and of needed remedial and rehabilitation measures. Imbalanced fertilization and inadequate organic carbon dynamics deteriorate soil quality in an intensive cropping system. The rice soil ecosystem differs from other arable systems since rice is grown under submergence, which requires a different set of key soil attributes for enhancing soil quality and productivity. Assessment of a soil quality index involves indicator selection, indicator scoring and aggregation of the scores into one comprehensive index. The most appropriate indicators to evaluate soil quality can be selected by establishing a minimum data set, which can be screened by linear and multiple regression, factor analysis and score functions. This investigation was carried out in intensive rice-cultivating regions (each having >1.0 lakh hectares) of Tamil Nadu, viz., Thanjavur, Thiruvarur, Nagapattinam, Villupuram, Thiruvannamalai, Cuddalore and Ramanathapuram districts. In each district, an intensive rice-growing block was identified, and in each block two sampling grids (10 × 10 sq. km) were used with a sampling depth of 10–15 cm. Using GIS coordinates, soil sampling was carried out at various locations in the study area. The numbers of soil sampling points were 41, 28, 28, 32, 37, 29 and 29 in Thanjavur, Thiruvarur, Nagapattinam, Cuddalore, Villupuram, Thiruvannamalai and Ramanathapuram districts, respectively. Principal Component Analysis is a data reduction tool used to select potential indicators; a Principal Component is a linear combination of different variables that represents the maximum variance of the dataset.
Principal Components with eigenvalues equal to or higher than 1.0 were taken as the minimum data set. Principal Component Analysis was used to select representative soil quality indicators in rice soils based on factor loadings and percent contribution values. Variables having significant differences within the production system were used for the preparation of the minimum data set. Each Principal Component explained a certain amount of variation (%) in the total dataset, and this percentage provided the weight for its variables. The final PCA-based soil quality equation is SQI = Σᵢ (Wᵢ × Sᵢ), where Sᵢ is the score for the subscripted variable and Wᵢ is the weighting factor derived from PCA. Higher index scores mean better soil quality. Soil respiration, soil available nitrogen and potentially mineralizable nitrogen were assessed as soil quality indicators for rice soils of the Cauvery Delta zone, covering Thanjavur, Thiruvarur and Nagapattinam districts. Soil available phosphorus could be used as a soil quality indicator of rice soils in Cuddalore district. In rain-fed rice ecosystems on coastal sandy soil, DTPA-Zn could be used as an effective soil quality indicator. Among the soil parameters selected from Principal Component Analysis, microbial biomass nitrogen could be used as a quality indicator for rice soils of Villupuram district. The Cauvery Delta zone has a better SQI than the other intensive rice-growing zones of Tamil Nadu.
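The weighted-sum aggregation SQI = Σᵢ (Wᵢ × Sᵢ) can be sketched in a few lines; the indicator names, scores and PCA weights below are illustrative placeholders, not the study's measured values.

```python
# Minimal sketch of the PCA-weighted index SQI = sum(W_i * S_i).
# All indicator scores and weights below are hypothetical examples.

def soil_quality_index(scores, weights):
    """Weighted sum of indicator scores; a higher SQI means better soil quality."""
    if len(scores) != len(weights):
        raise ValueError("each indicator score needs a weight")
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical scored indicators (0-1 scale) with PCA-derived weights
indicators = {
    "soil_respiration":        (0.82, 0.35),  # (score S_i, weight W_i)
    "available_nitrogen":      (0.74, 0.40),
    "mineralizable_nitrogen":  (0.69, 0.25),
}
scores = [s for s, _ in indicators.values()]
weights = [w for _, w in indicators.values()]
print(f"SQI = {soil_quality_index(scores, weights):.4f}")
```

In the study, the weights come from each Principal Component's share of explained variance; here they are simply chosen to sum to 1.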

Keywords: soil quality index, soil attributes, soil mapping, rice soil

Procedia PDF Downloads 54
24662 Congestion Control in Mobile Network by Prioritizing Handoff Calls

Authors: O. A. Lawal, O. A. Ojesanmi

Abstract:

The demand for wireless cellular services continues to increase while radio resources remain limited. Thus, network operators have to continuously manage the scarce radio resources in order to maintain an improved quality of service for mobile users. This paper proposes a way to handle congestion in the mobile network by prioritizing handoff calls using a guard channel allocation scheme. The scheme uses a specific threshold value for channel allocation in the algorithm. It would be simulated by generating data for various traffic loads in the network, as would occur in real life. The results would be used to determine the probability of handoff call dropping and the probability of new call blocking as measures of network performance.
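A guard-channel admission policy of the kind proposed here can be sketched as follows; the channel count, guard threshold and traffic model are illustrative assumptions, not the paper's simulation parameters.

```python
# Sketch of a guard-channel admission policy: out of TOTAL channels, GUARD are
# reserved for handoff calls, so new calls are blocked before handoffs are
# dropped. All parameter values and the traffic model are hypothetical.
import random

TOTAL, GUARD = 30, 3  # hypothetical channel count and guard threshold

def admit(call_type, busy):
    """Handoffs may take any free channel; new calls may not enter the reserve."""
    free = TOTAL - busy
    return free > 0 if call_type == "handoff" else free > GUARD

random.seed(1)
busy = dropped = blocked = 0
for _ in range(10_000):                      # one arrival per step
    kind = "handoff" if random.random() < 0.3 else "new"
    if admit(kind, busy):
        busy += 1
    elif kind == "handoff":
        dropped += 1                         # handoff call dropping
    else:
        blocked += 1                         # new call blocking
    if busy and random.random() < 0.5:       # at most one completion per step
        busy -= 1

print(f"P(new blocked)  = {blocked / 10_000:.3f}")
print(f"P(handoff drop) = {dropped / 10_000:.3f}")
```

With the reserve in place, the handoff dropping probability stays below the new-call blocking probability, which is exactly the prioritization the paper sets out to measure.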

Keywords: call block, channel, handoff, mobile cellular network

Procedia PDF Downloads 371
24661 Embedded Acoustic Signal Processing System Using OpenMP Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

In this paper, the Altera DE1-SoC FPGA board is utilized as a tool for nondestructive characterization of an aluminum circular cylindrical shell of radius ratio b/a (a: outer radius; b: inner radius). The acoustic backscattered signal processing system has been developed using the OpenMP architecture. The design is built in three blocks and implemented per functional block on a heterogeneous Intel-Altera system running under Linux. The reference data used to assess the performance of the SoC FPGA are computed by the analytical method. Exploiting the SoC FPGA has led to obtaining the backscattering form function and resonance spectra; the A0 and S0 modes of propagation in the tube are shown. The findings are then compared to those obtained from the Matlab simulation of the analytical method, and good agreement has been noted. Moreover, the detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 5 times faster than with the Matlab implementation using almost the same data. This FPGA-based implementation of the processing algorithms achieves a correlation coefficient R and an absolute error of about 0.962 and 5 × 10⁻⁵, respectively.

Keywords: OpenMP, signal processing system, acoustic backscattering, nondestructive characterization, thin tubes

Procedia PDF Downloads 60
24660 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc

Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez

Abstract:

The evolution of mobile broadband technologies has increased users' download rates for current services. Evaluating technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time and productivity. Some failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station must be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and, on devices that support it, VoIP (Voice over IP). This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB using the TAC (Tracking Area Code) variable, together with an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, in which two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables determined that the transmission modes vary depending on the BLER (Block Error Rate), throughput and SNR (Signal-to-Noise Ratio) parameters. Differences in transmission modes were detected for both operators, which is reflected in signal quality. In addition, because the operators work at different frequencies, it was observed that Operator 1, despite having spectrum in Band 7 (2600 MHz) like Operator 2, reassigns users to a lower band, AWS (1700 MHz); both the difference in signal quality relative to the data connections established by Operator 2 and the difference in the transmission modes selected by the eNodeB for Operator 1 are remarkable.

Keywords: BLER, LTE, network, QualiPoc, SNR

Procedia PDF Downloads 88
24659 Counterfeit Product Detection Using Block Chain

Authors: Sharanya C. H., Pragathi M., Vathsala R. S., Theja K. V., Yashaswini S.

Abstract:

Identifying counterfeit products has become increasingly important in the product manufacturing industries in recent decades. The ongoing problem of counterfeiting has an impact on company sales and profits. To address this issue, a functional blockchain technology was implemented, which effectively prevents products from being counterfeited. By utilizing blockchain technology, consumers are no longer required to rely on third parties to determine the authenticity of the product being purchased. A blockchain is a distributed database that stores data records, known as blocks, linked into chains and replicated across various networks. Counterfeit products are identified using a QR code reader, with each product's QR code linked to the blockchain management system. The system compares the unique code obtained from the customer with the stored unique code to determine whether or not the product is original.
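The block-and-chain structure described above can be sketched as follows; the field names and SHA-256 hashing are illustrative choices, not necessarily what the authors' Ethereum-based system uses.

```python
# Hedged sketch of a hash-linked chain of product records: every block stores
# a product's unique (QR) code plus the hash of the previous block, so
# altering any stored code invalidates all later links. Field names and the
# use of SHA-256 are illustrative assumptions.
import hashlib
import json

def block_hash(fields):
    """Deterministic SHA-256 over a block's fields."""
    return hashlib.sha256(json.dumps(fields, sort_keys=True).encode()).hexdigest()

def add_block(chain, product_code):
    prev = chain[-1]["hash"] if chain else "0" * 64
    fields = {"index": len(chain), "product_code": product_code, "prev": prev}
    chain.append({**fields, "hash": block_hash(fields)})

def verify(chain, scanned_code):
    """Compare a scanned QR code against the stored unique codes."""
    return any(b["product_code"] == scanned_code for b in chain)

chain = []
add_block(chain, "QR-0001")
add_block(chain, "QR-0002")
print(verify(chain, "QR-0002"), verify(chain, "QR-9999"))  # → True False
```

Because each block embeds the previous block's hash, tampering with any stored product code makes the recomputed hash disagree with the stored one, which is what removes the need to trust a third party.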

Keywords: blockchain, ethereum, QR code

Procedia PDF Downloads 148
24658 Investigating the Dynamic Response of the Ballast

Authors: Osama Brinji, Wing Kong Chiu, Graham Tew

Abstract:

Understanding the stability of rail ballast is one of the most important aspects of railway engineering. An unstable track may cause issues such as unnecessary vibration and, ultimately, loss of track quality. The track foundation plays an important role in stabilizing the railway. The dynamic response of rail ballast in the vicinity of the rail sleeper can affect the stability of the rail track, and this has not been studied in detail: a review of the literature showed that most work has focused on the area under the concrete sleeper. Although there are some theories about the shear (longitudinal) effect of the rail ballast, these have not been properly studied and hence are not well understood. The stability of a rail track depends on the compactness of the ballast in its vicinity. This paper determines the dynamic response of the ballast in order to identify its resonant behaviour. This preliminary research is one of several studies examining the vibration response of granular materials. The main aim is to use this information in the future design of sleepers to ensure that the dynamic response of the sleeper will not compromise the compactness of the ballast. The paper reports the dependence of the damping and natural frequency of the ballast on depth and on distance from the point of excitation, which is introduced through a concrete block. The concrete block is used to simulate a sleeper, and the ballast is simulated with gravel. In spite of these approximations, the results presented in the paper show agreement with the theories and assumptions used to study the mechanical behaviour of rail ballast.

Keywords: ballast, dynamic response, sleeper, stability

Procedia PDF Downloads 473
24657 The Development of Statistical Analysis in Agriculture Experimental Design Using R

Authors: Somruay Apichatibutarapong, Chookiat Pudprommart

Abstract:

The purpose of this study was to develop statistical analysis learning media using R programming over the Internet, applied to agricultural experimental design. Data were collected from 65 items covering completely randomized, randomized block, Latin square, split-plot, factorial and nested designs. A quantitative approach was used to investigate the quality of the learning media, as judged by six experts and by the opinions of 100 students interested in experimental design and applied statistics. It was revealed that the experts' opinions were good on all contents except the usage of the web board, and that the students' opinions were good overall and on all items.

Keywords: experimental design, R programming, applied statistics, statistical analysis

Procedia PDF Downloads 336
24656 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity

Authors: Hoda A. Abdel Hafez

Abstract:

Mining big data represents a big challenge nowadays, and much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.

Keywords: mining big data, big data, machine learning, telecommunication

Procedia PDF Downloads 371
24655 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework

Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim

Abstract:

Background modeling and subtraction in video analysis has been widely proven to be an effective method for moving object detection in many computer vision applications. Over the past years, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are two of the most frequent issues in practical situations. This paper presents a new two-layer model based on the codebook algorithm incorporating the local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is a block-based codebook combining the LBP histogram with the mean values of the RGB color channels. Because LBP features are invariant with respect to monotonic gray-scale changes, this layer produces block-wise detection results with considerable tolerance of illumination variations. A pixel-based codebook is employed to refine the outputs of the first layer and further eliminate false positives. As a result, the proposed approach greatly improves accuracy under dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate a very competitive performance compared to previous models.
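The LBP texture measure relied on in the first layer can be sketched for a single pixel as follows; the 3×3 image and the neighbour ordering are illustrative, not the paper's configuration.

```python
# Sketch of the 8-neighbour local binary pattern (LBP) code: each neighbour is
# thresholded against the centre pixel and the results are packed into an
# 8-bit code. Image values and neighbour ordering are illustrative.

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]  # clockwise from top-left

def lbp_code(img, r, c):
    """8-bit LBP code for pixel (r, c); needs a full 3x3 neighbourhood."""
    centre = img[r][c]
    code = 0
    for bit, (dr, dc) in enumerate(NEIGHBOURS):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code

img = [[10, 10, 10],
       [10, 20, 30],
       [10, 10, 40]]
brighter = [[v + 50 for v in row] for row in img]  # monotonic illumination shift

print(lbp_code(img, 1, 1), lbp_code(brighter, 1, 1))  # → 24 24
```

Shifting every pixel by the same amount leaves the code unchanged, which is the monotonic gray-scale invariance that gives the first layer its tolerance of illumination variations.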

Keywords: background subtraction, codebook model, local binary pattern, dynamic background, illumination change

Procedia PDF Downloads 193
24654 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications a Software Testing Approach

Authors: Theertha Chandroth

Abstract:

This paper presents a comparative study of how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its own syntax and structure. The objective is to explore techniques and methodologies for validating, comparing and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers and data practitioners can ensure accurate data representation, seamless data interchange and effective data validation.
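One simple validation technique consistent with the abstract is to parse both documents, normalise them into native dictionaries, and compare field by field; the record shape and tag names below are illustrative (Python is assumed from the keywords).

```python
# Hedged sketch of checking a JSON record against an XML record: both are
# normalised into plain dicts and compared. Tag/key names are illustrative.
import json
import xml.etree.ElementTree as ET

def xml_to_dict(xml_text):
    """Flatten a one-level XML record into {tag: text}."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

def records_match(json_text, xml_text):
    """True when the JSON object and the XML record carry identical fields."""
    return json.loads(json_text) == xml_to_dict(xml_text)

json_doc = '{"id": "42", "name": "sensor-A"}'
xml_doc = "<record><id>42</id><name>sensor-A</name></record>"
print(records_match(json_doc, xml_doc))  # → True
```

A real test suite would extend this with type coercion (XML text is always a string) and nested structures, but the normalise-then-compare step is the core of the comparison.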

Keywords: XML, JSON, data comparison, integration testing, Python, SQL

Procedia PDF Downloads 91
24653 Effect of Discharge Pressure Conditions on Flow Characteristics in Axial Piston Pump

Authors: Jonghyuk Yoon, Jongil Yoon, Seong-Gyo Chung

Abstract:

In many industries that require a large amount of power, the axial piston pump has been widely used as the main power source of a hydraulic system. The axial piston pump is a type of positive displacement pump that has several pistons in a circular array within a cylinder block. As the cylinder block and pistons rotate, the exposed ends of the pistons are constrained to follow the surface of the swash plate, so the pistons are driven to reciprocate axially and hydraulic power is produced. In the present study, a numerical simulation with a full three-dimensional model of the axial piston pump was carried out using a commercial CFD code (Ansys CFX 14.5). To take into account the compression and extension caused by the reciprocating pistons, moving boundary conditions were applied to that region as a function of the rotation angle. In addition, this pump, which uses hydraulic oil as the working fluid, is intentionally designed so that a small amount of oil leaks out in order to lubricate moving parts. Since leakage can directly affect pump efficiency, evaluating the effect of oil leakage is very important. To predict its effect on pump efficiency, we considered the leakage between the piston shoe and the swash plate by modeling a cylindrical feature at the end of the cylinder. To validate the numerical method used in this study, the numerical results for the flow rate at the discharge port were compared with experimental data, and good agreement between them was shown. Using the validated numerical method, the effect of the discharge pressure was also investigated. The results of the present study can provide useful information on the small axial piston pumps used in many different manufacturing industries.
Acknowledgement: This research was financially supported by the “Next-generation construction machinery component specialization complex development program” through the Ministry of Trade, Industry and Energy (MOTIE) and Korea Institute for Advancement of Technology (KIAT).

Keywords: axial piston pump, CFD, discharge pressure, hydraulic system, moving boundary condition, oil leaks

Procedia PDF Downloads 225
24652 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features

Authors: Kyi Pyar Zaw, Zin Mar Kyu

Abstract:

Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system; the recognition rate may be low or high depending on the extracted features. In the proposed paper, 25 features per character are used for recognition. There are basically three steps in character recognition: character segmentation, feature extraction and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method for character segmentation. In the feature extraction step, features are extracted in two ways. First, 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. Second, the input character is divided into 16 blocks; for each block, 8 feature values are again obtained through the eight-direction chain code frequency extraction method, and the sum of these 8 values is defined as one feature for that block, so 16 features are extracted from the 16 blocks. The number-of-holes feature is used to cluster similar characters. Almost all common Myanmar characters in various font sizes can be recognized using these features. All 25 features are used in both the training part and the testing part. In the classification step, characters are classified by matching all features of the input character against the already trained features of the characters.
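The eight-direction chain code frequency extraction can be sketched as follows; the contour is a toy square rather than a Myanmar glyph, and the direction numbering is one common convention, not necessarily the paper's.

```python
# Sketch of eight-direction chain-code frequency extraction: walk a contour,
# record the direction code (0-7) of each step, and count each direction's
# frequency. The contour and direction numbering are illustrative.

# Direction codes: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
STEPS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
         (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code_frequency(contour):
    """Return the 8-bin direction frequency vector for a list of (x, y) points."""
    freq = [0] * 8
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        freq[STEPS[(x1 - x0, y1 - y0)]] += 1
    return freq

# A small closed square contour traced one pixel at a time
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2),
          (1, 2), (0, 2), (0, 1), (0, 0)]
print(chain_code_frequency(square))  # → [2, 0, 2, 0, 2, 0, 2, 0]
```

These 8 counts form the whole-character feature vector; repeating the same extraction per 16-block and summing each block's counts yields the other 16 features the abstract describes.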

Keywords: chain code frequency, character recognition, feature extraction, features matching, segmentation

Procedia PDF Downloads 293
24651 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

City governance involves various data, including city component data, demographic data, housing data and all kinds of business data. These data reflect different aspects of people, events and activities. Data generated by various systems differ in form and in source because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues that need to be resolved: data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data and space data. Metadata must be referenced and read when an application needs to access, manipulate and display the data; uniform metadata management ensures the effectiveness and consistency of data in the processes of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search and data delivery.
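The duplicate-comparison and merging step at the heart of such fusion can be sketched with plain dictionaries; the record shapes, sector names and the first-source-wins conflict rule are illustrative assumptions.

```python
# Minimal sketch of fusing records from multiple hypothetical sector databases
# into one central store keyed by a shared identifier. Later sources fill gaps
# rather than overwrite existing values (first source wins on conflict).

def fuse(*sources):
    central = {}
    for source in sources:
        for record in source:
            entry = central.setdefault(record["id"], {})
            for key, value in record.items():
                entry.setdefault(key, value)  # keep the first-seen value
    return central

# Illustrative records from two sectors sharing the identifier "id"
housing = [{"id": "P1", "address": "12 Elm St"}]
census = [{"id": "P1", "household_size": 3}, {"id": "P2", "household_size": 1}]
print(fuse(housing, census)["P1"])
```

In a real system the shared key, the conflict rule and the field meanings would all be governed by the uniform metadata management the abstract calls for.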

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 352