Search results for: Neural Processing Element (NPE)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7894

844 Examining the European Central Bank's Marginal Attention to Human Rights Concerns during the Eurozone Crisis through the Lens of Organizational Culture

Authors: Hila Levi

Abstract:

Respect for human rights is a fundamental element of the European Union's (EU) identity and law. Surprisingly, however, the protection of human rights has been significantly restricted in the austerity programs ordered by the International Monetary Fund (IMF), the European Central Bank (ECB) and the European Commission (EC) (often labeled 'the Troika') in return for financial aid to the crisis-hit countries. This paper focuses on the role of the ECB in crisis management. While other international financial institutions, such as the IMF or the World Bank, may opt to neglect human rights obligations, one might expect greater respect for human rights from the ECB, which is bound by the EU Charter of Fundamental Rights. However, this paper argues that ECB officials made no significant effort to protect human rights or strike an adequate balance between competing financial and human rights needs while coping with the crisis. ECB officials were preoccupied with the need to stabilize the economy and prevent a collapse of the Eurozone, and paid only marginal attention to human rights concerns in the design and implementation of the Troika's programs. This paper explores the role of Organizational Culture (OC) in explaining this marginalization. While International Relations (IR) research on the behavior of Intergovernmental Organizations (IGOs) has traditionally focused on the external interests of powerful member states and on national and economic considerations, this study focuses on particular institutions' internal factors and independent processes. OC characteristics have been identified in the OC literature as an important determinant of organizational behavior. This paper suggests that cultural characteristics are also vital for the examination of IGOs, and particularly for understanding the ECB's behavior during the crisis. In order to assess the OC of the ECB and the impact it had on its policies and decisions during the Eurozone crisis, the paper uses the results of numerous qualitative interviews conducted with high-ranking officials and staff members of the ECB involved in the crisis management. It further reviews primary sources of the ECB (such as ECB statutes, and the Memoranda of Understanding signed between the crisis countries and the Troika), and secondary sources (such as the report of the UN High Commissioner for Human Rights on austerity measures and economic, social, and cultural rights). It thus analyzes the interaction between the ECB's culture and the almost complete absence of human rights considerations in the Eurozone crisis resolution scheme. This paper highlights the importance and influence of internal ideational factors on IGOs' behavior. From a more practical perspective, this paper may contribute to understanding one of the obstacles in the process of human rights implementation in international organizations, and provide instruments for better protection of social and economic rights.

Keywords: European central bank, eurozone crisis, intergovernmental organizations, organizational culture

Procedia PDF Downloads 137
843 Integrating Computational Modeling and Analysis with in Vivo Observations for Enhanced Hemodynamics Diagnostics and Prognosis

Authors: Shreyas S. Hegde, Anindya Deb, Suresh Nagesh

Abstract:

Computational bio-mechanics is developing rapidly as a non-invasive tool to assist the medical fraternity in both the diagnosis and prognosis of human-body-related issues such as injuries, cardio-vascular dysfunction, atherosclerotic plaques, etc. Any system that would help either properly diagnose such problems or assist prognosis would be a boon to doctors and the medical society in general. Recently, a lot of work has been focused in this direction, including but not limited to various finite element analyses related to dental implants, skull injuries, orthopedic problems involving bones and joints, etc. Such numerical solutions are helping medical practitioners to come up with alternate solutions for such problems and in most cases have also reduced the trauma on patients. Some work has also been done on the use of computational fluid mechanics to understand the flow of blood through the human body, the area of hemodynamics. Since cardio-vascular diseases are one of the main causes of loss of human life, understanding blood flow with and without constraints (such as blockages), providing alternate methods of prognosis and further solutions to take care of issues related to blood flow would help save the valuable lives of such patients. This project is an attempt to use computational fluid dynamics (CFD) to solve specific problems related to hemodynamics. The hemodynamics simulation is used to gain a better understanding of functional, diagnostic and theoretical aspects of blood flow. Because many fundamental issues of blood flow, like phenomena associated with pressure and viscous force fields, are still not fully understood or entirely described through mathematical formulations, the characterization of blood flow remains a challenging task. The computational modeling of blood flow and of the mechanical interactions that strongly affect blood flow patterns, based on medical data and imaging, represents the most accurate analysis of the complex behavior of blood flow. In this project, the mathematical modeling of blood flow in arteries in the presence of successive blockages has been analyzed using the CFD technique. Different cases of blockages in terms of percentages have been modeled using the commercial software CATIA V5R20 and simulated using the commercial software ANSYS 15.0 to study the effect of varying wall shear stress (WSS) values and other parameters, such as the effect of an increase in Reynolds number. The concept of fluid-structure interaction (FSI) has been used to solve such problems. The model simulation results were validated using in vivo measurement data from the existing literature.
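
As a rough illustration of the kind of quantities monitored in the abstract (wall shear stress and Reynolds number as blockage percentage varies), the sketch below uses a steady laminar Poiseuille estimate. All vessel dimensions, flow rate and blood properties are illustrative assumptions, not values from the study, and Poiseuille flow is a strong simplification of real pulsatile hemodynamics.

```python
import math

# Illustrative blood properties and vessel geometry (assumed, not from the study)
rho = 1060.0        # blood density, kg/m^3
mu = 3.5e-3         # dynamic viscosity, Pa*s
d_healthy = 4e-3    # healthy lumen diameter, m
flow_rate = 5e-6    # volumetric flow rate, m^3/s (~300 mL/min)

def poiseuille_metrics(blockage_pct):
    """Reynolds number and wall shear stress in a uniformly narrowed segment,
    assuming steady laminar Poiseuille flow at a fixed flow rate."""
    # Blockage given as percent reduction in cross-sectional area
    area = math.pi * (d_healthy / 2) ** 2 * (1 - blockage_pct / 100)
    d = 2 * math.sqrt(area / math.pi)   # effective diameter of the narrowed lumen
    v = flow_rate / area                # mean velocity
    re = rho * v * d / mu               # Reynolds number
    wss = 8 * mu * v / d                # Poiseuille wall shear stress (= 4*mu*Q/(pi*r^3))
    return re, wss

for blockage in (0, 30, 50, 70):
    re, wss = poiseuille_metrics(blockage)
    print(f"{blockage:3d}% blockage: Re ~ {re:6.0f}, WSS ~ {wss:6.2f} Pa")
```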

Keywords: computational fluid dynamics, hemodynamics, blood flow, results validation, arteries

Procedia PDF Downloads 390
842 Exploring Bidirectional Encoder Representations from the Transformers’ Capabilities to Detect English Preposition Errors

Authors: Dylan Elliott, Katya Pertsova

Abstract:

Preposition errors are some of the most common errors made by L2 speakers. In addition, improving error correction and detection methods remains an open issue in the realm of Natural Language Processing (NLP). This research investigates whether the Bidirectional Encoder Representations from Transformers model (BERT) has the potential to correct preposition errors accurately enough to be useful in error correction software. This research finds that BERT performs strongly when the scope of its error correction is limited to preposition choice. The researchers used an open-source BERT model and over three hundred thousand edited sentences from Wikipedia, tagged for part of speech, where only a preposition edit had occurred. To test BERT's ability to detect errors, a technique known as multi-level masking was used to generate suggestions based on sentence context for every prepositional environment in the test data. These suggestions were compared with the original errors in the data and their known corrections to evaluate BERT's performance. The suggestions were further analyzed to determine if BERT more often agreed with the judgements of the Wikipedia editors. Both the untrained and fine-tuned models were compared. Fine-tuning led to a greater rate of error detection, which significantly improved recall, but lowered precision due to an increase in false positives, or falsely flagged errors. However, in most cases, these false positives were not errors in preposition usage but merely cases where more than one preposition was possible. Furthermore, when BERT correctly identified an error, the model largely agreed with the Wikipedia editors, suggesting that BERT's ability to detect misused prepositions is better than previously believed. To evaluate to what extent BERT's false positives were grammatical suggestions, we plan to do a further crowd-sourcing study to test the grammaticality of BERT's suggested sentence corrections against native speakers' judgments.
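
A minimal sketch of masking a preposition slot and ranking a BERT model's candidates is shown below; it only mirrors the general idea of the masking approach described, not the authors' exact multi-level masking procedure or data. The model name, example sentence and preposition list are illustrative assumptions.

```python
# Sketch: mask a preposition position and keep only prepositional suggestions.
from transformers import pipeline

PREPOSITIONS = {"in", "on", "at", "to", "for", "of", "with", "by", "about", "from"}

fill = pipeline("fill-mask", model="bert-base-uncased")

def suggest_preposition(sentence_with_mask, top_k=10):
    """sentence_with_mask uses [MASK] where the preposition should be."""
    candidates = fill(sentence_with_mask, top_k=top_k)
    # Keep only suggestions that are actually prepositions
    return [(c["token_str"].strip(), round(c["score"], 3))
            for c in candidates if c["token_str"].strip() in PREPOSITIONS]

# Example: comparing BERT's ranked prepositions against the writer's choice
print(suggest_preposition("She arrived [MASK] the airport two hours early."))
```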

Keywords: BERT, grammatical error correction, preposition error detection, prepositions

Procedia PDF Downloads 129
841 Processing and Evaluation of Jute Fiber Reinforced Hybrid Composites

Authors: Mohammad W. Dewan, Jahangir Alam, Khurshida Sharmin

Abstract:

Synthetic fibers (carbon, glass, aramid, etc.) are generally utilized to make composite materials for better mechanical and thermal properties. However, they are expensive and non-biodegradable. From the perspective of Bangladesh, jute fibers are available, inexpensive, and possess good mechanical properties. The favorable properties (i.e., low cost, low density, eco-friendliness) of natural fibers have made them a promising reinforcement in hybrid composites without sacrificing mechanical properties. In this study, jute and e-glass fiber reinforced hybrid composite materials are fabricated utilizing hand lay-up followed by a compression molding technique. A room-temperature-cured two-part epoxy resin is used as the matrix. Approximately 6-7 mm thick composite panels are fabricated utilizing 17 layers of woven glass and jute fibers with different fiber layering sequences: only jute, only glass, glass and jute alternating (g/j/g/j...), and 4 glass - 9 jute - 4 glass (4g-9j-4g). The fabricated composite panels are analyzed through fiber volume calculation, tensile tests, bending tests, and water absorption tests. The hybridization of jute and glass fiber results in better tensile, bending, and water absorption properties than only jute fiber-reinforced composites, but inferior properties compared to only glass fiber reinforced composites. Among the different fiber layering sequences, the 4g-9j-4g layering sequence resulted in better tensile, bending, and water absorption properties. The effects of chemical treatment of the woven jute fiber and of chopped glass microfiber infusion are also investigated in this study. The hybrid composite with chemically treated jute fiber and 2 wt.% chopped glass microfiber infusion shows about 12% improvement in flexural strength compared to the untreated hybrid composite panel without microfiber infusion. However, fiber chemical treatment and micro-filler do not have a significant effect on tensile strength.
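
The fiber volume calculation mentioned in the abstract is commonly done from constituent masses and densities; the sketch below shows that bookkeeping. The densities and masses used are generic, hypothetical values, not measurements from this work.

```python
# Rough fiber volume fraction estimate from constituent masses and densities.
# All numbers are illustrative assumptions, not data from the study.
DENSITY = {"e_glass": 2.55, "jute": 1.45, "epoxy": 1.15}  # g/cm^3

def volume_fractions(m_glass, m_jute, m_epoxy):
    v = {
        "e_glass": m_glass / DENSITY["e_glass"],
        "jute": m_jute / DENSITY["jute"],
        "epoxy": m_epoxy / DENSITY["epoxy"],
    }
    total = sum(v.values())
    return {k: round(vi / total, 3) for k, vi in v.items()}

# Hypothetical 4g-9j-4g panel: glass plies, jute plies and resin masses in grams
print(volume_fractions(m_glass=320, m_jute=270, m_epoxy=410))
```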

Keywords: compression molding, chemical treatment, hybrid composites, mechanical properties

Procedia PDF Downloads 143
840 Eucalyptus camaldulensis Leaves Attacked by the Gall Wasp Leptocybe invasa: A Phyto-Volatile Constituents Study

Authors: Maged El-Sayed Mohamed

Abstract:

Eucalyptus camaldulensis is one of the most well-known species of the genus Eucalyptus in the Middle East; its importance relies on the high production of its unique volatile constituents, which exhibit many medicinal and pharmacological activities. The gall-forming wasp (Leptocybe invasa) has recently emerged as the main pest attacking E. camaldulensis and causing severe injury. The wasp lays its eggs in the petiole and midrib of leaves and stems of young shoots of E. camaldulensis, which leads to gall formation. Gall formation by L. invasa damages the growing shoots and leaves of Eucalyptus, resulting in abscission of leaves and drying. Aim: This study is an attempt to investigate the effect of the gall wasp (Leptocybe invasa) attack on the volatile constituents of E. camaldulensis. This could help in the control of this wasp through stimulating plant defenses or the production of new allelochemicals or insecticides. The study of the volatile constituents of Eucalyptus before and after attack by the wasp can help the re-use and recycling of the infected Eucalyptus trees for new pharmacological and medicinal activities. Methodology: Fresh gall wasp-attacked and healthy leaves (100 g each) were cut and immediately subjected to hydrodistillation using a Clevenger-type apparatus for 3 hours. The volatile fractions isolated were analyzed using gas chromatography/mass spectrometry (GC/MS). Kovats retention indices (RI) were calculated with respect to a set of co-injected standard hydrocarbons (C10-C28). Compounds were identified by comparing their spectral data and retention indices with the Wiley Registry of Mass Spectral Data 10th edition (April 2013), the NIST 11 Mass Spectral Library (NIST11/2011/EPA/NIH) and literature data. Results: Fifty-nine components representing 89.13 and 88.60% of the total volatile fraction content, respectively, were quantitatively analyzed. Twenty-six major compounds at an average concentration greater than 0.1 ± 0.02% have been used for the statistical comparison. Of those major components, twenty-one were found in both the attacked and healthy Eucalyptus leaves' fractions in different concentrations, and five components, the monoterpene p-mentha-2,4(8)-diene and the sesquiterpenes δ-elemene, β-elemene, E-caryophyllene and bicyclogermacrene, were unique and only produced in the attacked-leaves' fraction. Conclusion: Newly produced components, or those commonly found in the volatile fraction but changed in concentration, could represent a part of the plant defense mechanisms or might be an element of the plant allelopathic and communication mechanisms. Identification of the components of the gall wasp-damaged leaves can help in their recycling for different physiological, pharmacological and medicinal uses.
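
The retention indices referred to above are computed relative to the co-injected n-alkane series (C10-C28). A short sketch of the standard linear (temperature-programmed) retention index formula is given below; the retention times are hypothetical and only illustrate the calculation.

```python
# Linear retention index relative to the bracketing n-alkanes, as used with a
# co-injected C10-C28 series. Retention times here are hypothetical examples.
def linear_retention_index(t_x, t_n, t_n1, n_carbons):
    """t_x: analyte retention time; t_n, t_n1: retention times of the n-alkanes
    with n_carbons and n_carbons + 1 carbons that bracket the analyte."""
    return 100 * (n_carbons + (t_x - t_n) / (t_n1 - t_n))

# Example: an analyte eluting between C14 (14.20 min) and C15 (15.90 min)
print(round(linear_retention_index(t_x=15.05, t_n=14.20, t_n1=15.90, n_carbons=14)))  # ~1450
```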

Keywords: Eucalyptus camaldulensis, eucalyptus recycling, gall wasp, Leptocybe invasa, plant defense mechanisms, Terpene fraction

Procedia PDF Downloads 342
839 The Impact of Sign Language on Generating and Maintaining a Mental Image

Authors: Yi-Shiuan Chiu

Abstract:

Deaf signers have been found to have better mental image performance than hearing nonsigners. The goal of this study was to investigate the ability to generate mental images, to maintain them, and to manipulate them in deaf signers of Taiwanese Sign Language (TSL). In the visual image task, participants first memorized digits formed within a 4 × 5 grid of cells. After the presentation of a cue, a Chinese digit character shown at the top of a blank grid, participants had to form an image of the corresponding digit. When a probe, a grid containing a red circle, was shown, participants had to decide as quickly as possible whether the probe would have been covered by the mental image of the digit. The ISI (interstimulus interval) between cue and probe was manipulated. In experiment 1, 24 deaf signers and 24 hearing nonsigners were asked to perform image generation tasks (ISI: 200, 400 ms) and image maintenance tasks (ISI: 800, 2000 ms). The results showed that deaf signers had an enhanced ability to generate and maintain a mental image. To explore the mental imagery process, in experiment 2, 30 deaf signers and 30 hearing nonsigners were asked to perform visual search while maintaining a mental image. Between the digit image cue and the red circle probe, participants carried out a visual search task to determine whether a target triangle's apex was directed to the right or left. When there was only one triangle in the search task, the results showed that both deaf signers and hearing non-signers had similar visual search performance, in which search targets at the mental image locations were facilitated. However, deaf signers could maintain better and faster mental image performance than nonsigners. In experiment 3, we increased the number of triangles to 4 to raise the difficulty of the visual search task. The results showed that deaf participants performed more accurately in the visual search and image maintenance tasks. The results suggested that people may use eye movements as a mnemonic strategy to maintain the mental image, and that deaf signers had an enhanced ability to resist the interference of eye movements in the situation with fewer distractors. In sum, these findings suggested that deaf signers had enhanced mental image processing.

Keywords: deaf signers, image maintenance, mental image, visual search

Procedia PDF Downloads 141
838 River Network Delineation from Sentinel 1 Synthetic Aperture Radar Data

Authors: Christopher B. Obida, George A. Blackburn, James D. Whyatt, Kirk T. Semple

Abstract:

In many regions of the world, especially in developing countries, river network data are outdated or completely absent, yet such information is critical for supporting important functions such as flood mitigation efforts, land use and transportation planning, and the management of water resources. In this study, a method was developed for delineating river networks using Sentinel 1 imagery. Unsupervised classification was applied to multi-temporal Sentinel 1 data to discriminate water bodies from other land covers; the outputs were then combined to generate a single persistent water body product. A thinning algorithm was then used to delineate river centre lines, which were converted into vector features and built into a topologically structured geometric network. The complex river system of the Niger Delta was used to compare the performance of the Sentinel-based method against alternative freely available water body products from the United States Geological Survey, the European Space Agency and OpenStreetMap, and against a river network derived from a Shuttle Radar Topography Mission Digital Elevation Model. From both raster-based and vector-based accuracy assessments, it was found that the Sentinel-based river network products were superior to the comparator data sets by a substantial margin. The geometric river network that was constructed permitted a flow routing analysis, which is important for a variety of environmental management and planning applications. The extracted network will potentially be applied for modelling the dispersion of hydrocarbon pollutants in Ogoniland, a part of the Niger Delta. The approach developed in this study holds considerable potential for generating up-to-date, detailed river network data for the many countries where such data are deficient.
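
The "persistent water mask to thinned centre line" step can be sketched with standard morphological operations, for instance with scikit-image as below. The toy mask stands in for the classified multi-temporal Sentinel-1 water product; the specific functions and parameters are illustrative assumptions rather than the authors' implementation.

```python
# Sketch: thin a persistent water mask to one-pixel-wide river centre lines.
import numpy as np
from skimage.morphology import skeletonize, remove_small_objects

# Toy persistent-water mask (True = water in all epochs); the real input would be
# a georeferenced raster derived from the classified Sentinel-1 stack.
water = np.zeros((60, 60), dtype=bool)
water[28:33, :] = True          # a horizontal "river" five pixels wide
water[:, 28:33] = True          # a crossing tributary

clean = remove_small_objects(water, min_size=20)   # drop speckle-like false positives
centrelines = skeletonize(clean)                   # one-pixel-wide network

print("water pixels:", int(clean.sum()), "-> centreline pixels:", int(centrelines.sum()))
# The centreline raster would then be vectorised and built into a geometric network.
```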

Keywords: Sentinel 1, image processing, river delineation, large scale mapping, data comparison, geometric network

Procedia PDF Downloads 125
837 Enhanced Disk-Based Databases towards Improved Hybrid in-Memory Systems

Authors: Samuel Kaspi, Sitalakshmi Venkatraman

Abstract:

In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks and distributed transaction costs, disk-based data stores still serve as the primary persistence layer. In addition, with the recent growth in multi-tenancy cloud applications and associated security concerns, many organizations consider the trade-offs and continue to require fast and reliable transaction processing of disk-based database systems as an available choice. For these organizations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems that would help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance that we call enhanced memory access (EMA) can be used to allow very high levels of concurrency in the pre-fetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes a novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. The promising results of this work show that enhanced disk-based systems facilitate improved hybrid data management within the broader context of in-memory systems.
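
The abstract's EMA technique itself is not spelled out here, so the following is only a generic toy illustration of the underlying idea of prefetching a transaction's read set from disk before execution, so that the execution phase touches only memory. It is not the authors' algorithm; latencies and page identifiers are invented for the sketch.

```python
# Toy pre-fetch-then-execute sketch (NOT the authors' EMA algorithm).
import time
from concurrent.futures import ThreadPoolExecutor

DISK_LATENCY = 0.005   # simulated per-page disk read, seconds (assumed)
disk = {page: f"data-{page}" for page in range(1000)}
cache = {}

def fetch_page(page):
    time.sleep(DISK_LATENCY)     # simulated I/O wait
    cache[page] = disk[page]

def run_transaction(read_set):
    # Prefetch all needed pages concurrently (I/O overlapped), then execute from cache,
    # so the execution phase sees memory-speed accesses only.
    with ThreadPoolExecutor(max_workers=8) as pool:
        pool.map(fetch_page, read_set)
    return [cache[p] for p in read_set]

print(run_transaction(read_set=[3, 17, 256, 511]))
```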

Keywords: in-memory database, disk-based system, hybrid database, concurrency control

Procedia PDF Downloads 398
836 Level Set Based Extraction and Update of Lake Contours Using Multi-Temporal Satellite Images

Authors: Yindi Zhao, Yun Zhang, Silu Xia, Lixin Wu

Abstract:

The contours and areas of water surfaces, especially lakes, often change due to natural disasters and construction activities. Extracting and updating water contours from satellite images using image processing algorithms is an effective way to track these changes. However, producing optimal water surface contours that are close to the true boundaries is still a challenging task. This paper compares the performances of three different level set models, including the Chan-Vese (CV) model, the signed pressure force (SPF) model, and the region-scalable fitting (RSF) energy model, for extracting lake contours. Experimental testing indicated that the RSF model, in which a region-scalable fitting energy functional is defined and incorporated into a variational level set formulation, is superior to CV and SPF, and it can produce desirable contour lines when there are "holes" in the water regions, such as islands in a lake. Therefore, the RSF model is applied to extracting lake contours from Landsat satellite images. Four temporal Landsat satellite images from the years 2000, 2005, 2010, and 2014 are used in our study. All of them were acquired in May, with the same path/row (121/036) covering Xuzhou City, Jiangsu Province, China. Firstly, the near infrared (NIR) band is selected for water extraction. Image registration is conducted on the NIR bands of the different temporal images for information update, and linear stretching is also done in order to distinguish water from other land cover types. Then, for the first temporal image acquired in 2000, lake contours are extracted via the RSF model with initialization by user-defined rectangles. Afterwards, using the lake contours extracted from the previous temporal image as the initial values, lake contours are updated for the current temporal image by means of the RSF model. Meanwhile, the changed and unchanged lakes are also detected. The results show that great changes have taken place in two lakes, i.e. Dalong Lake and Panan Lake, and that RSF can effectively extract and update lake contours using multi-temporal satellite images.
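
The "extract, then re-use the previous contour as initialisation" loop can be sketched as below. scikit-image does not ship the RSF model, so morphological Chan-Vese is used here purely as a stand-in; the file names, image size and rectangle seed are assumptions for illustration.

```python
# Sketch of contour extraction and update with the previous result as initialisation.
import numpy as np
from skimage import io, exposure
from skimage.segmentation import morphological_chan_vese

def extract_lake(nir_path, init_level_set):
    nir = io.imread(nir_path).astype(float)
    nir = exposure.rescale_intensity(nir, out_range=(0, 1))   # linear stretch
    return morphological_chan_vese(nir, 200, init_level_set=init_level_set)

# First epoch: a user-defined rectangle serves as the initial contour
init = np.zeros((1024, 1024), dtype=np.int8)   # assumed image size
init[400:700, 300:650] = 1
mask_2000 = extract_lake("landsat_2000_nir.tif", init)          # hypothetical file name

# Later epochs: the previous result initialises the current extraction (contour update)
mask_2005 = extract_lake("landsat_2005_nir.tif", mask_2000)     # hypothetical file name
changed = np.logical_xor(mask_2000.astype(bool), mask_2005.astype(bool))  # changed lake area
```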

Keywords: level set model, multi-temporal image, lake contour extraction, contour update

Procedia PDF Downloads 353
835 E4D-MP: Time-Lapse Multiphysics Simulation and Joint Inversion Toolset for Large-Scale Subsurface Imaging

Authors: Zhuanfang Fred Zhang, Tim C. Johnson, Yilin Fang, Chris E. Strickland

Abstract:

A variety of geophysical techniques are available to image the opaque subsurface with little or no contact with the soil. It is common to conduct time-lapse surveys of different types for a given site for improved results of subsurface imaging. Regardless of the chosen survey methods, it is often a challenge to process the massive amount of survey data. The currently available software applications are generally based on one-dimensional assumptions and designed for a desktop personal computer. Hence, they are usually incapable of imaging three-dimensional (3D) processes/variables in the subsurface at reasonable spatial scales; the maximum amount of data that can be inverted simultaneously is often very small due to the capability limitations of personal computers. Presently, high-performance, integrating software that enables real-time integration of multi-process geophysical methods is needed. E4D-MP enables the integration and inversion of time-lapse, large-scale survey data from geophysical methods. Using supercomputing capability and parallel computation algorithms, E4D-MP is capable of processing data across vast spatiotemporal scales and in near real time. The main code and the modules of E4D-MP for inverting individual or combined data sets of time-lapse 3D electrical resistivity, spectral induced polarization, and gravity surveys have been developed and demonstrated for sub-surface imaging. E4D-MP provides the capability of imaging the processes (e.g., liquid or gas flow, solute transport, cavity development) and subsurface properties (e.g., rock/soil density, conductivity) critical for the successful control of environmental engineering related efforts such as environmental remediation, carbon sequestration, geothermal exploration, and mine land reclamation, among others.

Keywords: gravity survey, high-performance computing, sub-surface monitoring, electrical resistivity tomography

Procedia PDF Downloads 140
834 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs

Authors: Junaid Mahmood Alam

Abstract:

Revalidating and harmonizing clinical chemistry analytical principles and optimizing methods through quality control programs and assessments is the preeminent means to attain optimal outcomes within clinical laboratory services. The present study reports the revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I) and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunological analyzers, namely Hitachi 912, Cobas 6000 e601, Cobas c501 and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electro-chemiluminescence immunoassay (ECLi) techniques. The process of validation and revalidation was completed by evaluating and assessing the precision-analyzed Preci-control data of the various instruments, plotted against each other with regression analysis (R²). Results showed that revalidation and optimization of the respective parameters, which were accredited through CAP, CLSI and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for the analyses. The regression R² for BilT was 0.996, whereas those of ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 were 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of the analytical methods and instrumentation, thus revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or any new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals. This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care that will guarantee maximum patient confidence.
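
The between-analyzer regression step described above (paired quality-control results from two instruments regressed against each other, with R² reported) can be sketched as follows. The ALT control values used are hypothetical placeholders purely to show the calculation, not study data.

```python
# Sketch: regression of paired quality-control results from two analyzers with R^2.
import numpy as np

hitachi_912 = np.array([32.0, 41.5, 55.2, 61.0, 74.8, 88.3])   # ALT, U/L (assumed)
cobas_c501  = np.array([31.4, 42.1, 54.0, 62.2, 75.9, 87.1])   # ALT, U/L (assumed)

slope, intercept = np.polyfit(hitachi_912, cobas_c501, 1)
pred = slope * hitachi_912 + intercept
ss_res = np.sum((cobas_c501 - pred) ** 2)
ss_tot = np.sum((cobas_c501 - cobas_c501.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"slope={slope:.3f}, intercept={intercept:.2f}, R^2={r2:.3f}")
```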

Keywords: revalidation, standardized, IFCC, CAP, harmonized

Procedia PDF Downloads 248
833 Molecular Topology and TLC Retention Behaviour of s-Triazines: QSRR Study

Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević

Abstract:

Quantitative structure-retention relationship (QSRR) analysis was used to predict the chromatographic behavior of s-triazine derivatives by using theoretical descriptors computed from the chemical structure. The fundamental basis of the reported investigation is to relate molecular topological descriptors to the chromatographic behavior of s-triazine derivatives obtained by reversed-phase (RP) thin layer chromatography (TLC) on silica gel impregnated with paraffin oil, using ethanol-water (φ = 0.5-0.8; v/v) as the mobile phase. The retention parameter (RM0) of the 14 investigated s-triazine derivatives was used as the dependent variable, while simple connectivity indices of different orders were used as independent variables. The best QSRR model for predicting the RM0 value was obtained with the simple third-order connectivity index (3χ) in a second-degree polynomial equation. The numerical values of the correlation coefficient (r = 0.915), Fisher's value (F = 28.34) and root mean square error (RMSE = 0.36) indicate that the model is statistically significant. In order to test the predictive power of the QSRR model, the leave-one-out cross-validation technique was applied. The parameters of the internal cross-validation analysis (r2CV = 0.79, r2adj = 0.81, PRESS = 1.89) reflect the high predictive ability of the generated model and confirm that it can be used to predict the RM0 value. A multivariate classification technique, hierarchical cluster analysis (HCA), was applied in order to group the molecules according to their molecular connectivity indices. HCA is a descriptive statistical method and is most frequently used for an important area of data processing, namely classification. The HCA performed on the simple molecular connectivity indices obtained from the 2D structures of the investigated s-triazine compounds resulted in two main clusters in which the compounds were grouped according to the number of atoms in the molecule. This is in agreement with the fact that these descriptors were calculated on the basis of the number of atoms in the molecule of the investigated s-triazine derivatives.
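
The reported workflow (second-degree polynomial of RM0 against the third-order connectivity index, checked by leave-one-out cross-validation) can be sketched as below. The fourteen (3χ, RM0) pairs are invented placeholders, not the paper's data; only the fitting and validation mechanics are illustrated.

```python
# Sketch: degree-2 polynomial QSRR model with leave-one-out cross-validation.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_predict

chi3 = np.linspace(2.0, 5.5, 14).reshape(-1, 1)   # third-order connectivity index (placeholder)
rm0 = 0.4 * chi3.ravel() ** 2 - 1.2 * chi3.ravel() \
      + np.random.default_rng(0).normal(0, 0.2, 14)   # placeholder RM0 values

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(chi3, rm0)
r2_fit = model.score(chi3, rm0)

loo_pred = cross_val_predict(model, chi3, rm0, cv=LeaveOneOut())
press = np.sum((rm0 - loo_pred) ** 2)                          # prediction error sum of squares
r2_cv = 1 - press / np.sum((rm0 - rm0.mean()) ** 2)

print(f"r2(fit)={r2_fit:.3f}, PRESS={press:.2f}, r2(LOO)={r2_cv:.3f}")
```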

Keywords: s-triazines, QSRR, chemometrics, chromatography, molecular descriptors

Procedia PDF Downloads 377
832 Influence of Biochar Application on Growth, Dry Matter Yield and Nutrition of Corn (Zea mays L.) Grown on Sandy Loam Soils of Gujarat, India

Authors: Pravinchandra Patel

Abstract:

Sustainable agriculture in sandy loam soil generally faces large constraints due to low water holding and nutrient retention capacity, and accelerated mineralization of soil organic matter. There is a need to increase soil organic carbon in the soil for higher crop productivity and soil sustainability. Recently, biochar has been considered a sixth element and works as a catalyst for increasing crop yield, soil fertility, soil sustainability and the mitigation of climate change. Biochar was generated at the Sansoli Farm of Anand Agricultural University, Gujarat, India by pyrolysis at temperatures of 250-400°C in the absence of oxygen, using a slow chemical process (using two kilns), from corn stover (Zea mays L.), cluster bean stover (Cyamopsis tetragonoloba) and Prosopis juliflora wood. There were 16 treatments: 4 organic sources (3 biochars, namely corn stover biochar (MS), cluster bean stover (CB) and Prosopis juliflora wood (PJ), and one farmyard manure, FYM) with two rates of application (5 and 10 metric tons/ha), giving eight organic-source treatments. Eight organic-source treatments were applied with the recommended dose of fertilizers (RDF) (80-40-0 kg/ha N-P-K), while the remaining eight were kept without RDF. Application of corn stover biochar @ 10 metric tons/ha along with RDF (RDF+MS) increased dry matter (DM) yield, crude protein (CP) yield, chlorophyll content and plant height (at 30 and 60 days after sowing) more than the CB and PJ biochars and FYM. Nutrient uptake of P, K, Ca, Mg, S and Cu was significantly increased with the application of RDF + corn stover biochar @ 10 metric tons/ha, while uptake of N and Mn was significantly increased with RDF + corn stover biochar @ 5 metric tons/ha. It was found that soil application of corn stover biochar @ 10 metric tons/ha along with the recommended dose of chemical fertilizers (RDF+MS) exhibited the highest impact in obtaining significantly higher dry matter and crude protein yields and larger removal of nutrients from the soil, and it was also beneficial for the build-up of nutrients in the soil. It also showed significantly higher organic carbon content and cation exchange capacity in the sandy loam soil. The lower dose of corn stover biochar @ 5 metric tons/ha (RDF+MS) also remained the second highest for increasing dry matter and crude protein yields of the forage corn crop, which ultimately resulted in larger removals of nutrients from the soil. This study highlights the importance of mixing biochar with the recommended dose of fertilizers and its synergistic effect on sandy loam soil nutrient retention, organic carbon content and water holding capacity, and hence the amendment value of biochar in sandy loam soil.

Keywords: biochar, corn yield, plant nutrient, fertility status

Procedia PDF Downloads 129
831 An Evaluation of the Influence of Corn Cob Ash on the Strength Parameters of Lateritic Soils

Authors: O. A. Apampa, Y. A. Jimoh

Abstract:

The paper reports the investigation of corn cob ash (CCA) as a chemical stabilizing agent for laterite soils. Corn cob feedstock was obtained from Maya, a rural community in the derived savannah agro-ecological zone of South-Western Nigeria, and burnt to ashes of pozzolanic quality. Reddish-brown silty clayey sand material, characterized as AASHTO A-2-6(3) lateritic material, was obtained from a borrow pit in Abeokuta and subjected to strength characterization tests according to BS 1377: 2000. The soil was subsequently mixed with CCA in varying percentages of 0-7.5% at 1.5% intervals. The influence of CCA on the stabilized soil was determined for the Atterberg limits, compaction characteristics, CBR and unconfined compression strength. The tests were repeated on a laterite soil-cement mixture in order to establish a basis for comparison. The results show a similarity in the compaction characteristics of soil-cement and soil-CCA. With increasing addition of binder from 1.5% to 7.5%, the maximum dry density progressively declined while the OMC steadily increased. For the CBR, the maximum positive impact was observed at 1.5% CCA addition, at a value of 85% compared to the control value of 65% for the cement stabilization, but it declined steadily thereafter with increasing addition of CCA, while that of soil-cement continued to increase with increasing addition of cement beyond 1.5%, though at a relatively slow rate. Similar behavior was observed in the UCS values for the soil-CCA mix, increasing from a control value of 0.4 MN/m² to 1.0 MN/m² at 1.5% CCA and declining thereafter, while that for soil-cement continued to increase with increasing cement addition, but at a slower rate. This paper demonstrates that CCA is effective for the chemical stabilization of a typical Nigerian AASHTO A-2-6 lateritic soil at a maximum stabilizer content of 1.5%, and therefore recommends its use as a way of finding further applications for agricultural waste products and achieving environmental sustainability in line with the ideals of the millennium development goals, given the economic and technical feasibility of processing the cobs from corn.

Keywords: corn cob ash, pozzolan, cement, laterite, stabilizing agent, cation exchange capacity

Procedia PDF Downloads 279
830 Corn Flakes Produced from Different Cultivars of Zea mays as a Functional Product

Authors: Milenko Košutić, Jelena Filipović, Zvonko Nježić

Abstract:

Extrusion technology is a thermal processing method applied to improve the nutritional, hygienic, and physical-chemical characteristics of the raw material. Overall, the extrusion process is an efficient method for the production of a wide range of food products. It combines heat, pressure, and shear to transform raw materials into finished goods with desired textures, shapes, and nutritional profiles. The quality of extruded products is remarkably dependent upon feed material composition, barrel temperature profile, feed moisture content, screw speed, and other extrusion system parameters. Given consumer expectations for snack foods, a high expansion index and low bulk density, in addition to a crunchy texture and uniform microstructure, are desired. This paper investigates the simultaneous effects of different types of corn (white corn, yellow corn, red corn, and black corn) and different screw speeds (350, 500, 650 rpm) on the physical, technological, and functional properties of flake products. Black corn flour and a screw speed of 350 rpm positively influenced the physical and technological characteristics, mineral composition, and antioxidant properties of the flake products, with the best total score of the analysis being 0.59. Overall, the combination of Tukey's HSD test and PCA enables a comprehensive analysis of the observed corn products, allowing them to be differentiated. This research aims to analyze the influence of different types of corn flour (white corn, yellow corn, red corn, and black corn) on the nutritive and sensory properties of the product (quality, texture, and color), as well as the acceptance of the new product by consumers in the territory of Novi Sad. The presented data indicate that the investigated corn flakes made from black corn flour at 350 rpm are a product with good physical-technological and functional properties due to a higher level of antioxidant activity.
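
The statistical comparison mentioned above (Tukey's HSD across corn types) can be sketched as follows; the measurements are invented placeholders used only to show the mechanics of the pairwise comparison.

```python
# Sketch: Tukey HSD comparison of a product property (e.g., expansion index) by corn type.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

values = np.array([3.1, 3.0, 3.2,   # white  (placeholder replicates)
                   2.8, 2.9, 2.7,   # yellow
                   2.5, 2.6, 2.4,   # red
                   2.9, 3.0, 2.8])  # black
groups = ["white"] * 3 + ["yellow"] * 3 + ["red"] * 3 + ["black"] * 3

print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```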

Keywords: corn types, flakes product, nutritive quality, acceptability

Procedia PDF Downloads 38
829 Enhancing Health Information Management with Smart Rings

Authors: Bhavishya Ramchandani

Abstract:

A smart ring is a small electronic device that is worn on the finger. It incorporates mobile technology and has features that make the device simple to use. These gadgets, which resemble conventional rings and are usually made to fit on the finger, are outfitted with features including access management, gesture control, mobile payment processing, and activity tracking. Poor sleep patterns, irregular schedules, and bad eating habits are all part of the health problems that a lot of people are facing today. Diets lacking fruits, vegetables, legumes, nuts, and whole grains are common. Individuals in India also experience metabolic issues. In the medical field, smart rings will help patients with problems relating to stomach illnesses and the inability to consume meals that are tailored to their bodies' needs. The smart ring tracks bodily functions, including blood sugar and glucose levels, and presents the information instantly. Based on these data, the ring generates insights and a workable layout that the body will find ideal. In addition, we conducted focus groups and individual interviews as part of our core approach and discussed the difficulties participants are having in maintaining the right diet, as well as whether or not the smart ring would be beneficial to them. However, everyone was very enthusiastic about and supportive of the concept of using smart rings in healthcare, and they believed that these rings may assist them in maintaining their health and having a well-balanced diet plan. This response came from the primary data, and working on the Emerging Technology Canvas Analysis of smart rings in healthcare has also led to a significant improvement in our understanding of the technology's application in the medical field. It is believed that there will be a growing demand for smart healthcare as people become more conscious of their health. The majority of individuals will eventually utilize this ring after three to four years, when demand for it will have increased. Their daily lives will be significantly impacted by it.

Keywords: smart ring, healthcare, electronic wearable, emerging technology

Procedia PDF Downloads 48
828 Algorithm for Automatic Real-Time Electrooculographic Artifact Correction

Authors: Norman Sinnigen, Igor Izyurov, Marina Krylova, Hamidreza Jamalabadi, Sarah Alizadeh, Martin Walter

Abstract:

Background: EEG is a non-invasive brain activity recording technique with a high temporal resolution that allows the use of real-time applications, such as neurofeedback. However, EEG data are susceptible to electrooculographic (EOG) and electromyographic (EMG) artifacts (i.e., jaw clenching, teeth squeezing and forehead movements). Due to their non-stationary nature, these artifacts greatly obscure the information and power spectrum of EEG signals. Many EEG artifact correction methods are too time-consuming when applied to low-density EEG and have focused on offline processing or on handling a single type of EEG artifact. A software-only real-time method for correcting multiple types of EEG artifacts in high-density EEG remains a significant challenge. Methods: We demonstrate an improved approach for automatic real-time EEG artifact correction of EOG and EMG artifacts. The method was tested on three healthy subjects using 64 EEG channels (Brain Products GmbH) and a sampling rate of 1,000 Hz. Captured EEG signals were imported into MATLAB with the lab streaming layer interface, allowing buffering of EEG data. EMG artifacts were detected by channel variance and adaptive thresholding and corrected by using channel interpolation. Real-time independent component analysis (ICA) was applied for correcting EOG artifacts. Results: Our results demonstrate that the algorithm effectively reduces EMG artifacts, such as jaw clenching, teeth squeezing and forehead movements, and EOG artifacts (horizontal and vertical eye movements) in high-density EEG while preserving brain neuronal activity information. The average computation time of EOG and EMG artifact correction for 80 s (80,000 data points) of 64-channel data is 300 - 700 ms, depending on the convergence of ICA and the type and intensity of the artifact. Conclusion: An automatic EEG artifact correction algorithm based on channel variance, adaptive thresholding, and ICA improves high-density EEG recordings contaminated with EOG and EMG artifacts in real time.
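
For readers wanting to reproduce the ICA-based EOG-correction step offline, a minimal sketch with MNE-Python is shown below. The authors' pipeline is a real-time MATLAB implementation, so this is only an offline illustration; the file name and EOG channel name are assumptions.

```python
# Offline sketch of ICA-based EOG artifact removal (stand-in for the real-time pipeline).
import mne

raw = mne.io.read_raw_brainvision("subject01.vhdr", preload=True)   # hypothetical 64-ch recording
raw.filter(l_freq=1.0, h_freq=None)          # high-pass filtering helps the ICA decomposition

ica = mne.preprocessing.ICA(n_components=20, random_state=42)
ica.fit(raw)

# Components correlating with the EOG (or a frontal proxy) channel are flagged and removed
eog_inds, scores = ica.find_bads_eog(raw, ch_name="EOG")            # channel name assumed
ica.exclude = eog_inds
raw_clean = ica.apply(raw.copy())
```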

Keywords: EEG, muscle artifacts, ocular artifacts, real-time artifact correction, real-time ICA

Procedia PDF Downloads 155
826 Comparison of Virtual Non-Contrast to True Non-Contrast Images Using Dual-Layer Spectral Computed Tomography

Authors: O’Day Luke

Abstract:

Purpose: To validate virtual non-contrast reconstructions generated from dual-layer spectral computed tomography (DL-CT) data as an alternative to the acquisition of a dedicated true non-contrast dataset during multiphase contrast studies. Material and methods: Thirty-three patients underwent a routine multiphase clinical CT examination, using dual-layer spectral CT, from March to August 2021. True non-contrast (TNC) and virtual non-contrast (VNC) datasets, generated from both portal venous and arterial phase imaging, were evaluated. For every patient in both true and virtual non-contrast datasets, a region of interest (ROI) was defined in the aorta, liver, fluid (i.e. gallbladder, urinary bladder), kidney, muscle, fat and spongious bone, resulting in 693 ROIs. Differences in attenuation between VNC and TNC images were compared, both separately and combined. Consistency between VNC reconstructions obtained from the arterial and portal venous phases was evaluated. Results: Comparison of CT density (HU) on the VNC and TNC images showed a high correlation. The mean difference between TNC and VNC images (excluding bone results) was 5.5 ± 9.1 HU, and > 90% of all comparisons showed a difference of less than 15 HU. For all tissues but spongious bone, the mean absolute difference between TNC and VNC images was below 10 HU. VNC images derived from the arterial and the portal venous phase showed a good correlation in most tissue types. The aortic attenuation was, however, somewhat dependent on which dataset was used for reconstruction. Bone evaluation with VNC datasets continues to be a problem, as spectral CT algorithms are currently poor at differentiating bone and iodine. Conclusion: Given the increasing availability of DL-CT and the proven accuracy of virtual non-contrast processing, VNC is a promising tool for generating additional data during routine contrast-enhanced studies. This study shows the utility of virtual non-contrast scans as an alternative to true non-contrast studies during multiphase CT, with potential for dose reduction without loss of diagnostic information.
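
The agreement summary reported above (mean difference, mean absolute difference, and the share of ROIs within 15 HU) is a straightforward paired comparison; a sketch is given below with invented placeholder HU pairs rather than study data.

```python
# Sketch: VNC-vs-TNC agreement summary over paired ROI attenuation values.
import numpy as np

tnc = np.array([42.0, 55.3, 8.1, 30.2, 48.7, -95.0, 35.5])   # true non-contrast HU (placeholders)
vnc = np.array([47.2, 60.1, 4.9, 33.8, 54.0, -90.3, 41.0])   # virtual non-contrast HU (placeholders)

diff = vnc - tnc
print(f"mean difference  : {diff.mean():.1f} +/- {diff.std(ddof=1):.1f} HU")
print(f"mean |difference|: {np.abs(diff).mean():.1f} HU")
print(f"within 15 HU     : {100 * np.mean(np.abs(diff) < 15):.0f}% of ROIs")
```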

Keywords: dual-layer spectral computed tomography, virtual non-contrast, true non-contrast, clinical comparison

Procedia PDF Downloads 130
826 Design of Traffic Counting Android Application with Database Management System and Its Comparative Analysis with Traditional Counting Methods

Authors: Muhammad Nouman, Fahad Tiwana, Muhammad Irfan, Mohsin Tiwana

Abstract:

Traffic congestion has been increasing significantly in major metropolitan areas as a result of increased motorization, urbanization, population growth and changes in urban density. Traffic congestion compromises the efficiency of transport infrastructure and causes multiple traffic concerns, including but not limited to increased travel time, safety hazards, air pollution, and fuel consumption. Traffic management has become a serious challenge for federal and provincial governments, as well as for exasperated commuters. Effective, flexible, efficient and user-friendly traffic information/database management systems characterize traffic conditions by making use of traffic counts for storage, processing, and visualization. While emerging data collection technologies continue to proliferate, their accuracy can be guaranteed through comparison of the observed data with manual handheld counters. This paper presents the design of a tablet-based manual traffic counting application and a framework for the development of a traffic database management system for Pakistan. The database management system comprises three components: a traffic counting Android application, an online database, and its visualization using Google Maps. An Oracle relational database was chosen to develop the data structure, whereas structured query language (SQL) was adopted to program the system architecture. The GIS application links the data from the database and projects it onto a dynamic map for the visualization of traffic conditions. The traffic counting device and the example of a database application in a real-world problem provided a creative outlet to visualize the uses and advantages of a database management system in real time. Also, traffic counts collected by means of a handheld tablet/mobile application can be used for transportation planning and forecasting.

Keywords: manual count, emerging data sources, traffic information quality, traffic surveillance, traffic counting device, android, data visualization, traffic management

Procedia PDF Downloads 178
825 Addressing Supply Chain Data Risk with Data Security Assurance

Authors: Anna Fowler

Abstract:

When considering assets that may need protection, the mind begins to contemplate homes, cars, and investment funds. In most cases, the protection of those assets can be covered through security systems and insurance. Data is not the first thought that comes to mind that would need protection, even though data is at the core of most supply chain operations. It includes trade secrets, management of personal identifiable information (PII), and consumer data that can be used to enhance the overall experience. Data is considered a critical element of success for supply chains and should be one of the most critical areas to protect. In the supply chain industry, there are two major misconceptions about protecting data: (i) We do not manage or store confidential/personally identifiable information (PII). (ii) Reliance on Third-Party vendor security. These misconceptions can significantly derail organizational efforts to adequately protect data across environments. These statistics can be exciting yet overwhelming at the same time. The first misconception, “We do not manage or store confidential/personally identifiable information (PII)” is dangerous as it implies the organization does not have proper data literacy. Enterprise employees will zero in on the aspect of PII while neglecting trade secret theft and the complete breakdown of information sharing. To circumvent the first bullet point, the second bullet point forges an ideology that “Reliance on Third-Party vendor security” will absolve the company from security risk. Instead, third-party risk has grown over the last two years and is one of the major causes of data security breaches. It is important to understand that a holistic approach should be considered when protecting data which should not involve purchasing a Data Loss Prevention (DLP) tool. A tool is not a solution. To protect supply chain data, start by providing data literacy training to all employees and negotiating the security component of contracts with vendors to highlight data literacy training for individuals/teams that may access company data. It is also important to understand the origin of the data and its movement to include risk identification. Ensure processes effectively incorporate data security principles. Evaluate and select DLP solutions to address specific concerns/use cases in conjunction with data visibility. These approaches are part of a broader solutions framework called Data Security Assurance (DSA). The DSA Framework looks at all of the processes across the supply chain, including their corresponding architecture and workflows, employee data literacy, governance and controls, integration between third and fourth-party vendors, DLP as a solution concept, and policies related to data residency. Within cloud environments, this framework is crucial for the supply chain industry to avoid regulatory implications and third/fourth party risk.

Keywords: security by design, data security architecture, cybersecurity framework, data security assurance

Procedia PDF Downloads 75
824 The Politics of Foreign Direct Investment for Socio-Economic Development in Nigeria: An Assessment of the Fourth Republic Strategies (1999 - 2014)

Authors: Muritala Babatunde Hassan

Abstract:

In the contemporary global political economy, foreign direct investment (FDI) is gaining currency on a daily basis. Notably, the end of the Cold War has brought about the dominance of neoliberal ideology with its mantra of a private-sector-led economy. As such, nation-states now see FDI attraction as an important element in their approach to national development. Governments and policy makers are preoccupying themselves with unraveling the best strategies not only to attract more FDI but also to attain the desired socio-economic development status. In Nigeria, the perceived development potential of FDI has brought about an aggressive hunt for foreign investors, most especially since the transition to civilian rule in May 1999. A series of liberal and market-oriented strategies are being adopted not only to attract foreign investors but largely to stimulate private sector participation in the economy. It is on this premise that this study interrogates the politics of FDI attraction for domestic development in Nigeria between 1999 and 2014, with the ultimate aim of examining the nexus between regime type and the ability of a state to attract and benefit from FDI. Building its analysis within the framework of institutional utilitarianism, the study posits that the essential FDI strategies for achieving the greatest happiness for the greatest number of Nigerians are political, not economic. Both content analysis and descriptive survey methodology were employed in carrying out the study. Content analysis involved a desk review of the literature that culminated in the development of the study's conceptual and theoretical framework of analysis. The study finds no significant relationship between the transition to democracy and FDI inflows in Nigeria, as most of the attracted investments during the period of the study were market- and resource-seeking, as was the case during the military regime, thereby contributing minimally to the socio-economic development of the country. It is also found that the country placed much emphasis on liberalization and incentives for FDI attraction to the neglect of improving the domestic investment environment. Consequently, the poor state of infrastructure, weak institutional capability and insecurity were identified as the major factors seriously hindering the success of Nigeria in exploiting FDI for domestic development. Given the reality of the currency of FDI as a vector of economic globalization and the fact that Nigeria is following a private-sector-led approach to development, it is recommended that emphasis should be placed on measures aimed at improving infrastructural facilities, building a solid institutional framework, enhancing skill and technological transfer, and coordinating FDI promotion activities by different agencies and at different levels of government.

Keywords: foreign capital, politics, socio-economic development, FDI attraction strategies

Procedia PDF Downloads 147
823 Effect of Steam Explosion of Crop Residues on Chemical Compositions and Efficient Energy Values

Authors: Xin Wu, Yongfeng Zhao, Qingxiang Meng

Abstract:

In China, a rather low proportion of crop residues is used as feedstuff because of their poor palatability and low digestibility. Steam explosion is a physical and chemical feed processing technology which has great potential to improve the sapidity and digestibility of crop residues. To investigate the effect of steam explosion on chemical compositions and efficient energy values, crop residues (rice straw, wheat straw and maize stover) were processed by steam explosion (steam temperature 120-230°C, steam pressure 2-26 kg/cm², 40 min). Steam-exploded crop residues were regarded as treatment groups and untreated ones as control groups; nutritive compositions were analyzed and effective energy values were calculated by the prediction model of INRA (1988, 2010) for both groups. Results indicated that the interaction between treatment and variety had a significant effect on the chemical compositions of crop residues. Steam explosion treatment of crop residues decreased neutral detergent fiber (NDF) significantly (P < 0.01), and compared with the untreated material, the NDF content of rice straw, wheat straw, and maize stover was lowered by 21.46%, 32.11%, and 28.34%, respectively. Acid detergent lignin (ADL) of crop residues increased significantly after the steam explosion (P < 0.05). The content of crude protein (CP), ether extract (EE) and ash increased significantly after steam explosion (P < 0.05). Moreover, the predicted effective energy values of each steam-exploded residue were higher than those of the untreated ones. The digestible energy (DE), metabolizable energy (ME), net energy for maintenance (NEm) and net energy for gain (NEg) of steam-exploded rice straw were 3.06, 2.48, 1.48 and 0.29 MJ/kg, respectively, increases of 46.21%, 46.25%, 49.56% and 110.92% compared with the untreated straw (P < 0.05). Correspondingly, the energy values of steam-exploded wheat straw were 2.18, 1.76, 1.03 and 0.15 MJ/kg, which were 261.78%, 261.29%, 274.59% and 1014.69% greater than those of wheat straw (P < 0.05). The corresponding predicted energy values of steam-exploded maize stover were 5.28, 4.30, 2.67 and 0.82 MJ/kg, raised by 109.58%, 107.71%, 122.57% and 332.64% compared with the raw material (P < 0.05). In conclusion, steam explosion treatment could significantly decrease NDF content and increase ADL, CP, EE and ash content as well as the effective energy values of crop residues. The effect of steam explosion was much more obvious for wheat straw than for the other two kinds of residues under the same conditions.

Keywords: chemical compositions, crop residues, efficient energy values, steam explosion

Procedia PDF Downloads 233
822 Effects of Sintering Temperature on Microstructure and Mechanical Properties of Nanostructured Ni-17Cr Alloy

Authors: B. J. Babalola, M. B. Shongwe

Abstract:

The Spark Plasma Sintering technique is a novel processing method that produces limited grain growth and a highly dense variety of materials: alloys, superalloys, and carbides, just to mention a few. However, initial particle size and spark plasma sintering parameters are factors which influence the grain growth and mechanical properties of sintered materials. Ni-Cr alloys are regarded as the most promising alloys for aerospace turbine blades, owing to the fact that they meet the basic requirements of desirable mechanical strength at high temperatures and good resistance to oxidation. The conventional method of producing this alloy often results in excessive grain growth and porosity levels that are detrimental to its mechanical properties. The effect of sintering temperature was evaluated on the microstructure and mechanical properties of the nanostructured Ni-17Cr alloy. Nickel and chromium powders were milled independently using high-energy ball milling for 30 hours at a milling speed of 400 rev/min and a ball-to-powder ratio (BPR) of 10:1. The milled powders were mixed in a composition of 83 wt% nickel and 17 wt% chromium. This mixture was sintered at temperatures varied from 800°C through 900°C, 1000°C and 1100°C to 1200°C. Structural characteristics such as porosity, grain size, fracture surface and hardness were analyzed by scanning electron microscopy, X-ray diffraction, Archimedes densitometry and micro-hardness testing. The corresponding results indicated an increase in the densification and hardness of the alloy as the temperature increased. The residual porosity of the alloy decreased with increasing sintering temperature while, in contrast, the grain size was enhanced. The study of the mechanical properties, including hardness and densification, shows that optimum properties were obtained for a sintering temperature of 1100°C. The advantages of the high sinterability of the Ni-17Cr alloy using milled powders and the microstructural details are discussed.
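
The densification figure of merit behind the Archimedes densitometry mentioned above can be sketched as follows. The sample masses are hypothetical, and the theoretical density of the Ni-17Cr composition is taken from a simple rule of mixtures, which ignores any intermetallic phases.

```python
# Sketch: relative density (densification) from Archimedes measurements for Ni-17Cr.
RHO_WATER, RHO_NI, RHO_CR = 1.000, 8.90, 7.19   # g/cm^3

def theoretical_density(w_ni=0.83, w_cr=0.17):
    # Rule-of-mixtures theoretical density for the 83/17 wt% composition
    return 1.0 / (w_ni / RHO_NI + w_cr / RHO_CR)

def relative_density(m_dry, m_suspended):
    """Archimedes bulk density of the sintered compact relative to theoretical."""
    bulk = m_dry / (m_dry - m_suspended) * RHO_WATER
    return bulk / theoretical_density()

# Hypothetical compact: 5.120 g in air, 4.500 g suspended in water
rel = relative_density(5.120, 4.500)
print(f"relative density ~ {100 * rel:.1f}%, residual porosity ~ {100 * (1 - rel):.1f}%")
```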

Keywords: densification, grain growth, milling, nanostructured materials, sintering temperature

Procedia PDF Downloads 395
821 Development and Validation of a Carbon Dioxide TDLAS Sensor for Studies on Fermented Dairy Products

Authors: Lorenzo Cocola, Massimo Fedel, Dragiša Savić, Bojana Danilović, Luca Poletto

Abstract:

An instrument for the detection and evaluation of gaseous carbon dioxide in the headspace of closed containers has been developed in the context of the Packsensor Italian-Serbian joint project. The device is based on Tunable Diode Laser Absorption Spectroscopy (TDLAS) with a Wavelength Modulation Spectroscopy (WMS) technique in order to accomplish non-invasive measurements inside closed containers of fermented dairy products (yogurts and fermented cheeses in cups and bottles). The purpose of this instrument is the continuous monitoring of carbon dioxide concentration during incubation and storage over the whole shelf life of the product, in the presence of different microorganisms. The instrument's optical front end has been designed to be integrated in a thermally stabilized incubator. An embedded computer provides processing of spectral artifacts and storage of an arbitrary set of calibration data, allowing properly calibrated measurements on many samples (cups and bottles) of the different shapes and sizes commonly found in retail distribution. A calibration protocol has been developed so that the instrument can also be calibrated in the field, including on containers that are notoriously difficult to seal properly. This calibration protocol is described and evaluated against reference measurements obtained with an industry-standard (sampling) carbon dioxide metering technique. Several sets of validation measurements on different containers are reported. Two test recordings of the evolution of carbon dioxide concentration are shown as examples of instrument operation. The first demonstrates the ability to monitor rapid yeast growth in a contaminated sample through the increase of headspace carbon dioxide. The second shows the dissolution transient of a non-saturated liquid medium in the presence of a carbon dioxide-rich headspace atmosphere.
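
To make the measurement principle concrete, below is a minimal, self-contained simulation of WMS with second-harmonic (2f) demodulation, the general technique the abstract refers to; it is an illustrative sketch, not the instrument's actual processing chain, and the line-shape, modulation, and sampling parameters are arbitrary.

```python
import numpy as np

# Illustrative WMS-2f sketch (not the instrument's firmware): a Lorentzian
# absorption line is scanned by a slowly ramped, sinusoidally modulated laser
# wavelength; the detector signal is demodulated at 2f with a digital lock-in.

fs = 200_000          # sample rate, Hz (arbitrary)
f_mod = 1_000         # modulation frequency, Hz (arbitrary)
t = np.arange(0, 0.05, 1 / fs)

hwhm = 1.0            # line half-width at half-maximum (arbitrary units)
mod_depth = 1.1 * hwhm
absorbance_peak = 0.02   # proportional to CO2 concentration (weak absorption)

# Instantaneous laser detuning: slow ramp across the line + fast modulation
ramp = np.linspace(-4 * hwhm, 4 * hwhm, t.size)
nu = ramp + mod_depth * np.cos(2 * np.pi * f_mod * t)

# Lorentzian absorbance and transmitted intensity (Beer-Lambert, weak line)
alpha = absorbance_peak / (1 + (nu / hwhm) ** 2)
detector = np.exp(-alpha)

# Digital lock-in at 2f: multiply by the reference, low-pass by block averaging
ref_2f = np.cos(2 * np.pi * 2 * f_mod * t)
block = fs // f_mod                       # one modulation period per block
mixed = detector * ref_2f
n_blocks = mixed.size // block
wms_2f = 2 * mixed[: n_blocks * block].reshape(n_blocks, block).mean(axis=1)

print(f"peak |2f| signal: {np.abs(wms_2f).max():.3e} (scales with concentration)")
```

The 2f amplitude peaks as the ramp crosses the line center, which is why, after calibration, it can serve as a non-invasive proxy for the headspace CO2 concentration.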

Keywords: TDLAS, carbon dioxide, cups, headspace, measurement

Procedia PDF Downloads 303
820 Primary School Students’ Modeling Processes: Crime Problem

Authors: Neslihan Sahin Celik, Ali Eraslan

Abstract:

As a result of the PISA (Programme for International Student Assessment) survey, which tests how well students can apply the knowledge and skills they have learned at school to real-life challenges, the new and redesigned mathematics education programs of many countries emphasize the necessity for students to face complex and multifaceted problem situations and gain experience with them, allowing them to develop new skills and mathematical thinking that prepare them for life after school. At this point, mathematical models and modeling approaches can be utilized in the analysis of complex problems that represent real-life situations in which students can actively participate. In particular, model-eliciting activities, which create situations that allow students to construct solutions to problems and which involve mathematical modeling, should be used right from the primary school years, allowing students to face such complex, real-life situations from early childhood. A qualitative study was conducted in a university foundation primary school in the city center of a large province in the 2013-2014 academic year. The participants were fourth-grade students. After a four-week preliminary study with a fourth-grade classroom, three students were selected for the focus group using the criterion sampling technique. The focus group of three students was videotaped as they worked on the Crime Problem. The group's conversation was transcribed, examined together with the students' written work, and then analyzed through the lens of Blum and Ferri's modeling cycle. The results showed that primary fourth-grade students can successfully work on a model-eliciting problem, although they encounter some difficulties in the modeling process. In particular, they developed new ideas based on different assumptions, identified the patterns among variables, and established a variety of models. On the other hand, they had trouble staying focused on the problem and occasionally had breaks in the process.

Keywords: primary school, modeling, mathematical modeling, crime problem

Procedia PDF Downloads 386
819 Natural Monopolies and Their Regulation in Georgia

Authors: Marina Chavleishvili

Abstract:

Introduction: Today, the study of monopolies, including natural monopolies, is topical. In real life, pure monopolies are natural monopolies. Natural monopolies are widespread and are regulated by the state; in particular, their prices and rates are regulated. The paper considers the problems associated with the operation of natural monopolies in Georgia, in particular their microeconomic analysis, pricing mechanisms, and the legal mechanisms of their operation. The analysis was carried out on the example of the power industry. The rates of natural monopolies in Georgia are controlled by the Georgian National Energy and Water Supply Regulation Commission. The paper analyzes the positive role and importance of the regulatory body and the issues of improving the legislative base that would support the efficient operation of the sector. Methodology: In order to highlight the market tendencies of natural monopolies, the domestic and international markets are studied. An analysis of monopolies is carried out based on the endogenous and exogenous factors that determine the condition of companies, as well as the strategies chosen by firms to increase their market share. According to a productivity-based competitiveness assessment scheme, the segmentation opportunities, business environment, resources, and geographical location of monopolist companies are revealed. Main Findings: As a result of the analysis, certain assessments and conclusions were made. Natural monopolies are a rather complex and versatile economic element, and it is important to specify and duly control their framework conditions. It is important to determine the pricing policy of natural monopolies: rates should be transparent, should reflect the standard of living in the country, and should correspond to incomes. The analysis confirmed the significance of the role of the Antimonopoly Service in the efficient management of natural monopolies. The law should adapt to reality and should be applied only to regulate the market. The present-day differentiated electricity tariffs, which vary with the amount of power consumed, need revision. The effects of electricity price discrimination are important, seasonal segmentation in particular. Consumers use more electricity in winter than in summer, which is associated with extra capacity and maintenance costs. If the price of electricity in winter is higher than in summer, electricity consumption will decrease in winter; consumers will start to use electricity more economically, which will allow extra capacity to be reduced. Conclusion: Thus, the practical realization of the views given in the paper will contribute to the efficient operation of natural monopolies. Consequently, their activity will be oriented not toward reducing but toward increasing the benefits to consumers and producers. Overall, optimal management of the given fields will allow well-being to be improved throughout the country. In the article, conclusions are drawn and recommendations are developed for delivering effective policies and regulation of natural monopolies in Georgia.

Keywords: monopolies, natural monopolies, regulation, antimonopoly service

Procedia PDF Downloads 74
818 Evaluation of Microwave-Assisted Pretreatment for Spent Coffee Grounds

Authors: Shady S. Hassan, Brijesh K. Tiwari, Gwilym A. Williams, Amit K. Jaiswal

Abstract:

Waste materials from a wide range of agro-industrial processes may be used as substrates for microbial growth and, subsequently, for the production of a range of high-value products and bioenergy. In addition, utilization of these agro-residues in bioprocesses has the dual advantage of providing alternative substrates and solving their disposal problems. Spent coffee grounds (SCG) are a by-product (45%) of coffee processing. SCG is a lignocellulosic material composed mainly of cellulose, hemicelluloses, and lignin; thus, a pretreatment process is required to facilitate efficient enzymatic hydrolysis of its carbohydrates. In this context, microwave pretreatment of lignocellulosic biomass without the addition of harsh chemicals represents a green technology. Moreover, microwave treatment has a high heating efficiency and is easy to implement. Thus, microwave pretreatment of SCG without the addition of harsh chemicals was investigated as a green technology to enhance enzymatic hydrolysis. In the present work, microwave pretreatment experiments were conducted on SCG at varying power levels (100, 250, 440, 600, and 1000 W) for 60 s. As microwave power increases up to a certain level (which varies with the biomass), the reducing sugar yield increases; beyond this level, the reducing sugar yield from the biomass starts to decrease. Microwave pretreatment of SCG for 60 s followed by enzymatic hydrolysis resulted in total reducing sugars of 91.6 ± 7.0 mg/g of biomass (at a microwave power of 100 W). Fourier transform infrared spectroscopy (FTIR) was employed to investigate changes in the functional groups of the biomass after pretreatment, while high-performance liquid chromatography (HPLC) was employed for the determination of glucose. Microwave pretreatment of lignocellulose was found to be an effective and energy-efficient technology to improve saccharification and glucose yield. The energy performance of the microwave pretreatment will be evaluated, and the enzyme hydrolysate will be used as a media component substitute for the production of ethanol and other high-value products.
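
The rise-and-fall dependence of reducing sugar on microwave power described above suggests an optimum power level for a given biomass. As a simple illustration of how such an optimum might be located (not the study's analysis, and using entirely hypothetical data for some other biomass rather than the SCG results), a quadratic can be fitted to (power, yield) measurements and its vertex taken as the estimated optimum.

```python
import numpy as np

# Illustrative sketch only: fit a quadratic to hypothetical (power, yield)
# data that rises and then falls, and locate the optimum at the vertex.
power = np.array([100, 250, 440, 600, 1000])        # W
sugar = np.array([70.2, 85.1, 92.3, 88.4, 64.7])     # mg reducing sugar / g biomass (hypothetical)

a, b, c = np.polyfit(power, sugar, deg=2)
p_opt = -b / (2 * a)                 # vertex of the fitted parabola
print(f"fitted optimum power ≈ {p_opt:.0f} W, "
      f"predicted yield ≈ {np.polyval([a, b, c], p_opt):.1f} mg/g")
```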

Keywords: lignocellulose, microwave, pretreatment, spent coffee grounds

Procedia PDF Downloads 399
817 Understanding and Measuring Stigma, Barriers and Attitudes Associated with Seeking Psychological Help Among Young Adults in Czech Republic

Authors: Tereza Hruskova

Abstract:

Globally, 200 million people experience serious mental health problems, yet only one third seek professional help, and help-seeking is often described as a last resort. Adolescents and young adults have a high prevalence of mental illness. Mental health stigma is a key element in the decision to seek help and is divided into (i) self-stigma (self-stigmatization), which includes internalized beliefs, low self-esteem, and lower quality of life, and (ii) public stigma (social stigma), which comprises stereotypes, beliefs, and society's disapproval of help-seeking, and which has a negative effect on help-seeking and on attitudes toward it. Previous research has mainly examined the constructs of help-seeking, avoidance, and delay separately, trying to find out why people do not seek help in time and what obstacles stand in the way. Barriers are not static and may change over time and across the stages of help-seeking. Attitudes are closely related to self-stigma and social stigma and predict whether a person will seek help. Barriers (stigmatization, a sense of humiliation, insufficient recognition of the problem, a preference for solving the problem alone, and distrust of professionals) and facilitators (previous experience with mental health problems, social support, and help from others) are factors influencing help-seeking. The current research on the Czech population of young adults responds to the gap between having mental health problems and actually seeking professional help. The aim of the study is to describe the individual constructs and factors in detail, to understand the person seeking help, and to identify possible obstacles on the path to seeking help. A sample of approximately 250 participants (aged 18-35) will take part in an online survey, conducted in May-June 2023, and will be administered a demographic questionnaire and four scales measuring attitudes (Attitudes Toward Seeking Professional Psychological Help - Short Form), barriers (Barriers to Help Seeking Scale), self-stigma (Self-Stigma of Seeking Help), and public stigma (Perceptions of Stigmatization by Others for Seeking Help). First, all four scales will be translated into Czech. The aims are (I) to determine the validity and reliability of the Czech translations of the scales, (II) to examine the factor structure of the scales in the Czech population and compare it with the reliability and validity results reported for the original-language versions, and (III) to examine the connections between attitudes toward seeking professional psychological help and its avoidance or delay on the one hand, and demographic and individual differences among participants, barriers, self-stigmatization, and social stigmatization on the other. This is expected to be the first study on the topic in the Czech Republic, identifying and improving understanding of the factors that lead to the avoidance of seeking professional help and revealing the relationships between stigmatization, attitudes, and barriers that lead to the avoidance or postponement of seeking professional help. A further aim is to find out whether the Czech young adult population differs in the individual constructs from data reported for foreign populations, given the cultural differences found between countries.
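
For the reliability checks on the translated scales mentioned above, a standard first step is an internal-consistency estimate such as Cronbach's alpha. The sketch below is illustrative only (it is not the study's analysis code), and the respondents-by-items score matrix it uses is simulated, not real data.

```python
import numpy as np

# Minimal sketch: Cronbach's alpha as a basic internal-consistency estimate
# for a translated scale. `scores` is a hypothetical respondents-by-items
# matrix of Likert responses (simulated below).

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = respondents, columns = scale items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(250, 1))                    # shared underlying trait
noise = rng.normal(scale=0.8, size=(250, 10))         # item-specific noise
scores = np.clip(np.round(3 + latent + noise), 1, 5)  # 10-item, 5-point scale
print(f"Cronbach's alpha ≈ {cronbach_alpha(scores):.2f}")
```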

Keywords: mental health, stigma, problems, seeking psychological help

Procedia PDF Downloads 62
816 A Geochemical Perspective on A-Type Granites of Khanak and Devsar Areas, Haryana, India: Implications for Petrogenesis

Authors: Naresh Kumar, Radhika Sharma, A. K. Singh

Abstract:

Granites from the Khanak and Devsar areas, part of the Malani Igneous Suite (MIS), were investigated for their geochemical characteristics to understand the petrogenesis of the study area. Neoproterozoic rocks of the MIS are well exposed in the Jhunjhunu, Jodhpur, Pali, Barmer, Jalor, and Jaisalmer districts of Rajasthan and the Bhiwani district of Haryana, and also occur at the Kirana hills of Pakistan. The MIS predominantly consists of acidic volcanics together with acidic plutonics (granites of various types), mafic volcanics, mafic intrusives, and a minor amount of pyroclasts. Based on field and petrographic studies, 28 samples were selected and analyzed for major, trace, and rare earth elements at the Wadia Institute of Himalayan Geology, Dehradun, by X-ray fluorescence (XRF) spectrometry and inductively coupled plasma mass spectrometry (ICP-MS). Granites from the studied areas are categorized as grey, green, and pink. The Khanak granites consist of quartz, K-feldspar, plagioclase, and biotite as essential minerals and hematite, zircon, annite, monazite, and rutile as accessory minerals. In the Devsar granites, plagioclase is replaced by perthite, which occurs as the dominant phase. Geochemically, granites from the Khanak and Devsar areas exhibit typical A-type characteristics, with enrichment in SiO2, Na2O+K2O, Fe/Mg, Rb, Zr, Y, Th, U, and REE (except Eu) and significant depletion in MgO, CaO, Sr, P, Ti, Ni, Cr, V, and Eu, suggesting A-type affinities in northwestern Peninsular India. Heat production (HP) in the green and grey granites of the Devsar area reaches 9.68 and 11.70 μW/m³, corresponding to total heat generation of 23.04 and 27.86 heat generation units (HGU), respectively. The pink granites of the Khanak area display even higher HP (16.53 μW/m³) and HGU (39.37) than the granites from the Devsar area. Overall, they have much higher HP and HGU values than the average for the continental crust (3.8 HGU), which implies a possible linear relationship between surface heat flow and crustal heat generation in the rocks of the MIS. Chondrite-normalized REE patterns show enriched LREE, moderate to strong negative Eu anomalies, and more or less flat heavy REE. In primitive-mantle-normalized multi-element variation diagrams, the granites show pronounced depletions in the high-field-strength elements (HFSE) Nb and Zr, as well as in Sr, P, and Ti. The geochemical characteristics (major, trace, and REE), together with various discrimination schemes, indicate that the magma was probably derived from a crustal source by different degrees of partial melting.
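
The heat production and HGU figures above can be related to measured U, Th, and K contents; a commonly used relation is Rybach's formula, with 1 HGU ≈ 0.418 μW/m³ (a ratio consistent with the HP/HGU pairs quoted in the abstract). The sketch below is illustrative only: the formula is an assumption about the kind of calculation involved, not a statement of the authors' exact method, and the concentrations and density are hypothetical.

```python
# Minimal sketch (not the authors' computation): radiogenic heat production
# from U, Th and K concentrations using Rybach's widely used relation
#   A [uW/m^3] = 1e-5 * rho[kg/m^3] * (9.52*C_U + 2.56*C_Th + 3.48*C_K),
# with C_U, C_Th in ppm and C_K in wt%, plus conversion to heat generation
# units (1 HGU = 1e-13 cal cm^-3 s^-1 ≈ 0.418 uW/m^3). The concentrations and
# density below are hypothetical, not values from the paper.

HGU_IN_MICROWATT_PER_M3 = 0.418

def heat_production(c_u_ppm, c_th_ppm, c_k_wt, rho_kg_m3=2670.0):
    """Radiogenic heat production in uW/m^3 (Rybach relation)."""
    return 1e-5 * rho_kg_m3 * (9.52 * c_u_ppm + 2.56 * c_th_ppm + 3.48 * c_k_wt)

hp = heat_production(c_u_ppm=12.0, c_th_ppm=60.0, c_k_wt=4.5)  # hypothetical granite
print(f"HP  ≈ {hp:.2f} uW/m^3")
print(f"HGU ≈ {hp / HGU_IN_MICROWATT_PER_M3:.2f}")
```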

Keywords: A-type granite, neoproterozoic, Malani igneous suite, Khanak, Devsar

Procedia PDF Downloads 262
815 Waste Utilization by Combustion in the Composition of Gel Fuels

Authors: Dmitrii Glushkov, Aleksandr G. Nigay, Olga S. Yashutina

Abstract:

In recent years, due to the intensive development of the Arctic and Antarctic areas, an urgent task is to develop technology for the effective utilization of solid and liquid combustible wastes in low-temperature environments. Firstly, such technology will help to prevent the dumping of waste into the World Ocean and reduce the risk of environmental damage to the Far North areas. Secondly, it will make it possible to prepare fuel compositions from the waste at the places where it is produced; such fuels can be used as energy resources, which will reduce utilization costs by avoiding transport of the waste to the mainland. In the present study, we suggest a solution to the problem of waste utilization through the preparation of gel fuels based on solid and liquid combustible components with the addition of a thickener. Such fuels are characterized by ease of preparation, storage, transportation, and use (as energy resources). The main regularities and characteristics of the physical and chemical processes were established while varying the parameters of the gel fuels and heating sources over wide ranges. The results obtained allow us to conclude that gel fuels are promising for the practical utilization of combustible wastes, and the corresponding technology would have positive environmental, operational, and economic effects. The composition of the gel fuels can vary within a wide range: preparing fuels from one type of combustible liquid, or from a mixture of several liquids with the addition of finely dispersed components, makes it possible to obtain compositions with predictable rheological, energy, or environmental characteristics. In addition, gel fuels present a lower fire hazard than common solid and liquid fuels, which makes them convenient for storage and transportation. Under such conditions, it is not necessary to transport combustible wastes from the Arctic and Antarctic territories to the mainland for processing, which is currently quite an expensive procedure. The research was funded by the Russian Science Foundation (project No. 18-13-00031).

Keywords: combustible liquid waste, gel fuel, ignition and combustion, utilization

Procedia PDF Downloads 100