Search results for: real time kernel preemption
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20469


14259 Shifting of Global Energy Security: A Comparative Analysis of Indonesia and China’s Renewable Energy Policies

Authors: Widhi Hanantyo Suryadinata

Abstract:

Indonesia and China are shifting global renewable energy strategy and security through policy construction: rare-mineral processing and value-adding policies in Indonesia, and manufacturing policies centred on New Energy Vehicles (NEVs) in China. Both encompass several practical regulations and policies through which each country's broader ambitions can be implemented. Policy development in Indonesia and China can be analyzed using a comparative analysis method, together with a pyramid illustration that identifies policy-construction phases based on the real conditions of the domestic market and the policies already implemented. This approach also helps identify the policy integration needed to advance a country's position within the pyramid, and underlines the significance of integrated policy in redefining renewable energy strategy and security on the global stage.

Keywords: global renewable energy security, global energy security, policy development, comparative analysis, shifting of global energy security, Indonesia, China

Procedia PDF Downloads 48
14258 Comparative Operating Speed and Speed Differential Day and Night Time Models for Two Lane Rural Highways

Authors: Vinayak Malaghan, Digvijay Pawar

Abstract:

Speed is the independent parameter that plays a vital role in highway design. The design consistency of a highway is checked based on variation in operating speed, and often fails to meet drivers' expectations, resulting in a difference between operating and design speed. Literature reviews have shown that a significant share of crashes occur on horizontal curves due to lack of design consistency. This paper focuses on a continuous speed profile study of the tangent-to-curve transition for both daytime and nighttime conditions. Data were collected using a GPS device, which provides a continuous speed profile; parameters such as acceleration and deceleration were analyzed along the tangent-to-curve transition. In the present study, models were developed to predict operating speed on tangents and horizontal curves, as well as a model of the speed reduction from tangent to curve, all based on continuous speed profile data. The study shows that vehicles tend to decelerate from the approach tangent up to a point between the beginning and the midpoint of the curve, and then accelerate through the curve-to-tangent transition. The models were compared for day and night and can be used for road safety improvement by evaluating geometric design consistency.

Keywords: operating speed, design consistency, continuous speed profile data, day and night time

Procedia PDF Downloads 146
14257 The Effect of Smartphones on Human Health Relative to User’s Addiction: A Study on a Wide Range of Audiences in Jordan

Authors: T. Qasim, M. Obeidat, S. Al-Sharairi

Abstract:

The objective of this study is to investigate the effects of excessive smartphone use. Smartphones can affect the human body considerably, and musculoskeletal disorders (MSDs) and other health problems may develop. These days smartphones are widely used among all age groups of society, so their effects on human behavior and health, especially among young and elderly people, have become a crucial issue. This study was conducted in Jordan on smartphone users of different genders and ages, using a survey to collect data on the symptoms and MSDs that result from excessive smartphone use. A total of 357 responses were used in the analysis. The main reported symptoms were numbness, finger pain, and arm pain, all linked to age and gender for comparative purposes. A statistical analysis was performed to assess the effects of extensive smartphone use over long periods of time on the human body. Results show that the significant variables were vision problems and the time spent using the smartphone. Other variables, including user age and ear problems due to headset use, were found to be borderline significant.

Keywords: smartphone, age group, musculoskeletal disorders (MSDs), health problems

Procedia PDF Downloads 243
14256 Dielectric Spectroscopy Investigation of Hydrophobic Silica Aerogel

Authors: Deniz Bozoglu, Deniz Deger, Kemal Ulutas, Sahin Yakut

Abstract:

In recent years, silica aerogels have attracted great attention due to their outstanding properties and their wide variety of potential applications, such as microelectronics, nuclear and high-energy physics, optics and acoustics, superconductivity, and space physics. Hydrophobic silica aerogels were successfully synthesized in one step by surface modification at ambient pressure. FT-IR results confirmed that Si-OH groups were successfully converted into hydrophobic, non-polar Si-CH3 groups by surface modification using trimethylchlorosilane (TMCS) as co-precursor. Using an Alpha-A High-Resolution Dielectric, Conductivity and Impedance Analyzer, the AC conductivity of the samples was examined over the temperature range 293-423 K and the frequency range 1-10⁶ Hz. The characteristic relaxation time decreases with increasing temperature. At frequencies higher than 10 Hz, the AC conductivity follows the relation σ_AC(ω) = σ_t − σ_DC = Aω^s, and the dominant conduction mechanism is found to obey the Correlated Barrier Hopping (CBH) mechanism. At frequencies lower than 10 Hz, the electrical conduction is found to be in accordance with a DC conduction mechanism. Activation energies were obtained from the AC conductivity results, and two relaxation regions were observed.
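The power-law analysis above can be illustrated numerically. The sketch below generates synthetic conductivity data obeying σ_AC = Aω^s (the values of A and s are illustrative assumptions, not the paper's measurements) and recovers the exponent s from a straight-line fit on log-log axes:

```python
import numpy as np

# Synthetic AC conductivity obeying sigma_AC = A * omega**s for
# frequencies above 10 Hz, where the CBH power law applies.
# A and s are illustrative values, not the paper's measurements.
omega = np.logspace(1, 6, 50)          # 10 Hz to 1e6 Hz
A_true, s_true = 1e-9, 0.8
sigma_ac = A_true * omega**s_true

# On log-log axes the power law is linear:
#   log10(sigma) = log10(A) + s * log10(omega),
# so a first-degree polynomial fit recovers the exponent s directly.
s_fit, logA_fit = np.polyfit(np.log10(omega), np.log10(sigma_ac), 1)
print(f"fitted exponent s = {s_fit:.3f}")
```

For CBH conduction, s typically decreases with temperature, so repeating this fit at each measurement temperature would trace that trend.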

Keywords: aerogel, synthesis, dielectric constant, dielectric loss, relaxation time

Procedia PDF Downloads 178
14255 Assessment of Genotoxic Effects of a Fungicide (Propiconazole) in Freshwater Fish Gambusia Affinis Using Alkaline Single-Cell Gel Electrophoresis (Comet Assay)

Authors: Bourenane Bouhafs Naziha

Abstract:

ARTEA330EC is a fungicide used to inhibit the growth of many types of fungi on cereals and rice; it is the single largest-selling agrochemical and has been widely detected in surface waters in our area (northeastern Algeria). Studies on the long-term genotoxic effects of fungicides in different fish tissues using genotoxic biomarkers are limited. Therefore, in the present study, DNA damage caused by propiconazole in the freshwater fish Gambusia affinis was investigated by comet assay. The 96-h LC50 of the fungicide was estimated for the fish in a semi-static system. On the basis of the LC50 value, sublethal and non-lethal concentrations were determined (25, 50, 75, and 100 ppm). DNA damage was measured in erythrocytes as the percentage of DNA in comet tails of fish exposed to the above concentrations of the fungicide. In general, non-significant effects of both concentration and time of exposure were observed in treated fish compared with the controls. However, the highest DNA damage was observed at the highest concentration and the longest time of exposure (day 12). The study indicates the comet assay to be a sensitive and rapid method for detecting the genotoxicity of propiconazole and other pesticides in fish.

Keywords: genotoxicity, fungicide, propiconazole, freshwater, Gambusia affinis, alkaline single-cell gel electrophoresis

Procedia PDF Downloads 285
14254 International Trade, Manufacturing and Employment: The First Two Decades of South African Democracy

Authors: Phillip F. Blaauw, Anna M. Pretorius

Abstract:

South Africa re-entered the international economy in the early 1990s, after Apartheid, at a time when globalisation was gathering momentum. Globalisation led to a more open economy, increased export volumes and a changed export mix, with manufactured goods gaining ground relative to mining products. After 21 years of democracy, South African researchers and policymakers need to evaluate the impact of international trade on the level of employment and compensation of employees in the South African manufacturing industry. This is important given the consistently high levels of unemployment in South Africa, and this evaluation is the aim of this paper. Two complementary approaches are utilised. First, the 27 subdivisions of the South African manufacturing industry are classified according to capital/labour ratios, and trends in employment levels and employee compensation for these categories are identified by comparing levels in 1995 to those in 2014. The supplementary empirical approach uses cross-sectional and panel data regressions for the same period, aiming to explain the observed changes in employment and employee compensation levels between 1995 and 2014. The first part of the empirical approach revealed that over the 20-year period the intermediate capital-intensive, labour-intensive and ultra-labour-intensive manufacturing industries all showed massive declines in overall employment. Only three of the 19 industries in these classifications showed marginal overall employment gains, and the only meaningful gains were recorded in three of the eight capital-intensive manufacturing industries. The overall performance of the South African manufacturing industry is therefore dismal at best. This scenario repeats itself for the skilled section of the intermediate capital-intensive, labour-intensive and ultra-labour-intensive manufacturing industries as well.
18 out of the 19 industries displayed declines even for the skilled section of the labour force. The formal regression analysis supplements the above results. Real production growth is a statistically significant (95 per cent confidence level) explanatory variable of the overall employment level for the period under consideration, albeit with a small positive coefficient. The variables with the most significant negative relationship with changes in overall employment were the dummy variables for intermediate capital-intensive and labour-intensive manufacturing goods. Disaggregating overall changes in employment further by skill level revealed that skilled employment in particular responded negatively to increases in the ratio of imported to local inputs for manufacturing. The dummy variable for the labour-intensive sectors remained negative and statistically significant, indicating that the labour-intensive sectors of South African manufacturing remain vulnerable to the loss of employment opportunities. Whereas the first period (1995 to 2001) after the opening of the South African economy brought positive changes for skilled employment, continued increases in imported inputs displaced some of the skilled labour as well, putting further pressure on an economy with already high and persistent unemployment. Given the negative outlook for the world commodity cycle and a stagnant local manufacturing sector, the challenge for policymakers has become even more pronounced after South Africa's political coming of age.

Keywords: capital/labour ratios, employment, employee compensation, manufacturing

Procedia PDF Downloads 208
14253 Predicting Consolidation Coefficient of Busan Clay by Time-Displacement-Velocity Methods

Authors: Thang Minh Le, Hadi Khabbaz

Abstract:

The coefficient of consolidation is a parameter governing the rate at which saturated soil, particularly clay, undergoes consolidation when subjected to an increase in pressure. The rate and amount of compression in soil vary with the rate at which pore water is lost and hence depend on soil permeability. Over many years, various methods have been proposed to determine the coefficient of consolidation, cv, which is an indication of the rate of foundation settlement on soft ground. However, defining this parameter is often problematic and relies heavily on graphical techniques, which are subject to some uncertainties. This paper initially presents an overview of many well-established methods for determining the vertical coefficient of consolidation from incremental loading consolidation tests. An array of consolidation tests was conducted on undisturbed clay samples collected at various depths from a site in the Nakdong river delta, Busan, South Korea. The consolidation test results on these soft, sensitive clay samples were employed to evaluate the targeted methods for predicting the settlement rate of Busan clay. Based on time-displacement-velocity relationships, 10 common procedures were classified into three method groups and compared. The study results are also discussed.
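One of the well-established graphical procedures such an overview typically covers is Taylor's square-root-of-time method. A minimal sketch of that calculation follows; the specimen height and t90 below are illustrative assumptions, not values measured on the Busan clay:

```python
# Taylor's square-root-of-time method: cv = T90 * Hdr**2 / t90,
# where T90 = 0.848 is the dimensionless time factor for 90%
# consolidation and Hdr is the drainage path length.
# The numbers below are illustrative, not test measurements.
T90 = 0.848
Hdr_m = 0.01          # drainage path: half of a 20 mm double-drained specimen
t90_s = 1200.0        # time to 90% consolidation read from the sqrt-time plot

cv = T90 * Hdr_m**2 / t90_s   # vertical coefficient of consolidation, m^2/s
print(f"cv = {cv:.2e} m^2/s")
```

Casagrande's log-time method follows the same pattern with T50 = 0.197 and t50 instead, which is one reason the different procedures can yield different cv values for the same test.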

Keywords: Busan clay, coefficient of consolidation, constant rate of strain, incremental loading

Procedia PDF Downloads 172
14252 Psychological Reactance to Anti-Piracy Messages Explained by Gender and Attitudes

Authors: Kate Whitman, Zahra Murad, Joe Cox

Abstract:

Digital piracy is costly to creative economies across the world. Anti-piracy messages can cause people to pirate more rather than less, suggesting the presence of psychological reactance. Gender differences in message reactance and the moderating impact of attitudes have not been explored. In this paper, we examine whether messages based on real-world anti-piracy campaigns cause reactance and whether this effect is explained by gender and attitudes. An experiment compares two threatening messages and one prosocial message against a control group, analysing changes in piracy intention relative to past behaviour for digital TV/film. The results suggest that the prosocial message is ineffective for both genders. However, the threatening messages have significantly opposing effects on men and women: one threatening message influences women to reduce their piracy intentions by over 50% and men to increase theirs by 18%. Gender effects are moderated by pre-existing attitudes, with men and women who report the most favourable attitudes towards piracy showing the most polarised changes in piracy intentions. The results suggest that men and women process threatening messages differently and that the creative industries should take care when targeting their messages.

Keywords: piracy, reactance, persuasive-messages, TV/film, gender

Procedia PDF Downloads 76
14251 Religious Beliefs and Their Effects on the Use of Contraceptives in Female College Students

Authors: Amy Kless, Peter Reuter

Abstract:

The purpose of this study was to explore the association between the teachings of religious doctrine on the use of contraceptives and the behavior of female college students. The doctrine of both Christian and non-Christian religions states that sexual intercourse shall only take place between people who are married, and the teachings of most Christian and non-Christian religions likewise prohibit the use of contraceptives during sexual intercourse. Being away from home for the first time, students who grew up in religious households may stop attending church services or stop practicing religion entirely. The college years are also a time for sexual exploration, which leaves many students, both religious and non-religious, having to choose between abstaining from sexual intercourse and using a form of contraceptive to prevent pregnancy. Of 1,130 female students anonymously surveyed at a southern university between Spring 2016 and Fall 2020, 50% reported having religious beliefs. Fewer than 50% of the students who reported having religious beliefs attend church services on a regular basis. Nearly 75% of the same students reported having participated in sexual intercourse, with close to 60% utilizing some form of contraceptive to prevent pregnancy. The data suggest that female college students do not follow religious teachings on abstinence from premarital sex or the ban on the use of contraceptives.

Keywords: contraceptives, females, intercourse, religion

Procedia PDF Downloads 257
14250 Modeling and Performance Evaluation of an Urban Corridor under Mixed Traffic Flow Condition

Authors: Kavitha Madhu, Karthik K. Srinivasan, R. Sivanandan

Abstract:

Indian traffic can be considered mixed and heterogeneous due to the presence of various types of vehicles that operate with weak lane discipline. Consequently, vehicles can position themselves anywhere in the traffic stream, depending on the availability of gaps, and the choice of lateral positioning is an important component in representing and characterizing mixed traffic. Field data provide evidence that the trajectories of vehicles on Indian urban roads have significantly varying longitudinal and lateral components. Further, the notion of headway, which is widely used for homogeneous traffic simulation, is not well defined in conditions lacking lane discipline. Field data make clear that following is not as strict as in homogeneous, lane-disciplined conditions, and that neighbouring vehicles ahead of a given vehicle, as well as those adjacent to it, can influence the subject vehicle's choice of position, speed, and acceleration. Given these empirical features, the suitability of using headway distributions to characterize mixed traffic in Indian cities is questionable, and such distributions need to be modified appropriately. To address these issues, this paper analyzes the distribution of time gaps between consecutive vehicles (in a time sense) crossing a section of roadway. More specifically, to characterize the complex interactions noted above, the influence of composition, manoeuvre type, and lateral placement characteristics on the time gap distribution is quantified. The developed model is used for evaluating performance measures such as link speed, midblock delay, and intersection delay, which in turn help characterise vehicular fuel consumption and emissions on urban roads in India. Identifying and analyzing the exact interactions between the various vehicle classes in the traffic stream is essential for increasing the accuracy and realism of microscopic traffic flow modelling.
In this regard, this study aims to develop and analyze time gap distribution models, quantified by lead-lag pair, manoeuvre type, and lateral position characteristics, in heterogeneous non-lane-based traffic. Once the modelling scheme is developed, it can be used for estimating the vehicle kilometres travelled for the entire traffic system, which helps determine vehicular fuel consumption and emissions. The approach involves data collection; statistical modelling and parameter estimation; simulation using the calibrated time gap distribution and its validation; empirical analysis of the simulation results and associated traffic flow parameters; and application to analyze illustrative traffic policies. In particular, videographic methods are used for data extraction from urban midblock sections in Chennai, where the data comprise vehicle type, vehicle position (both longitudinal and lateral), speed, and time gap. Statistical tests are carried out to compare the simulated data with the actual data, and the model performance is evaluated. The effect of integrating the above-mentioned factors into vehicle generation is studied by comparing performance measures such as density, speed, flow, capacity, and area occupancy under various traffic conditions and policies. The implications of the quantified distributions and the simulation model for estimating Passenger Car Units (PCU), capacity, and level of service of the system are also discussed.
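The link between a calibrated time gap distribution and macroscopic flow can be sketched in a few lines. The lognormal form and its parameters below are assumptions for illustration (lognormal gaps are a common choice in headway modelling), not values calibrated to the Chennai data:

```python
import numpy as np

# Minimal sketch: sample vehicle time gaps from a lognormal distribution
# and derive the implied flow. The parameters mu and sigma are
# hypothetical, not calibrated to the Chennai field data.
rng = np.random.default_rng(42)
mu, sigma = 0.5, 0.6               # lognormal parameters of the time gap (s)
gaps = rng.lognormal(mu, sigma, size=10_000)

mean_gap = gaps.mean()             # average time gap between crossings (s)
flow_vph = 3600.0 / mean_gap       # implied flow in vehicles per hour
print(f"mean gap {mean_gap:.2f} s -> flow {flow_vph:.0f} veh/h")
```

In the study's framework the single distribution here would be replaced by separate distributions per lead-lag vehicle-class pair, manoeuvre type, and lateral position.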

Keywords: lateral movement, mixed traffic condition, simulation modeling, vehicle following models

Procedia PDF Downloads 330
14249 Reliability of Swine Estrous Detector Probe in Dairy Cattle Breeding

Authors: O. O. Leigh, L. C. Agbugba, A. O. Oyewunmi, A. E. Ibiam, A. Hassan

Abstract:

Accuracy of insemination timing is a key determinant of high pregnancy rates in livestock breeding stations. Estrous detector probes are a recent introduction to the Nigerian livestock farming sector. Many of these probes are species-labeled, and they measure changes in vaginal mucus resistivity (VMR) during the stages of the estrous cycle. With respect to size and shaft conformation, the Draminski® swine estrous detector probe (sEDP) is quite similar to the bovine estrous detector probe. We investigated the reliability of the sEDP at insemination time on two farms, designated FM A and FM B. Cows (Bunaji, n=20 per farm) were evaluated for VMR at the 16th hour after a standard OvSynch protocol, with concurrent insemination on FM B only. The difference in mean VMR between FM A (221 ± 24.36 Ohms) and FM B (254 ± 35.59 Ohms) was not significant (p > 0.05). Sixteen cows (80%) at FM B were later (day 70) confirmed pregnant via rectal palpation and calved at term. These findings suggest consistency in VMR evaluated with the sEDP at insemination, as well as high predictability of VMR associated with good pregnancy rates in dairy cattle. We conclude that the Draminski® swine estrous detector probe is reliable for determining the time of insemination in cattle breeding stations.

Keywords: dairy cattle, insemination, swine estrous probe, vaginal mucus resistivity

Procedia PDF Downloads 113
14248 Holomorphic Prioritization of Sets within Decagram of Strategic Decision Making of POSM Using Operational Research (OR): Analytic Hierarchy Process (AHP) Analysis

Authors: Elias Ogutu Azariah Tembe, Hussain Abdullah Habib Al-Salamin

Abstract:

There is a decagram of strategic decisions of production/operations and service management (POSM) within operational research (OR) which must collate, namely: design, inventory, quality, location, process and capacity, layout, scheduling, maintenance, and supply chain. This paper presents an architectural configuration conceptual framework of this decagram of decision sets in the form of a mathematically complete graph and an abelian graph. Mathematically, a complete graph, whether undirected (UDG) or directed (DG), is one in which every pair of vertices is connected, collated, confluent, and holomorphic. However, no study has yet prioritized the holomorphic sets of POSM within the OR field. This study utilizes the structured OR technique known as the Analytic Hierarchy Process (AHP) for organizing, sorting, and prioritizing (ranking) the sets within the decagram of POSM according to their attribution (propensity), and analyses how the prioritization has real-world application in the 21st century.
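The AHP ranking step can be sketched numerically. The pairwise-comparison matrix below covers only three of the ten POSM decision areas (quality, location, capacity) with made-up judgments, purely to illustrate the eigenvector prioritization and the consistency check, not the paper's actual data:

```python
import numpy as np

# AHP sketch: derive priority weights for three hypothetical POSM
# decision areas (quality, location, capacity) from a reciprocal
# pairwise-comparison matrix via the principal eigenvector, then
# check consistency with CR = CI / RI, CI = (lambda_max - n)/(n - 1).
A = np.array([
    [1.0, 3.0, 5.0],    # quality vs location, quality vs capacity
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalised priority vector

n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)
CR = CI / 0.58                         # RI = 0.58 for n = 3 (Saaty's table)
print(np.round(w, 3), f"CR = {CR:.3f}")
```

A CR below 0.1 indicates acceptably consistent judgments; the full decagram would use a 10x10 matrix with RI ≈ 1.49.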

Keywords: holomorphic, decagram, decagon, confluent, complete graph, AHP analysis, SCM, HRM, OR, OM, abelian graph

Procedia PDF Downloads 390
14247 Removal of Nickel Ions from Industrial Effluents by Batch and Column Experiments: A Comparison of Activated Carbon with Pinus Roxburgii Saw Dust

Authors: Sardar Khan, Zar Ali Khan

Abstract:

Rapid industrial development and urbanization contribute substantially to wastewater discharge. Wastewater from industrial activities enters natural aquatic ecosystems and is considered one of the main sources of water pollution. The discharge of effluents loaded with heavy metals into the surrounding environment has become a key issue for human health risk, the environment, and food-chain contamination. Nickel causes fatigue, cancer, headache, heart problems, skin diseases (nickel itch), and respiratory disorders; nickel compounds such as nickel sulfide and nickel oxides in industrial environments, if inhaled, are associated with an increased risk of lung cancer. Therefore, the removal of nickel from effluents before discharge is necessary, and removal by low-cost biosorbents is an efficient method. This study investigated the efficiency of commercial activated carbon and raw Pinus roxburgii saw dust for the removal of nickel from industrial effluents. Batch and column adsorption experiments were conducted. The study indicates that nickel removal is greatly dependent on pH, contact time, nickel concentration, and adsorbent dose. Maximum removal occurred at pH 9, a contact time of 600 min, and an adsorbent dose of 1 g/100 mL. The highest removal was 99.62% and 92.39% (pH based), 99.76% and 99.9% (dose based), 99.80% and 100% (agitation time based), and 92% and 72.40% (Ni concentration based) for P. roxburgii saw dust and activated carbon, respectively. Similarly, nickel removal in the column studies was 99.77% and 99.99% (bed height based), 99.80% and 99.99% (concentration based), and 99.98% and 99.81% (flow rate based) for P. roxburgii saw dust and activated carbon, respectively. Results were compared with the Freundlich isotherm model, which showed r² values of 0.9424 (activated carbon) and 0.979 (P. roxburgii saw dust), while the Langmuir isotherm model gave values of 0.9285 (activated carbon) and 0.9999 (P. roxburgii saw dust). The experimental results were fitted to both models but were in closer agreement with the Langmuir isotherm model.
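The two isotherm fits can be sketched via their standard linearised forms. The Ce/qe values below are synthetic, generated from a Langmuir model so that the Langmuir fit is near-perfect by construction; they are not the paper's measurements:

```python
import numpy as np

# Synthetic Langmuir-generated equilibrium data (illustrative only):
# Ce = equilibrium concentration (mg/L), qe = uptake (mg/g).
Ce = np.array([5, 10, 25, 50, 75, 100], dtype=float)
qm_true, b_true = 12.0, 0.08
qe = qm_true * b_true * Ce / (1 + b_true * Ce)

# Langmuir linearisation: Ce/qe = Ce/qm + 1/(qm*b)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm, b = 1 / slope, slope / intercept   # recover isotherm parameters

# Freundlich linearisation: log10(qe) = log10(Kf) + (1/n) * log10(Ce)
inv_n, logKf = np.polyfit(np.log10(Ce), np.log10(qe), 1)

print(f"Langmuir qm = {qm:.2f} mg/g, b = {b:.3f} L/mg; Freundlich 1/n = {inv_n:.2f}")
```

Because the data are exactly Langmuir here, the Langmuir parameters are recovered exactly while the Freundlich fit is only approximate, mirroring the closer Langmuir agreement reported above.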

Keywords: nickel removal, batch and column, activated carbon, saw dust, plant uptake

Procedia PDF Downloads 116
14246 Comparison of Susceptibility to Measles in Preterm Infants versus Term Infants

Authors: Joseph L. Mathew, Shourjendra N. Banerjee, R. K. Ratho, Sourabh Dutta, Vanita Suri

Abstract:

Background: In India and many other developing countries, a single dose of measles vaccine is administered to infants at 9 months of age. This is based on the assumption that maternal, transplacentally transferred antibodies will protect infants until that age. However, our previous data showed that most infants lose maternal anti-measles antibodies before 6 months of age, making them susceptible to measles before vaccination at 9 months. Objective: This prospective study was designed to compare susceptibility in pre-term versus term infants at different time points. Material and Methods: Following Institutional Ethics Committee approval and a formal informed consent process, venous blood was drawn from a cohort of 45 consecutive term infants and 45 consecutive pre-term infants (both groups delivered vaginally) at birth, 3 months, 6 months, and 9 months (prior to measles vaccination). Serum was separated, and anti-measles IgG antibody levels were measured by quantitative ELISA kits (sensitivity and specificity > 95%). Susceptibility to measles was defined as an antibody titre < 200 mIU/ml. Mean antibody levels were compared between the two groups at the four time points. Results: The mean gestation of term babies was 38.5±1.2 weeks, and of pre-term babies 34.7±2.8 weeks; the respective mean birth weights were 2655±215 g and 1985±175 g. A reliable maternal vaccination record was available for only 7 of the 90 mothers. Mean anti-measles IgG antibody (±SD) in term babies was 3165±533 IU/ml at birth, 1074±272 IU/ml at 3 months, 314±153 IU/ml at 6 months, and 68±21 IU/ml at 9 months. The corresponding levels in pre-term babies were 2875±612 IU/ml, 948±377 IU/ml, 265±98 IU/ml, and 72±33 IU/ml (p > 0.05 for all inter-group comparisons). The proportion of susceptible term infants at birth, 3 months, 6 months, and 9 months was 0%, 16%, 67%, and 96%.
The corresponding proportions in the pre-term infants were 0%, 29%, 82%, and 100% (p > 0.05 for all inter-group comparisons). Conclusion: The majority of infants are susceptible to measles before 9 months of age, suggesting the need to bring measles vaccination forward; however, there was no statistically significant difference between the proportions of susceptible term and pre-term infants at any of the four time points. A larger study is required to confirm these findings and to compare sero-protection if vaccination is brought forward to between 6 and 9 months.
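The kind of inter-group proportion comparison reported at each time point can be sketched with a contingency-table test. The counts below are reconstructed approximately from the reported 9-month percentages (96% of 45 term infants ≈ 43 susceptible; 45 of 45 pre-term), so they are illustrative rather than the study's raw data:

```python
from scipy.stats import chi2_contingency

# 2x2 table at 9 months: rows = term / pre-term infants,
# columns = susceptible / protected. Counts reconstructed from the
# reported percentages (approximate, for illustration only).
table = [[43, 2],   # term: ~96% of 45 susceptible
         [45, 0]]   # pre-term: 100% of 45 susceptible

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")
```

With such small expected counts the chi-square approximation is rough; Fisher's exact test (scipy.stats.fisher_exact) would be the more defensible choice, and both agree with the abstract's finding of no significant difference (p > 0.05).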

Keywords: measles, preterm, susceptibility, term infant

Procedia PDF Downloads 254
14245 Specification and Unification of All Fundamental Forces Exist in Universe in the Theoretical Perspective – The Universal Mechanics

Authors: Surendra Mund

Abstract:

At the beginning, the physical entity force was defined mathematically by Sir Isaac Newton in his Principia Mathematica as F = dp/dt, in the form of his second law of motion. Newton also defined his universal law of gravitation in the same outstanding book. At the end of the 20th century and the beginning of the 21st, many attempts were made to specify and unify the four or five fundamental forces or interactions that exist in the universe, but every one failed, with gravity creating problems for the unification every single time. In my previous papers and presentations, I defined and derived field and force equations for gravitation-like interactions for each and every kind of central system. This force is named the Variational Force, and it is generated by variation in the scalar field density around the body. In this particular paper, I first specify which types of interactions are fundamental in the universal sense (in all types of central systems or bodies predicted by my N-time Inflationary Model of the Universe) and then unify them in the universal framework (defined and derived by me as Universal Mechanics in a separate paper). This will also be valid in the universal dynamical sense, which includes inflations and deflations of the universe, central system relativity, universal relativity, ϕ-ψ transformation and transformation of spin, the physical perception principle, the Generalized Fundamental Dynamical Law, and many other important generalized principles of Generalized Quantum Mechanics (GQM) and Central System Theory (CST). So, in this article, I first generalize some fundamental principles, then unify Variational Forces (the general form of gravitation-like interactions) and Flow Generated Forces (the general form of EM-like interactions), and then unify all fundamental forces by specifying weak and strong interactions in terms of more basic interactions: variational, flow generated, and transformational.

Keywords: Central System Force, Disturbance Force, Flow Generated Forces, Generalized Nuclear Force, Generalized Weak Interactions, Generalized EM-Like Interactions, Imbalance Force, Spin Generated Forces, Transformation Generated Force, Unified Force, Universal Mechanics, Uniform And Non-Uniform Variational Interactions, Variational Interactions

Procedia PDF Downloads 35
14244 Legal Pluralism and Ideology: The Recognition of the Indigenous Justice Administration in Bolivia through the "Indigenismo" and "Decolonisation" Discourses

Authors: Adriana Pereira Arteaga

Abstract:

In many Latin American countries the transition towards legal pluralism - has developed as part of what is called Latin-American-Constitutionalism over the last thirty years. The aim of this paper is to discuss how legal pluralism in its current form in Bolivia may produce exclusion and violence. Legal sources and discourse analysis - as an approach to examine written language on discourse documentation- will be used to develop this paper. With the constitution of 2009, Bolivia was symbolically "re-founded" into a multi-nation state. This shift goes hand in hand with the "indigenista" and "decolonisation" ideologies developing since the early 20th century. Discourses based on these ideologies reflect the rejection of liberal and western premises on which the Bolivian republic was originally built after independence. According to the "indigenista" movements, the liberal nation-state generates institutions corresponding to a homogenous society. These liberal institutions not only ignore the Bolivian multi-nation reality, but also maintain the social structures originating form the colony times, based on prejudices against the indigenous. The described statements were elaborated through the image: the indigenous people humiliated by a cruel western system as highlighted by the constitution's preamble. This narrative had a considerable impact on the sensitivity of people and received great social support. Therefore the proposal for changing structures of the nation-state, is charged with an emancipatory message of restoring even the pre-Columbian order. An order at times romantically described as the perfect order. Legally this connotes a rejection of the positivistic national legal system based on individual rights and the promotion of constitutional recognition of indigenous justice administration. The pluralistic Constitution is supposed to promote tolerance and a peaceful coexistence among nations, so that the unity and integrity of the country could be maintained. 
In its current form, legal pluralism in Bolivia is justified by pre-existing rights contained, for example, in the International Labour Organization Convention 169, but it rests more on the described discursive constructions. Over time, these discursive constructions created inconsistencies in putting indigenous justice administration into practice. First, because legal pluralism has been developed mostly at the level of political discourse, no real interaction between the national and the indigenous jurisdictions can be observed; there are no clear coordination and cooperation mechanisms. Second, since the recently reformed constitution is based on deeply sensitive experiences, little is said about the general legal principles on which a pluralistic administration of justice in Bolivia should be based. Third, basic rights, liberties, and constitutional guarantees are also affected by the antagonized image of the national justice administration. As a result, fundamental rights could be violated on a large scale, because many indigenous justice administration practices run counter to these constitutional rules. These problems are not merely Bolivian but may also be encountered in other countries of the region with similar backgrounds, such as Ecuador.

Keywords: discourse, indigenous justice, legal pluralism, multi-nation

Procedia PDF Downloads 435
14243 A Concept for Flexible Battery Cell Manufacturing from Low to Medium Volumes

Authors: Tim Giesen, Raphael Adamietz, Pablo Mayer, Philipp Stiefel, Patrick Alle, Dirk Schlenker

Abstract:

The competitiveness and success of new electrical energy storages such as battery cells are significantly dependent on a short time-to-market. Producers who decide to supply new battery cells to the market need to be easily adaptable in manufacturing with respect to early customers’ needs in terms of cell size, materials, delivery time, and quantity. In the initial state, the required output rates allow the producers neither to run a fully automated manufacturing line nor to supply handmade battery cells. Until now, there was no solution for manufacturing battery cells in low to medium volumes in a reproducible way. Thus, in terms of cell format and output quantity, a concept for the flexible assembly of battery cells was developed by the Fraunhofer Institute for Manufacturing Engineering and Automation. Based on clustered processes, the modular system platform can be modified, enlarged, or retrofitted in a short time frame according to the ordered product. The paper shows the analysis of the production steps of a conventional battery cell assembly line. Process solutions were found by using I/O analysis, functional structures, and morphological boxes. The identified elementary functions were subsequently clustered by functional coherence into automation solutions, and thus the individual process clusters were generated. The result presented in this paper makes it possible to manufacture different cell products on the same production system using seven process clusters. The paper shows the solution for a batch-wise flexible battery cell production using advanced process control. Further, the tests performed and the benefits of using the process clusters as cyber-physical systems for an integrated production and value chain are discussed. The solution lowers the hurdles for SMEs to launch innovative cell products on the global market.

Keywords: automation, battery production, carrier, advanced process control, cyber-physical system

Procedia PDF Downloads 317
14241 Microstructure of Iranian Processed Cheese

Authors: R. Ezzati, M. Dezyani, H. Mirzaei

Abstract:

The effects of the concentration of trisodium citrate (TSC) emulsifying salt (0.25 to 2.75%) and holding time (0 to 20 min) on the textural, rheological, and microstructural properties of Iranian processed Cheddar cheese were studied using a central composite rotatable design. The loss tangent parameter (from small amplitude oscillatory rheology), extent of flow, and melt area (from the Schreiber test) all indicated that the meltability of process cheese decreased with increased concentration of TSC and that holding time led to a slight reduction in meltability. Hardness increased as the concentration of TSC increased. Fluorescence micrographs indicated that the size of fat droplets decreased with an increase in the concentration of TSC and with longer holding times. Acid-base titration curves indicated that the buffering peak at pH 4.8, which is due to residual colloidal calcium phosphate, decreased as the concentration of TSC increased. The soluble phosphate content increased as the concentration of TSC increased, whereas the insoluble Ca decreased. The results of this study suggest that TSC chelated Ca from colloidal calcium phosphate and dispersed casein; the citrate-Ca complex remained trapped within the process cheese matrix. Increasing the concentration of TSC helped to improve fat emulsification and casein dispersion during cooking, both of which probably helped to reinforce the structure of process cheese.

Keywords: Iranian processed cheese, cheddar cheese, emulsifying salt, rheology

Procedia PDF Downloads 431
14241 Development of Biosensor Chip for Detection of Specific Antibodies to HSV-1

Authors: Zatovska T. V., Nesterova N. V., Baranova G. V., Zagorodnya S. D.

Abstract:

In recent years, biosensor technologies based on the phenomenon of surface plasmon resonance (SPR) have become increasingly used in biology and medicine. They make it possible to follow, in real time, the progress of binding between biomolecules and to identify agents that specifically interact with biologically active substances immobilized on the biosensor surface (biochips). Special attention is paid to the use of biosensor analysis in determining antibody-antigen interactions for the diagnostics of diseases caused by viruses and bacteria. According to the WHO, diseases caused by the herpes simplex virus (HSV) take second place (15.8%) after influenza as a cause of death from viral infections. Current diagnostics of HSV infection include PCR and ELISA assays; the latter allows determining the degree of immune response to viral infection and the respective stages of its progress. In this regard, the search for new and readily available diagnostic methods is very important. This work aimed to develop a biosensor chip for the detection of specific antibodies to HSV-1 in human blood serum. The proteins of HSV-1 (strain US) were used as antigens. The viral particles were accumulated in MDBK cell culture and purified by differential centrifugation in a cesium chloride density gradient. Analysis of the HSV-1 proteins was performed by polyacrylamide gel electrophoresis and ELISA. The protein concentration was measured using a DeNovix DS-11 spectrophotometer. The device for the detection of antigen-antibody interactions was the optoelectronic two-channel spectrometer ‘Plasmon-6’, which uses the SPR phenomenon in the Kretschmann optical configuration; it was developed at the Lashkarev Institute of Semiconductor Physics of NASU. The carrier used was a glass plate covered with a 45 nm gold film. Screening of human blood sera was performed using the test system ‘HSV-1 IgG ELISA’ (GenWay, USA).
Development of the biosensor chip included optimization of the conditions for viral antigen sorption and of the analysis steps. For immobilization of the viral proteins, a 0.2% solution of Dextran 17,200 (Sigma, USA) was used. Sorption of the antigen took place at 4-8°C within 18-24 hours. After washing the chip three times with citrate buffer (pH 5.0), a 1% solution of BSA was applied to block the sites not occupied by viral antigen. A direct dependence was found between the amount of immobilized HSV-1 antigen and the SPR response. Using the obtained biochips, a panel of human sera, 25 positive and 10 negative for antibodies to HSV-1, was analyzed. The average SPR response was 185 a.s. for negative sera and ranged from 312 to 1264 a.s. for positive sera. The SPR data agreed with the ELISA results in 96% of samples, proving the great potential of SPR in such research. The possibility of biochip regeneration was also investigated: it was shown that the application of a 10 mM NaOH solution leads to the rupture of intermolecular bonds, which allows the chip to be reused several times. Thus, in this study, a biosensor chip for the detection of specific antibodies to HSV-1 was successfully developed, expanding the range of diagnostic methods for this pathogen.

Keywords: biochip, herpes virus, SPR

Procedia PDF Downloads 410
14240 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life

Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar

Abstract:

In recent years, thanks to global technological advances, big data and artificial intelligence technologies have been widely used in various industries and fields, playing an important role in reducing costs and increasing efficiency. Among them, artificial intelligence has, through its own continuous progress and the continuous development of computing professionals, given rise to another branch: distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems, characterized by the ability to take advantage of large-scale computation and the spatial distribution of resources; accordingly, it can handle problems with large data sets. Nowadays, distributed AI is widely used in military, medical, and everyday settings, bringing great convenience and efficient operation to daily life. In this paper, we discuss distributed AI computing systems in three areas, vision processing, blockchain, and smart homes, to introduce the performance of distributed systems and the role of AI in them.

Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home

Procedia PDF Downloads 96
14239 Platform-as-a-Service Sticky Policies for Privacy Classification in the Cloud

Authors: Maha Shamseddine, Amjad Nusayr, Wassim Itani

Abstract:

In this paper, we present a Platform-as-a-Service (PaaS) model for controlling the privacy enforcement mechanisms applied to user data when stored and processed in Cloud data centers. The proposed architecture consists of establishing user-configurable ‘sticky’ policies on the Graphical User Interface (GUI) data-bound components during the application development phase to specify the details of privacy enforcement on the contents of these components. Various privacy classification classes on the data components are formally defined to give the user full control over the degree and scope of privacy enforcement, including the type of execution containers that process the data in the Cloud. This not only enhances the privacy awareness of the developed Cloud services but also results in major savings in performance and energy efficiency, because the privacy mechanisms are applied solely to sensitive data units rather than to all the user content. The proposed design is implemented in a real PaaS cloud computing environment on the Microsoft Azure platform.
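As a rough illustration of how ‘sticky’ privacy policies might be bound to GUI data components, consider the following Python sketch. The class name, privacy classes, component identifiers, and fields are invented for illustration; they are not part of the paper’s Azure implementation:

```python
from dataclasses import dataclass

# Hypothetical sketch: each GUI data-bound component carries a "sticky"
# policy that travels with its data and selects the enforcement applied
# in the Cloud. All names below are illustrative assumptions.

@dataclass(frozen=True)
class StickyPolicy:
    privacy_class: str        # e.g. "public", "confidential", "secret"
    encrypt_at_rest: bool     # apply encryption before storage
    trusted_container: bool   # require a hardened execution container

# Policies declared per component at application development time.
POLICIES = {
    "name_field": StickyPolicy("public", False, False),
    "ssn_field": StickyPolicy("secret", True, True),
}

def enforcement_for(component_id):
    """Look up the policy for a component; default to a cautious one."""
    return POLICIES.get(component_id, StickyPolicy("confidential", True, False))
```

Note how only the sensitive component (`ssn_field`) triggers the costly mechanisms, which is the source of the performance and energy savings the abstract describes.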

Keywords: privacy enforcement, platform-as-a-service privacy awareness, cloud computing privacy

Procedia PDF Downloads 207
14238 Functionality Based Composition of Web Services to Attain Maximum Quality of Service

Authors: M. Mohemmed Sha Mohamed Kunju, Abdalla A. Al-Ameen Abdurahman, T. Manesh Thankappan, A. Mohamed Mustaq Ahmed Hameed

Abstract:

Web service composition is an effective approach to completing web-based tasks with the desired quality. A single web service with limited functionality is inadequate to execute a specific task requiring a series of actions, so multiple web services with different functionalities must be combined to reach the target. Composition becomes even more challenging when these services come from different providers with identical functionalities but varying QoS; the overall QoS is therefore the major factor considered while composing the services. Moreover, the expected QoS is not always attained when the task is completed: a single web service in the composed chain may affect the overall performance of the task, so aspects such as the functionality of each service must be handled with care during composition. Dynamic and automatic service composition is one of the main options available, but to achieve the intended functionality of the task, the quality of the individual web services is also important. Normally, the QoS of an individual service can be evaluated using non-functional parameters such as response time, throughput, reliability, and availability. At the same time, the QoS need not be at the same level for all the composed services. This paper therefore proposes a framework that composes services in terms of QoS by setting an appropriate weight on the non-functional parameters of each individual web service involved in the task. Experimental results show that the importance given to the non-functional parameters during composition definitely improves the performance of the web services.
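The weighted-scoring idea can be sketched in a few lines of Python. The parameter names, the normalization convention (benefits in [0, 1], response time treated as a cost), the weights, and the simple additive score are all illustrative assumptions, not the paper’s actual framework:

```python
# Hypothetical sketch of weighted QoS scoring for web service composition.

def qos_score(service, weights):
    """Score one candidate service from its non-functional parameters.

    'response_time' is a cost (lower is better), so it is subtracted;
    the remaining parameters are benefits assumed normalized to [0, 1].
    """
    benefit = (weights["throughput"] * service["throughput"]
               + weights["reliability"] * service["reliability"]
               + weights["availability"] * service["availability"])
    cost = weights["response_time"] * service["response_time"]
    return benefit - cost

def compose(candidates_per_task, weights):
    """Greedily pick the best-scoring candidate for each task in the chain."""
    return [max(candidates, key=lambda s: qos_score(s, weights))
            for candidates in candidates_per_task]
```

Raising the weight on, say, `reliability` shifts the selection toward dependable services even at the price of a slower response, which is the kind of per-parameter trade-off the framework exposes.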

Keywords: composition, non-functional parameters, quality of service, web service

Procedia PDF Downloads 315
14237 Identifying the Structural Components of Old Buildings from Floor Plans

Authors: Shi-Yu Xu

Abstract:

The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.
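One elementary building block of such a pipeline, grouping the filled pixels of a rasterized plan into candidate column blobs, can be sketched as follows. The binary-grid input and 4-connectivity are simplifying assumptions for illustration; the paper’s actual recognition techniques are far more elaborate:

```python
from collections import deque

# Toy sketch: locating candidate structural columns in a floor plan
# reduced to a binary grid (1 = filled pixel). Connected groups of
# filled pixels are treated as candidate column cross-sections.

def find_blobs(grid):
    """Return one (row, col) seed per 4-connected blob of 1-pixels."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    seeds = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                seeds.append((r, c))          # new blob found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                  # flood-fill the blob
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return seeds
```

The resulting column positions are exactly the kind of spatial information the abstract proposes to feed into a digital structural model for preliminary seismic assessment.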

Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence

Procedia PDF Downloads 72
14236 Improved Multi–Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation

Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta

Abstract:

Recently, nature–inspired algorithms have found widespread use in tough and time–consuming multi–objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One of the major applications of OGRs is as an unequally spaced channel–allocation algorithm in optical wavelength division multiplexing (WDM) systems, where they minimize the adverse four–wave mixing (FWM) crosstalk effect. The simulation results conclude that the proposed optimization algorithm has superior performance compared to the existing conventional computing and nature–inspired optimization algorithms for finding OGRs in terms of ruler length, total optical channel bandwidth, and computation time.
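For reference, the defining property a candidate ruler must satisfy, that all pairwise differences between marks are distinct, can be checked in a few lines. This checker only illustrates the Golomb property; it is not the proposed firefly algorithm:

```python
from itertools import combinations

# A Golomb ruler is a set of integer marks whose pairwise differences
# are all distinct; an optimal Golomb ruler (OGR) is the shortest such
# ruler for a given number of marks.

def is_golomb(marks):
    """True if every pairwise difference between marks is unique."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))
```

For example, the marks {0, 1, 4, 6} form the optimal 4-mark ruler of length 6. In WDM channel allocation, the distinct pairwise differences translate into unequal channel spacings, so FWM mixing products tend not to fall on allocated channels.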

Keywords: channel allocation, conventional computing, four–wave mixing, nature–inspired algorithm, optimal Golomb ruler, lévy flight distribution, optimization, improved multi–objective firefly algorithms, Pareto optimal

Procedia PDF Downloads 303
14235 Parametric Study of Ball and Socket Joint for Bio-Mimicking Exoskeleton

Authors: Mukesh Roy, Basant Singh Sikarwar, Ravi Prakash, Priya Ranjan, Ayush Goyal

Abstract:

More than 11% of people suffer from weakness in the bones, resulting in an inability to walk or climb stairs, or from limited mobility of the upper body and limbs. This motivates a fresh bio-mimicking approach to the design of an exoskeleton to support human movement in cases of partial or total immobility, whether due to congenital or genetic factors, an accident, or gerontological factors. A deeper insight into and detailed understanding of the workings of ball and socket joints is required. Our research mimics ball and socket joints to design snugly fitting exoskeletons. Our objective is to design an exoskeleton that is comfortable and whose presence is not felt when not in use. Towards this goal, a parametric study is conducted to provide detailed design parameters for fabricating an exoskeleton. This work builds on real design data, so that the designed exoskeleton will be able to provide the required strength and support to the subject.

Keywords: bio-mimicking, exoskeleton, ball joint, socket joint, artificial limb, patient rehabilitation, joints, human-machine interface, wearable robotics

Procedia PDF Downloads 276
14234 Polarization as a Proxy of Misinformation Spreading

Authors: Michela Del Vicario, Walter Quattrociocchi, Antonio Scala, Ana Lucía Schmidt, Fabiana Zollo

Abstract:

Information, rumors, and debates may shape and impact public opinion heavily. In the latest years, several concerns have been expressed about social influence on the Internet and the outcome that online debates might have on real-world processes. Indeed, on online social networks users tend to select information that is coherent to their system of beliefs and to form groups of like-minded people –i.e., echo chambers– where they reinforce and polarize their opinions. In this way, the potential benefits coming from the exposure to different points of view may be reduced dramatically, and individuals' views may become more and more extreme. Such a context fosters misinformation spreading, which has always represented a socio-political and economic risk. The persistence of unsubstantiated rumors –e.g., the hypothetical and hazardous link between vaccines and autism– suggests that social media do have the power to misinform, manipulate, or control public opinion. As an example, current approaches such as debunking efforts or algorithmic-driven solutions based on the reputation of the source seem to prove ineffective against collective superstition. Indeed, experimental evidence shows that confirmatory information gets accepted even when containing deliberately false claims while dissenting information is mainly ignored, influences users’ emotions negatively and may even increase group polarization. Moreover, confirmation bias has been shown to play a pivotal role in information cascades, posing serious warnings about the efficacy of current debunking efforts. Nevertheless, mitigation strategies have to be adopted. To generalize the problem and to better understand social dynamics behind information spreading, in this work we rely on a tight quantitative analysis to investigate the behavior of more than 300M users w.r.t. news consumption on Facebook over a time span of six years (2010-2015). 
Through a massive analysis of 920 news outlet pages, we are able to characterize the anatomy of news consumption on a global and international scale. We show that users tend to focus on a limited set of pages (selective exposure), eliciting a sharp and polarized community structure among news outlets. Moreover, we find similar patterns around the Brexit debate (the British referendum on leaving the European Union), where we observe the spontaneous emergence of two well-segregated and polarized groups of users around news outlets. Our findings provide interesting insights into the determinants of polarization and the evolution of core narratives in online debates. Our main aim is to understand and map the information space on online social media by identifying non-trivial proxies for the early detection of massive informational cascades. Furthermore, by combining users' traces, we are finally able to draft the main concepts and beliefs of the core narrative of an echo chamber and its related perceptions.
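A statistic commonly used in this line of work to quantify echo-chamber behavior is a user's "leaning": the share of their interactions that go to one of two page communities, with values near 0 or 1 marking strongly polarized users. The sketch below is an illustrative reconstruction under that assumption, not the authors' exact measure:

```python
# Illustrative sketch: a user's leaning as the fraction of their
# interactions landing in community "1" of two news-page communities.
# Page names and community labels are invented for illustration.

def user_leaning(interactions, community_of_page):
    """interactions: page ids the user engaged with.
    community_of_page: maps each page id to 0 or 1.
    Returns a value in [0, 1]; values near the extremes suggest an
    echo-chamber user."""
    votes = [community_of_page[page] for page in interactions]
    return sum(votes) / len(votes)
```

Plotting this statistic over a large user population and finding a bimodal distribution, with mass concentrated near 0 and 1, is the usual signature of the segregated, polarized groups described above.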

Keywords: information spreading, misinformation, narratives, online social networks, polarization

Procedia PDF Downloads 274
14233 From Text to Data: Sentiment Analysis of Presidential Election Political Forums

Authors: Sergio V Davalos, Alison L. Watkins

Abstract:

User generated content (UGC), such as a website post, has data associated with it: time of the post, gender, location, type of device, and number of words. The text entered in UGC can provide a valuable dimension for analysis. In this research, each user post is treated as a collection of terms (words). In addition to the number of words per post, the frequency of each term is determined per post and as the sum of occurrences across all posts. This research focuses on one specific aspect of UGC: sentiment. Sentiment analysis (SA) was applied to the content (user posts) of two sets of political forums related to the US presidential elections of 2012 and 2016. Sentiment analysis derives data from the text, which enables the subsequent application of data analytic methods. The SASA (SAIL/SAI Sentiment Analyzer) model was used for sentiment analysis; applying SASA produced a sentiment score for each post. Based on the sentiment scores for the posts, there are significant differences between the content and sentiment of the forums for the 2012 and 2016 presidential elections. In the 2012 forums, 38% of the forums started with positive sentiment and 16% with negative sentiment. In the 2016 forums, 29% started with positive sentiment and 15% with negative sentiment. There were also changes in sentiment over time: for both elections, as the election drew closer, the cumulative sentiment score became negative. The candidate who won each election appeared in more posts than the losing candidate. In the case of Trump, the number of negative posts exceeded Clinton's highest number of posts, which were positive. KNIME topic modeling was used to derive topics from the posts. There were also changes in topics and keyword emphasis over time: initially, the political parties were the most referenced, and as the election drew closer the emphasis shifted to the candidates.
The SASA method proved to predict sentiment better than four other methods in the SentiBench benchmark. The research resulted in deriving sentiment data from text. In combination with other data, the sentiment data provided insight and discovery about user sentiment in the US presidential elections of 2012 and 2016.
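The bag-of-terms step described above, per-post term counts plus corpus-wide totals, can be sketched as follows, together with a toy lexicon score. SASA itself is a trained sentiment model; the tiny lexicon here is invented purely for illustration:

```python
import re
from collections import Counter

# Minimal sketch of treating each post as a collection of terms, with
# per-post frequencies and corpus-wide totals, plus a toy lexicon-based
# sentiment score (positive-term hits minus negative-term hits).

POS = {"great", "win", "good"}   # illustrative positive lexicon
NEG = {"bad", "lose", "awful"}   # illustrative negative lexicon

def term_frequencies(posts):
    """Return per-post term counts and corpus-wide totals."""
    per_post = [Counter(re.findall(r"[a-z']+", p.lower())) for p in posts]
    corpus = Counter()
    for counts in per_post:
        corpus.update(counts)
    return per_post, corpus

def sentiment(post):
    """Toy lexicon score: > 0 positive, < 0 negative, 0 neutral."""
    terms = re.findall(r"[a-z']+", post.lower())
    return sum(t in POS for t in terms) - sum(t in NEG for t in terms)
```

Summing such per-post scores over time gives the cumulative sentiment trajectory discussed above, and the corpus-wide term totals are the raw input for topic modeling.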

Keywords: sentiment analysis, text mining, user generated content, US presidential elections

Procedia PDF Downloads 173
14232 System of Quality Automation for Documents (SQAD)

Authors: R. Babi Saraswathi, K. Divya, A. Habeebur Rahman, D. B. Hari Prakash, S. Jayanth, T. Kumar, N. Vijayarangan

Abstract:

Document automation is the design of systems and workflows that assemble repetitive documents to meet specific business needs. In any organization or institution, documenting employees' information is very important for both employees and management, as it shows an individual's progress to the management. Many employee documents are on paper, so they are difficult to arrange, and retrieving the exact document for future reference takes considerable time. It is also tedious to generate reports according to specific needs, and the process becomes even more difficult when approvals are required, hence lacking in security. This project overcomes the above-stated issues. By storing the details in a database and maintaining e-documents, the automation system reduces manual work to a large extent. The approval process for important documents can then be carried out in a much more secure manner by using digital signatures and encryption techniques. Details are maintained in the database, e-documents are stored in specific folders, and the generation of various kinds of reports is possible. Moreover, an efficient search method is implemented in the database. Automation supporting document maintenance is useful in many respects: it minimizes data entry, reduces the time spent on proof-reading, avoids duplication, and reduces the risks associated with manual error.
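A minimal sketch of a tamper-evident approval step is shown below, assuming a shared approver key. A production system like the one described would use true asymmetric digital signatures (e.g. RSA or ECDSA via a crypto library); this stdlib-only HMAC stand-in only illustrates the idea of binding an approval token to a document hash:

```python
import hashlib
import hmac

# Hedged sketch: tag the SHA-256 digest of a document with a keyed HMAC,
# so tampering with either the document or the approval token is
# detectable. Real digital signatures would use an asymmetric key pair.

def approve(document: bytes, approver_key: bytes) -> str:
    """Produce an approval token bound to this exact document."""
    digest = hashlib.sha256(document).digest()
    return hmac.new(approver_key, digest, hashlib.sha256).hexdigest()

def verify(document: bytes, approver_key: bytes, token: str) -> bool:
    """Check the token against the document in constant time."""
    return hmac.compare_digest(approve(document, approver_key), token)
```

Storing the token alongside the e-document in the database gives each approved record an integrity check that survives later edits to the file.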

Keywords: e-documents, automation, digital signature, encryption

Procedia PDF Downloads 377
14231 Crisis Management and Corporate Political Activism: A Qualitative Analysis of Online Reactions toward Tesla

Authors: Roxana D. Maiorescu-Murphy

Abstract:

In the US, corporations have recently embraced political stances in an attempt to respond to the external pressure exerted by activist groups. To date, research in this area remains in its infancy, and few studies have been conducted on the way stakeholder groups respond to corporate political advocacy in general, and in the immediacy of such a corporate announcement in particular. The current study aims to fill this research void. In addition, the study contributes to an emerging trajectory in the field of crisis management by focusing on the delineation between crises (unexpected events related to products and services) and scandals (crises that spur moral outrage). The present study looked at online reactions in the aftermath of Elon Musk’s endorsement of the Republican party on Twitter. Two data sets were collected from Twitter following two political endorsements made by Elon Musk, on May 18, 2022, and June 15, 2022, respectively. The total sample stemming from the two data sets consisted of N=1,374 user comments written in response to Musk’s initial tweets. Given the paucity of studies in the preceding research areas, the analysis employed a case study methodology, used in circumstances in which the phenomena to be studied have not been researched before. According to the case study methodology, which answers the questions of how and why a phenomenon occurs, this study responded to the research questions of how online users perceived Tesla and why they did so. The data were analyzed in NVivo through grounded theory methodology, which implied multiple exposures to the text and an inductive-deductive approach. Through multiple exposures to the data, the researcher ascertained the common themes and subthemes in the online discussion. Each theme and subtheme was then defined and labeled, and additional exposures to the text ensured that these were exhaustive.
The results revealed that the CEO’s political endorsements triggered moral outrage, leading Tesla to face a scandal as opposed to a crisis. The moral outrage revolved around the stakeholders’ predominant rejection of a perceived intrusion by an influential figure into a domain reserved for voters. As expected, Musk’s political endorsements led to polarizing opinions, and those who opposed his views engaged in online activism aiming to boycott the Tesla brand. These findings reveal that the moral outrage characterizing a scandal requires communication practices that differ from those that practitioners currently borrow from the field of crisis management. Specifically, because scandals flourish in online settings, practitioners should regularly monitor stakeholder perceptions and address them in real time. While promptness is essential when managing crises, it becomes crucial to respond immediately while a scandal is flourishing online. Finally, attempts should be made to distance a brand, its products, and its CEO from the latter’s political views.

Keywords: crisis management, communication management, Tesla, corporate political activism, Elon Musk

Procedia PDF Downloads 79
14230 Entrepreneurship as a Strategy for National Development and Attainment of Millennium Development Goals (MDGs)

Authors: Udokporo Emeka Leonard

Abstract:

The thrust of this paper is to examine how entrepreneurship can assist in the attainment of the first of the MDGs: the eradication of extreme poverty and hunger in Nigeria. The paper discusses how national development can be driven through employment creation and wealth generation, leading to a reduction in widespread poverty so as to attain this crucial target in fewer years. The task before Nigeria is certainly a herculean one; it is, in fact, a race against time. However, in view of the clear and present danger that the increasing rate of poverty portends for our democracy and our nation, it is a race we must run, for it is a time bomb in our hands. The paper has been structured into sections, with the introduction as section one. Section two discusses the concept of entrepreneurship; section three examines the link between entrepreneurship and economic development, while section four examines the challenges facing entrepreneurship in Nigeria. In section five, measures and recommendations are suggested to boost entrepreneurship so that it can drive economic development that translates into poverty reduction and employment creation in Nigeria. This work is a literature review informed by an understanding of current trends and situations. It outlines some of the difficulties facing entrepreneurship in Nigeria, such as the operating environment, inadequate understanding, and skewed incentives. It also makes recommendations on possible ways to significantly reduce poverty by 2015.

Keywords: development, entrepreneur, Nigeria, poverty

Procedia PDF Downloads 266