Search results for: data interpolating empirical orthogonal function

26371 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding autonomous vehicles, by exploiting multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to help the vehicle reach its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of Fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they might also store sensitive information required for a certain application. Data management in fog nodes is strenuous because Fog networks are dynamic in terms of their availability and hardware capability. It becomes more challenging when the nodes in the network are also short-lived, detaching and joining frequently. When an end-user or Fog node wants to access, read, or write data stored in another Fog node, a new protocol becomes necessary to access and manage the data stored in the fog devices, because the conventional static way of managing data does not work in Fog networks. The proposed solution is a protocol that defines sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the Fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the Fog node through the application of Reinforcement Learning, so that access to the data is determined dynamically based on the requests.
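
As an illustration of the access-control idea, the sketch below pairs hypothetical data sensitivity levels with a simple tabular Q-learning update that learns when to grant or deny requests; the sensitivity labels, trust buckets, and reward values are assumptions for illustration and are not taken from the paper.

```python
import random
from collections import defaultdict

# Hypothetical sensitivity levels for data stored on a fog node (illustrative only).
SENSITIVITY = {"public": 0, "internal": 1, "sensitive": 2}
ACTIONS = ["grant", "deny"]

# Tabular Q-values: state = (sensitivity level, requester trust bucket), action = grant/deny.
Q = defaultdict(float)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration

def choose_action(state):
    """Epsilon-greedy selection over grant/deny for a data access request."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard one-step Q-learning update."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# Toy interaction loop: reward granting low-sensitivity requests, penalise
# granting sensitive data to low-trust requesters (reward scheme is assumed).
for _ in range(1000):
    state = (random.choice(list(SENSITIVITY.values())), random.randint(0, 2))
    action = choose_action(state)
    sensitivity, trust = state
    reward = 1.0 if (action == "grant" and trust >= sensitivity) or \
                    (action == "deny" and trust < sensitivity) else -1.0
    update(state, action, reward, state)  # next state reused for brevity

print(sorted(Q.items())[:6])
```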

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 43
26370 An Automated Approach to Consolidate Galileo System Availability

Authors: Marie Bieber, Fabrice Cosson, Olivier Schmitt

Abstract:

Europe's Global Navigation Satellite System, Galileo, provides worldwide positioning and navigation services. The satellites in space are only one part of the Galileo system. An extensive ground infrastructure is essential to oversee the satellites and ensure accurate navigation signals. High reliability and availability of the entire Galileo system are crucial to continuously provide positioning information of high quality to users. Outages are tracked, and operational availability is regularly assessed. A highly flexible and adaptive tool has been developed to automate the Galileo system availability analysis. Not only does it enable a quick availability consolidation, but it also provides first steps towards improving the data quality of the maintenance tickets used for the analysis. This includes data import and data preparation, with a focus on processing strings used for classification and identifying faulty data. Furthermore, the tool can handle a low amount of data, which is a major constraint when the aim is to provide accurate statistics.
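
A minimal sketch of how operational availability can be consolidated from maintenance tickets, assuming a hypothetical ticket format (element name plus outage start and end); the real tool's string processing and data-quality checks are not reproduced here.

```python
from datetime import datetime

# Hypothetical maintenance tickets: (element, outage start, outage end); format is assumed.
tickets = [
    ("uplink-station-A", "2023-01-03 08:00", "2023-01-03 12:30"),
    ("uplink-station-A", "2023-02-10 00:15", "2023-02-10 02:00"),
    ("control-centre-B", "2023-01-20 06:00", "2023-01-21 06:00"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

period_hours = 365 * 24  # one reporting year

downtime = {}
for element, start, end in tickets:
    hours = (parse(end) - parse(start)).total_seconds() / 3600.0
    downtime[element] = downtime.get(element, 0.0) + hours

for element, down in sorted(downtime.items()):
    availability = 100.0 * (period_hours - down) / period_hours
    print(f"{element}: downtime {down:.1f} h, operational availability {availability:.3f} %")
```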

Keywords: availability, data quality, system performance, Galileo, aerospace

Procedia PDF Downloads 147
26369 Design Flood Estimation in Satluj Basin-Challenges for Sunni Dam Hydro Electric Project, Himachal Pradesh-India

Authors: Navneet Kalia, Lalit Mohan Verma, Vinay Guleria

Abstract:

Introduction: Design flood studies are essential for the effective planning and functioning of water resource projects. Design flood estimation for the Sunni Dam Hydro Electric Project, located in the State of Himachal Pradesh, India, on the river Satluj, was a big challenge in view of the river flowing in the Himalayan region from Tibet to India, having a large catchment area of varying topography, climate, and vegetation. No discharge data were available for the part of the river in Tibet, whereas for India they were available only at Khab, Rampur, and Luhri. The estimation of the design flood using standard methods was not possible. This challenge was met using two different approaches for the upper (snow-fed) and lower (rain-fed) catchments: a flood frequency approach and a hydro-meteorological approach. i) For the catchment up to the Khab gauging site (sub-catchment C1), the flood frequency approach was used. Around 90% of the catchment area up to Khab (46,300 sq km) is snow-fed and lies above 4,200 m. In view of the catchment being predominantly snow-fed, the 1-in-10,000-year return period flood estimated at Khab using flood frequency analysis was considered as the Probable Maximum Flood (PMF). The flood peaks were taken from daily observed discharges at Khab, which were increased by 10% to make them instantaneous. The design flood of 4184 cumec thus obtained was considered as the PMF at Khab. ii) For the catchment between Khab and the Sunni Dam (sub-catchment C2), the hydro-meteorological approach was used. This method is based upon the catchment response to the rainfall pattern observed (Probable Maximum Precipitation, PMP) in a particular catchment area. The design flood computation mainly involves the estimation of a design storm hyetograph and the derivation of the catchment response function. A unit hydrograph is assumed to represent the response of the entire catchment area to a unit rainfall. The main advantage of the hydro-meteorological approach is that it gives a complete flood hydrograph, which allows a realistic determination of its moderation effect while passing through a reservoir or a river reach. These studies were carried out to derive the PMF for the catchment area between Khab and the Sunni Dam site using 1-day and 2-day PMP values of 232 and 416 cm, respectively. The PMF so obtained was 12920.60 cumec. Final Result: As the catchment area up to the Sunni Dam has been divided into two sub-catchments, the flood hydrograph for catchment C1 was routed through the connecting channel reach (river Satluj) using the Muskingum method, and the design flood was computed by adding the routed flood ordinates to the flood ordinates of catchment C2. The total design flood (i.e., the 2-day PMF) with a peak of 15473 cumec was obtained. Conclusion: Even though several factors are relevant while deciding the method to be used for design flood estimation, data availability and the purpose of the study are the most important. Since, generally, we cannot wait for hydrological data of adequate quality and quantity to become available, flood estimation has to be done using whatever data are available. The method to be used is therefore selected depending upon the type of data available for a particular catchment.
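
For the routing step mentioned in the final result, a minimal sketch of the Muskingum method is given below; the reach parameters K and X, the time step, and the hydrograph ordinates are illustrative assumptions, not the values used for the Satluj reach.

```python
def muskingum_route(inflow, K, X, dt, initial_outflow=None):
    """Route a flood hydrograph through a river reach with the Muskingum method.
    inflow: list of inflow ordinates (cumec), K: storage constant (h),
    X: weighting factor (0-0.5), dt: time step (h)."""
    denom = 2.0 * K * (1.0 - X) + dt
    c1 = (dt - 2.0 * K * X) / denom
    c2 = (dt + 2.0 * K * X) / denom
    c3 = (2.0 * K * (1.0 - X) - dt) / denom  # c1 + c2 + c3 == 1
    outflow = [inflow[0] if initial_outflow is None else initial_outflow]
    for j in range(1, len(inflow)):
        outflow.append(c1 * inflow[j] + c2 * inflow[j - 1] + c3 * outflow[j - 1])
    return outflow

# Illustrative hydrograph and reach parameters (not the Satluj values).
upstream = [100, 400, 1200, 2500, 4184, 3600, 2400, 1300, 700, 300, 150]
routed = muskingum_route(upstream, K=12.0, X=0.2, dt=6.0)
lateral = [0.5 * q for q in upstream]  # placeholder intermediate-catchment hydrograph
total = [r + l for r, l in zip(routed, lateral)]
print(f"peak of combined design flood: {max(total):.0f} cumec")
```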

Keywords: design flood, design storm, flood frequency, PMF, PMP, unit hydrograph

Procedia PDF Downloads 311
26368 Compressible Lattice Boltzmann Method for Turbulent Jet Flow Simulations

Authors: K. Noah, F.-S. Lien

Abstract:

In Computational Fluid Dynamics (CFD), there are a variety of numerical methods, some of which are based on macroscopic model representations; these models can be solved by finite-volume, finite-element, or finite-difference methods. The lattice Boltzmann method (LBM), however, is considered a mesoscopic particle method, with its scale lying between the macroscopic and microscopic scales. The LBM works well for solving incompressible flow problems, but certain limitations arise when solving compressible flows, particularly at high Mach numbers. An improved lattice Boltzmann model for compressible flow problems is presented in this research study. A higher-order Taylor series expansion of the Maxwell equilibrium distribution function is used to overcome the limitations of the LBM when solving high-Mach-number flows. Large eddy simulation (LES) is implemented in the LBM to simulate turbulent jet flows. The results have been validated with available experimental data for turbulent compressible free jet flow at subsonic speeds.
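
A minimal sketch of the idea behind a higher-order equilibrium, assuming a standard D2Q9 lattice: the usual second-order truncation of the Maxwell equilibrium is extended with third-order velocity terms (the paper's full compressible model and LES coupling are not reproduced here).

```python
import numpy as np

# D2Q9 lattice: weights and discrete velocities; cs^2 = 1/3 in lattice units.
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
e = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]], dtype=float)
cs2 = 1.0 / 3.0

def equilibrium(rho, u, third_order=True):
    """Maxwell equilibrium truncated at second or third order in the velocity u.
    rho: scalar density, u: velocity vector (2,). The optional third-order terms
    illustrate the kind of higher-order expansion used for compressible flows."""
    eu = e @ u                      # e_i . u for each lattice direction
    u2 = u @ u
    feq = 1.0 + eu / cs2 + eu**2 / (2 * cs2**2) - u2 / (2 * cs2)
    if third_order:
        feq += eu**3 / (6 * cs2**3) - eu * u2 / (2 * cs2**2)
    return rho * w * feq

print(equilibrium(1.0, np.array([0.1, 0.05])))
```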

Keywords: compressible lattice Boltzmann method, multiple relaxation times, large eddy simulation, turbulent jet flows

Procedia PDF Downloads 261
26367 Adaptive Control of Magnetorheological Damper Using Duffing-Like Model

Authors: Hung-Jiun Chi, Cheng-En Tsai, Jia-Ying Tu

Abstract:

Semi-active control of magnetorheological (MR) dampers for vibration reduction of structural systems has received considerable attention in civil and earthquake engineering, because the effective stiffness and damping properties of MR fluid can change in a very short time in reaction to external loading, requiring only a low level of power. However, the inherent nonlinear dynamics of hysteresis raise challenges in the modeling and control processes. In order to control the MR damper, an innovative Duffing-like equation is proposed to approximate the hysteresis dynamics in a more deterministic and systematic manner than has previously been possible. Then, the model-reference adaptive control technique based on the Duffing-like model and the Lyapunov method is discussed. Parameter identification work with experimental data is presented to show the effectiveness of the Duffing-like model. In addition, simulation results show that the resulting adaptive gains enable the MR damper force to track the desired response of the reference model satisfactorily, verifying the effectiveness of the proposed modeling and control techniques.
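
A minimal sketch of a Duffing-type oscillator with linear and cubic stiffness plus viscous damping, which is the kind of equation the paper uses to approximate the damper hysteresis; all parameter values and the harmonic excitation are assumptions for illustration, not identified MR-damper parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters for a Duffing-type oscillator (not identified from MR-damper data).
m, c, k1, k3 = 1.0, 0.3, 1.0, 0.8     # mass, damping, linear and cubic stiffness
F0, omega = 1.2, 1.0                  # harmonic excitation amplitude and frequency

def duffing(t, y):
    x, v = y
    # m*x'' + c*x' + k1*x + k3*x^3 = F0*cos(omega*t)
    return [v, (F0 * np.cos(omega * t) - c * v - k1 * x - k3 * x**3) / m]

t_eval = np.linspace(0, 60, 3000)
sol = solve_ivp(duffing, (0, 60), [0.0, 0.0], t_eval=t_eval)

# Restoring-plus-damping force; plotted against displacement it traces a
# hysteresis-like loop of the kind the Duffing-like model approximates.
force = k1 * sol.y[0] + k3 * sol.y[0]**3 + c * sol.y[1]
print(f"max displacement: {sol.y[0].max():.3f}, max force: {force.max():.3f}")
```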

Keywords: magnetorheological damper, Duffing equation, model-reference adaptive control, Lyapunov function, hysteresis

Procedia PDF Downloads 355
26366 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and sometimes the chemical and physical phenomena for mixtures involving polymers are poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge about its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number average molecular weight, and weight average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor, a combination of k-nearest neighbor and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
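
A minimal sketch of the adaptive-sampling idea followed by a comparison of off-the-shelf regressors (scikit-learn assumed); the surrogate "process" curve, the sampling budget, and the model settings are illustrative and do not reproduce the polymerization model or the original large margin nearest neighbor algorithm.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.metrics import r2_score

# Surrogate "process" standing in for a conversion curve with sharp gel/glass-like effects.
def process(t):
    return 1.0 / (1.0 + np.exp(-8 * (t - 0.5))) + 0.05 * np.sin(20 * t)

# Adaptive sampling: start coarse, then insert points where the local variation is largest.
t = np.linspace(0, 1, 15)
for _ in range(40):
    y = process(t)
    i = np.argmax(np.abs(np.diff(y)))            # interval with the highest variation
    t = np.sort(np.append(t, 0.5 * (t[i] + t[i + 1])))

X, y = t.reshape(-1, 1), process(t)
X_test = np.linspace(0, 1, 200).reshape(-1, 1)
y_test = process(X_test.ravel())

for name, model in [("SVR", SVR(C=10.0)),
                    ("kNN", KNeighborsRegressor(n_neighbors=3)),
                    ("RandomForest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    model.fit(X, y)
    print(f"{name}: R2 = {r2_score(y_test, model.predict(X_test)):.3f}")
```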

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 293
26365 Teacher's Professional Burnout and Its Relationship with the Power of Self-Efficacy and Perceived Stress

Authors: Vilma Zydziunaite, Ausra Rutkiene

Abstract:

In modern society, problems related to the teacher's personality, mental and physical health, emotions, and competencies are becoming more and more relevant. In Lithuania, compared to other European countries, teachers experience specific difficulties at work: they have to work in conditions of constant reforms and changes and face growing competition due to the decrease in students and schools. Professional burnout, teacher self-efficacy, and perceived stress are interrelated personally and/or organisationally. The relationship between teachers' professional burnout, self-efficacy, and perceived stress in the school environment thus seems to be a relatively under-researched area in Lithuania. The research aim was to reveal and characterize teacher burnout, self-efficacy, and perceived stress in the Lithuanian school context. A quantitative research design with a questionnaire survey was chosen for the study. The sample consisted of 427 Lithuanian teachers. The results revealed the highest scores for exhaustion and the lowest for cynicism; when a teacher experiences professional burnout, cynicism is observed as the weakest characteristic. No significant differences were found according to educational level or work experience, while significant differences were identified according to age for exhaustion and the overall burnout level among teachers. Most teachers in the Lithuanian sample perceive a moderate stress level in the school environment, and overall burnout has a significant correlation with self-efficacy and stress among Lithuanian teachers. This study has empirical and practical implications: it is relevant to study the problems of teachers' professional burnout, stress, and self-efficacy in connection with contextual qualitative variables and to specify the interrelationships between variables in order to identify specific problems and provide empirical evidence to solve them in practice. From a practical point of view, the results show that the socio-emotional state of teachers should not be dismissed as an insignificant aspect. Therefore, the school administration must make efforts to develop a positive school climate that supports the socio-emotional state of the teacher. At the same time, the school administration must pay great attention to the development of teachers' socio-emotional competencies without ignoring their importance in the teacher's professional life.

Keywords: Lithuania, perceived stress, professional burnout, self-efficacy, teacher

Procedia PDF Downloads 36
26364 The Impact of the General Data Protection Regulation on Human Resources Management in Schools

Authors: Alexandra Aslanidou

Abstract:

The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and the free movement of such data, became applicable in the European Union (EU) on 25 May 2018 and transformed the way personal data were treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes in both the public sector and business. A social practice that is considerably influenced in its day-to-day operations is Human Resource (HR) management, for which the importance of the GDPR cannot be underestimated. That is because HR processes personal data coming in all shapes and sizes from many different systems and sources. The proper functioning of an HR department is decisive, specifically in human-centered, service-oriented environments such as the education field, because HR operations in schools, conducted effectively, determine the quality of the provided services and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that the GDPR plays in HR departments that operate in schools. In order to practically evaluate the aftermath of the Regulation during the first months of its applicability, a comparative use-case analysis was carried out in five highly dynamic schools across three EU Member States.

Keywords: general data protection regulation, human resource management, educational system

Procedia PDF Downloads 89
26363 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data

Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, there is a tremendous amount of real-time spatial data generated every day. The growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period, regardless of the load on the system. But with a huge amount of real-time spatial data generated, the system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data; they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. We first find the optimal attribute sequence by the use of the Matching algorithm. Then, we propose a new cost model for database partitioning that keeps the data amount of each partition balanced within a limit and provides parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and deals with stream data. It improves the performance of query execution by maximizing the degree of parallel execution. This affects QoS (Quality of Service) improvement in real-time spatial Big Data, especially with a huge volume of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable, and that it outperforms comparable algorithms.
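
A minimal sketch of Hamming-distance-based attribute grouping for vertical partitioning, assuming a toy attribute usage matrix; the actual Matching algorithm, cost model, and real-time constraints of VPA-RTSBD are more elaborate than this greedy grouping.

```python
import numpy as np

# Attribute usage matrix: rows = frequent queries, columns = attributes.
# A 1 means the query touches that attribute (values are illustrative).
usage = np.array([
    [1, 1, 0, 0, 1],   # q1
    [1, 1, 0, 0, 0],   # q2
    [0, 0, 1, 1, 0],   # q3
    [0, 0, 1, 1, 1],   # q4
])
attributes = ["obj_id", "location", "speed", "timestamp", "payload"]

def hamming(a, b):
    return int(np.sum(a != b))

# Pairwise Hamming distance between attribute usage columns: a small distance means
# the attributes are requested together and should land in the same vertical fragment.
n = usage.shape[1]
dist = np.array([[hamming(usage[:, i], usage[:, j]) for j in range(n)] for i in range(n)])

# Greedy grouping: put an attribute in the fragment of its nearest already-placed attribute.
threshold = 1
fragments = []
for i in range(n):
    for frag in fragments:
        if min(dist[i, j] for j in frag) <= threshold:
            frag.append(i)
            break
    else:
        fragments.append([i])

for k, frag in enumerate(fragments, 1):
    print(f"fragment {k}: {[attributes[i] for i in frag]}")
```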

Keywords: real-time spatial big data, quality of service, vertical partitioning, horizontal partitioning, matching algorithm, hamming distance, stream query

Procedia PDF Downloads 147
26362 Characterization and Correlation of Neurodegeneration and Biological Markers of Model Mice with Traumatic Brain Injury and Alzheimer's Disease

Authors: J. DeBoard, R. Dietrich, J. Hughes, K. Yurko, G. Harms

Abstract:

Alzheimer’s disease (AD) is a predominant type of dementia and is likely a major cause of neural network impairment. The pathogenesis of this neurodegenerative disorder has yet to be fully elucidated. There are currently no known cures for the disease, and the best hope is to be able to detect it early enough to impede its progress. Beyond age and genetics, another prevalent risk factor for AD might be traumatic brain injury (TBI), which has similar neurodegenerative hallmarks. Our research focuses on obtaining information and methods to be able to predict when neurodegenerative effects might occur at a clinical level by observation of events at a cellular and molecular level in model mice. First, we wish to introduce our evidence that brain damage can be observed via brain imaging prior to the noticeable loss of neuromuscular control in model mice of AD. We then show our evidence that some blood biomarkers might be able to be early predictors of AD in the same model mice. Thus, we were interested to see if we might be able to predict which mice might show long-term neurodegenerative effects due to differing degrees of TBI and what level of TBI causes further damage and earlier death to the AD model mice. Upon application of TBIs via an apparatus to effectively induce extremely mild to mild TBIs, wild-type (WT) mice and AD mouse models were tested for cognition, neuromuscular control, olfactory ability, blood biomarkers, and brain imaging. Experiments are currently still in process, and more results are therefore forthcoming. Preliminary data suggest that neuromotor control diminishes as well as olfactory function for both AD and WT mice after the administration of five consecutive mild TBIs. Also, seizure activity increases significantly for both AD and WT after the administration of the five TBI treatment. If future data supports these findings, important implications about the effect of TBI on those at risk for AD might be possible.

Keywords: Alzheimer's disease, blood biomarker, neurodegeneration, neuromuscular control, olfaction, traumatic brain injury

Procedia PDF Downloads 134
26361 Project Management Framework and Influencing Factors

Authors: Mehrnoosh Askarizadeh

Abstract:

The increasing variability of the business world corresponds with a high diversity of theoretical perspectives used in project management research. This diversity is reflected by a variety of influencing factors, which have been the subject of empirical studies. This article aims to systematize the different streams of research on the basis of a literature review and to develop a research framework of influencing factors. We identify fundamental elements of a project management theory. The framework consists of three dimensions: design, context, and goal. Its purpose is to support the combination of different perspectives and the development of strategies for further research.

Keywords: project, goal, project management, influencing factors

Procedia PDF Downloads 525
26360 The Change of Urban Land Use/Cover Using Object Based Approach for Southern Bali

Authors: I. Gusti A. A. Rai Asmiwyati, Robert J. Corner, Ashraf M. Dewan

Abstract:

Change in land use/cover (LULC) strongly affects spatial structure and function. It can have such impacts by disrupting social and cultural practices and disturbing physical elements. Thus, it has become essential to understand the dynamics of LULC in time and space, as they can be used as a critical input for developing sustainable LULC. This study was an attempt to map and monitor the LULC change in Bali, Indonesia, from 2003 to 2013. Using object-based classification to improve the accuracy, and change detection, multi-temporal land use/cover data were extracted from a set of ASTER satellite images. The overall accuracies of the classification maps of 2003 and 2013 were 86.99% and 80.36%, respectively. Built-up area and paddy field were the dominant types of land use/cover in both years. The dominant increase in patches in 2003 illustrated rapid paddy field fragmentation and the huge transformation taking place. This approach is new for the diverse urban features of Bali, which has been growing fast, and it increased the classification accuracy compared with manual pixel-based classification.
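
A minimal sketch of how overall accuracy (and Cohen's kappa) can be computed from an error matrix; the matrix values below are invented for illustration and are not the matrices behind the reported 86.99% and 80.36% accuracies.

```python
import numpy as np

# Illustrative error matrix for one classification date
# (rows = reference classes, columns = mapped classes).
cm = np.array([
    [120,   6,   4],   # built-up
    [  8, 140,  10],   # paddy field
    [  5,   7,  60],   # other
])

total = cm.sum()
overall_accuracy = np.trace(cm) / total

# Cohen's kappa: agreement corrected for chance agreement.
row_marginals = cm.sum(axis=1)
col_marginals = cm.sum(axis=0)
expected = (row_marginals * col_marginals).sum() / total**2
kappa = (overall_accuracy - expected) / (1 - expected)

print(f"overall accuracy = {overall_accuracy:.2%}, kappa = {kappa:.3f}")
```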

Keywords: land use/cover, urban, Bali, ASTER

Procedia PDF Downloads 534
26359 A Joint Possibilistic-Probabilistic Tool for Load Flow Uncertainty Assessment-Part I: Formulation

Authors: Morteza Aien, Masoud Rashidinejad, Mahmud Fotuhi-Firuzabad

Abstract:

As energy and environmental issues are getting more and more attention all around the world, the penetration of distributed energy resources (DERs), mainly those harvesting renewable energies (REs), ascends at an unprecedented rate. This causes more uncertainties to appear in the power system context; ergo, the uncertainty analysis of the system performance is an obligation. The uncertainties of any system can be represented probabilistically or possibilistically. Since sufficient historical data are not available for all the system variables, some of them do not have a probability density function (PDF) and must be represented possibilistically. When some of the system's uncertain variables are probabilistic and some are possibilistic, neither the conventional pure probabilistic nor pure possibilistic methods can be implemented. Hence, a combined solution is appealed to. The first of this two-paper series formulates a new possibilistic-probabilistic tool for load flow uncertainty assessment. The proposed methodology is based on evidence theory and the joint propagation of possibilistic and probabilistic uncertainties. This possibilistic-probabilistic formulation is solved in the second companion paper in an uncertain load flow (ULF) study problem.
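
A minimal sketch of joint propagation, assuming one probabilistic variable (sampled by Monte Carlo) and one possibilistic variable (propagated through alpha-cuts of a triangular possibility distribution); a linear surrogate stands in for the load flow equations, which are solved in the companion paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def alpha_cut_triangular(a, b, c, alpha):
    """Interval of a triangular possibility distribution (a, b, c) at level alpha."""
    return a + alpha * (b - a), c - alpha * (c - b)

# Toy model output: a line flow depending on a probabilistic load and a
# possibilistic wind-power injection (linear surrogate, illustrative only).
def line_flow(load, wind):
    return 0.8 * load - 0.5 * wind

n_mc, alphas = 2000, np.linspace(0.0, 1.0, 11)
lower_samples, upper_samples = [], []

for _ in range(n_mc):
    load = rng.normal(100.0, 10.0)              # probabilistic variable (historical data)
    lows, highs = [], []
    for alpha in alphas:                        # possibilistic variable (expert judgement)
        w_lo, w_hi = alpha_cut_triangular(10.0, 25.0, 40.0, alpha)
        flows = [line_flow(load, w_lo), line_flow(load, w_hi)]
        lows.append(min(flows))
        highs.append(max(flows))
    lower_samples.append(min(lows))
    upper_samples.append(max(highs))

print(f"95th percentile of the flow envelope: "
      f"[{np.percentile(lower_samples, 95):.1f}, {np.percentile(upper_samples, 95):.1f}]")
```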

Keywords: probabilistic uncertainty modeling, possibilistic uncertainty modeling, uncertain load flow, wind turbine generator

Procedia PDF Downloads 546
26358 Predicting Groundwater Areas Using Data Mining Techniques: Groundwater in Jordan as Case Study

Authors: Faisal Aburub, Wael Hadi

Abstract:

Data mining is the process of extracting useful or hidden information from a large database. Extracted information can be used to discover relationships among features, where data objects are grouped according to logical relationships, or to predict unseen objects into one of the predefined groups. In this paper, we aim to investigate four well-known data mining algorithms in order to predict groundwater areas in Jordan. These algorithms are Support Vector Machines (SVMs), Naïve Bayes (NB), K-Nearest Neighbor (kNN), and Classification Based on Association Rule (CBA). The experimental results indicate that the SVMs algorithm outperformed the other algorithms in terms of classification accuracy, precision, and F1 evaluation measures using the datasets of groundwater areas that were collected from the Jordanian Ministry of Water and Irrigation.
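
A minimal sketch of the comparison on synthetic data, assuming scikit-learn implementations of SVM, NB, and kNN (CBA has no standard scikit-learn implementation and is omitted); the features and labels are invented and do not represent the Ministry's dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, precision_score, f1_score

# Synthetic stand-in for the groundwater dataset: hypothetical features such as
# rainfall, elevation, lineament density, etc.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] - 0.8 * X[:, 2] + rng.normal(0, 0.5, 500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {"SVM": SVC(), "NB": GaussianNB(), "kNN": KNeighborsClassifier(n_neighbors=5)}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: accuracy={accuracy_score(y_te, pred):.3f} "
          f"precision={precision_score(y_te, pred):.3f} F1={f1_score(y_te, pred):.3f}")
```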

Keywords: classification, data mining, evaluation measures, groundwater

Procedia PDF Downloads 265
26357 Jurisdictional Issues between Competition Law and Data Protection Law in Protection of Privacy of Online Consumers

Authors: Pankhudi Khandelwal

Abstract:

The revenue models of digital giants such as Facebook and Google rely on targeted advertising, which requires huge amounts of consumer data. While data protection law deals with the protection of personal data, this data is acquired by the companies on the basis of consent, performance of a contract, or legitimate interests. This paper analyses the role that competition law can play in closing these loopholes for the protection of the data and privacy of online consumers. Digital markets have certain distinctive features, such as network effects and feedback loops, which give incumbents in these markets a first-mover advantage. This creates a situation where the winner takes it all, thus creating entry barriers and concentration in the market. It has also been seen that this dominant position is then used by the undertakings for leveraging into other markets. This can be harmful to consumers in the form of less privacy, less choice, and stifled innovation, as seen in the cases of Facebook Cambridge Analytica, Google Shopping, and Google Android. Therefore, the article aims to provide a legal framework wherein data protection law and competition law can come together to provide balance in regulating digital markets. The issue has become more relevant in light of the Facebook decision by the German competition authority, where it was held that Facebook had abused its dominant position by not complying with data protection rules, which constituted an exploitative practice. The paper looks into the jurisdictional boundaries that the data protection and competition authorities can work from and suggests ex ante regulation through data protection law and ex post regulation through competition law. It further suggests a change in the consumer welfare standard, where harm to privacy should be considered as an indicator of low quality.

Keywords: data protection, dominance, ex ante regulation, ex post regulation

Procedia PDF Downloads 159
26356 Q Slope Rock Mass Classification and Slope Stability Assessment Methodology Application in Steep Interbedded Sedimentary Rock Slopes for a Motorway Constructed North of Auckland, New Zealand

Authors: Azariah Sosa, Carlos Renedo Sanchez

Abstract:

The development of a new motorway north of Auckland (New Zealand) includes steep rock cuts, from 63 up to 85 degrees, in an interbedded sandstone and siltstone rock mass of the geological unit Waitemata Group (Pakiri Formation), which shows sub-horizontal bedding planes, various sub-vertical joint sets, and a diverse weathering profile. In this kind of rock mass, which can be classified as a weak rock, the definition of the stable maximum geometry is not only governed by the discontinuities and defects evident in the rock; it is also important to consider the global stability of the rock slope, including in the analysis the rock mass characterisation, the influence of groundwater, the geological evolution, and the weathering processes. Depending on the weakness of the rock and the processes it has suffered, the global stability could, in fact, be a more restricting element than the potential instability of individual blocks along discontinuities. This paper discusses the elements that govern the stability of rock slopes constructed in a rock formation with a favourable bedding and distribution of discontinuities (horizontal and vertical) but with weak behaviour in terms of the global rock mass characterisation. In this context, classifications such as Q-slope and the slope stability assessment methodology (SSAM) have been demonstrated to be important tools which complement the assessment of global stability, together with the analytical tools related to wedge-type failures and limit equilibrium methods. The paper focuses on the applicability of these two new empirical classifications for evaluating slope stability in 18 already excavated rock slopes in the Pakiri Formation, through comparison between the predicted and observed stability issues and by reviewing the outcome of analytical methods (Rocscience slope stability software suite) against the expected stability determined from these rock classifications. This exercise will help validate the findings and correlations arising from the two empirical methods in order to adjust them to the nature of this specific kind of rock mass and provide a better understanding of the long-term stability of the slopes studied.
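
A minimal sketch of the Q-slope calculation and the associated empirical steepest-stable-angle relation (beta = 20*log10(Q-slope) + 65 degrees, after Bar and Barton); the input ratings below are assumptions for illustration, not values measured on the Pakiri slopes.

```python
import math

def q_slope(rqd, jn, jr, ja, o_factor, jwice, srf_slope):
    """Q-slope value: (RQD/Jn) * (Jr/Ja)_O * (Jwice/SRF_slope), where the
    O-factor adjusts Jr/Ja for discontinuity orientation."""
    return (rqd / jn) * (jr / ja) * o_factor * (jwice / srf_slope)

def steepest_stable_angle(q):
    """Empirical steepest unsupported slope angle in degrees."""
    return 20.0 * math.log10(q) + 65.0

# Illustrative input ratings for an interbedded sandstone/siltstone cut (assumed values).
q = q_slope(rqd=55, jn=9, jr=1.5, ja=2, o_factor=0.75, jwice=0.7, srf_slope=2.5)
print(f"Q-slope = {q:.2f}, steepest stable angle = {steepest_stable_angle(q):.0f} degrees")
```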

Keywords: Pakiri formation, Q-slope, rock slope stability, SSAM, weak rock

Procedia PDF Downloads 197
26355 Application of Knowledge Discovery in Database Techniques in Cost Overruns of Construction Projects

Authors: Mai Ghazal, Ahmed Hammad

Abstract:

Cost overruns in construction projects are considered a worldwide challenge, since cost performance is one of the main measures of success along with schedule performance. To overcome this problem, studies have been conducted to investigate the factors causing cost overruns, and projects' historical data have been analyzed to extract new and useful knowledge from them. This research studies and analyzes the effect of some factors causing cost overruns using historical data from completed construction projects, and then uses these factors to estimate the probability of cost overrun occurrence and predict its percentage for future projects. First, an intensive literature review was done to study all the factors that cause cost overruns in construction projects; another review then covered previous research papers on data mining processes for dealing with cost overruns. Second, a proposed data warehouse was structured, which organizations can use to store their future data in a well-organized way so it can be easily analyzed later. Third, twelve quantitative factors whose data are frequently available in construction projects were selected as the analyzed factors and suggested predictors for the proposed model.
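
A minimal sketch of how the probability of cost overrun occurrence could be estimated from historical projects with a logistic regression (scikit-learn assumed); the three factors and the synthetic data are illustrative stand-ins for the twelve factors selected in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic historical projects with three hypothetical quantitative factors
# (contract value, planned duration, number of change orders).
rng = np.random.default_rng(7)
n = 400
X = np.column_stack([
    rng.lognormal(mean=2.5, sigma=0.6, size=n),   # contract value (M$)
    rng.integers(6, 48, size=n),                  # planned duration (months)
    rng.poisson(3, size=n),                       # change orders
])
logit = -3.0 + 0.02 * X[:, 0] + 0.03 * X[:, 1] + 0.5 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = cost overrun occurred

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
proba = model.predict_proba(X_te)[:, 1]
print(f"AUC = {roc_auc_score(y_te, proba):.3f}")
print(f"estimated overrun probability for a new project: "
      f"{model.predict_proba([[20.0, 24, 5]])[0, 1]:.2f}")
```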

Keywords: construction management, construction projects, cost overrun, cost performance, data mining, data warehousing, knowledge discovery, knowledge management

Procedia PDF Downloads 354
26354 Analysis of Formyl Peptide Receptor 1 Protein Value as an Indicator of Neutrophil Chemotaxis Dysfunction in Aggressive Periodontitis

Authors: Prajna Metta, Yanti Rusyanti, Nunung Rusminah, Bremmy Laksono

Abstract:

A decrease in neutrophil chemotaxis function may cause increased susceptibility to aggressive periodontitis (AP). Neutrophil chemotaxis is affected by formyl peptide receptor 1 (FPR1), which when activated will respond to the bacterial chemotactic peptide formyl methionyl leucyl phenylalanine (FMLP). FPR1 protein value is decreased in response to a wide number of inflammatory stimuli in AP patients. This study aimed to assess the alteration of FPR1 protein value in AP patients and whether FPR1 protein value could be used as an indicator of neutrophil chemotaxis dysfunction in AP. This is a case-control study with 20 AP patients and 20 control subjects. Three milliliters of peripheral blood were drawn and analyzed for FPR1 protein value with ELISA. The data were statistically analyzed with the Mann-Whitney test (p > 0.05). Results showed that the mean FPR1 protein value in the AP group is 0.353 pg/mL (0.11 to 1.18 pg/mL) and in the control group is 0.296 pg/mL (0.05 to 0.88 pg/mL). The p-value of 0.787 (> 0.05) suggested that there is no significant difference in FPR1 protein value between the two groups. The present study suggests that FPR1 protein value shows no significant alteration in AP patients and could not be used as an indicator of neutrophil chemotaxis dysfunction.
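
A minimal sketch of the Mann-Whitney comparison with SciPy; the individual FPR1 values are invented for illustration, since only group means and ranges are reported in the abstract.

```python
from scipy.stats import mannwhitneyu

# Illustrative FPR1 values (pg/mL) for the two groups (invented individual values).
ap_group = [0.11, 0.22, 0.30, 0.35, 0.41, 0.55, 0.62, 0.80, 1.18, 0.28]
control_group = [0.05, 0.18, 0.25, 0.29, 0.33, 0.40, 0.52, 0.61, 0.88, 0.21]

stat, p_value = mannwhitneyu(ap_group, control_group, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
print("significant difference" if p_value < 0.05 else "no significant difference (as in the study)")
```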

Keywords: aggressive periodontitis, chemotaxis dysfunction, FPR1 protein value, neutrophil

Procedia PDF Downloads 203
26353 Sampling Error and Its Implication for Capture Fisheries Management in Ghana

Authors: Temiloluwa J. Akinyemi, Denis W. Aheto, Wisdom Akpalu

Abstract:

Capture fisheries in developing countries provide significant animal protein and directly support the livelihoods of several communities. However, the misperception of biophysical dynamics owing to a lack of adequate scientific data has contributed to suboptimal management of marine capture fisheries. This is because yield and catch potentials are sensitive to the quality of catch and effort data. Yet, studies on fisheries data collection practices in developing countries are hard to find. This study investigates the data collection methods utilized by fisheries technical officers within the four fishing regions of Ghana. We found that the officers employed data collection and sampling procedures which were not consistent with the technical guidelines curated by the FAO. For example, 50 instead of 166 landing sites were sampled, while 290 instead of 372 canoes were sampled. We argue that such sampling errors could result in the over-capitalization of capture fish stocks and significant losses in resource rents.
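
A minimal sketch of how under-sampling landing sites widens the spread of a simple raising-factor estimate of total landings; the landings values and the number of replications are assumptions for illustration.

```python
import random

random.seed(3)

# Hypothetical fishery: true daily landings (kg) at 166 landing sites.
true_landings = [random.gauss(800, 300) for _ in range(166)]
true_total = sum(true_landings)

def estimate_total(sampled_sites):
    """Raise the sample mean to all 166 sites (simple raising-factor estimator)."""
    sample = random.sample(true_landings, sampled_sites)
    return 166 * sum(sample) / len(sample)

# Compare full coverage with the observed coverage (50 of 166 sites): the spread
# of the estimate widens when fewer sites are enumerated.
for n_sites in (166, 100, 50):
    estimates = [estimate_total(n_sites) for _ in range(2000)]
    spread = (max(estimates) - min(estimates)) / true_total * 100
    print(f"{n_sites:3d} sites sampled -> estimate spread of about {spread:.1f}% of the true total")
```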

Keywords: fisheries data quality, fisheries management, Ghana, sustainable fisheries

Procedia PDF Downloads 76
26352 Influence of Servant Leadership on Faculty Retention in Higher Education Institutes: Mediating Role of Job Satisfaction

Authors: Aneela Sheikh

Abstract:

Private higher education institutes are challenged for their resilience and competitive edge in the globalized knowledge-based economy of the 21st century. Faculty retention plays an important role as a catalyst for addressing the current mega-developmental phenomenon faced by higher education institutes in developing countries. This study explores the influence of servant leadership practice on faculty retention, with job satisfaction as a mediating variable, towards minimizing the high faculty turnover in private higher education institutes. A sample of 341 faculty members from ten private higher education institutes in Lahore city, Pakistan, was selected through a stratified proportionate random sampling technique. A descriptive survey research approach was employed to collect data from the 341 faculty members by administering a close-ended questionnaire based on a seven-point Likert scale as a self-administered research instrument. The study was conducted under the domain of the Leader-Member Exchange (LMX) theory. The mediating role of job satisfaction was measured by the bootstrapping technique. The results revealed that servant leadership has a statistically significant influence on faculty retention, with a statistically significant mediating role of job satisfaction, in private higher education institutes in Pakistan. Further, to the best of the authors' knowledge, this is the first systematic and empirical study on faculty retention conducted against the backdrop of servant leadership in an Eastern context, particularly in Pakistan.

Keywords: servant leadership, faculty retention, job satisfaction, higher education institutes

Procedia PDF Downloads 64
26351 Rational Bureaucracy and E-Government: A Philosophical Study of Universality of E-Government

Authors: Akbar Jamali

Abstract:

Hegel is the first great political philosopher who specifically contemplates bureaucracy. For Hegel, bureaucracy is the function of the state. Since the state is essentially a rational organization, its function, namely bureaucracy, must be rational. And since what is rational is universal, Hegel had to explain how bureaucracy could be understood as universal. Hegel discusses bureaucracy in his treatment of ‘executive power’. He analyses modern bureaucracy as a form of political organization, its constituent members, and its relation to the social environment. Therefore, the essence of bureaucracy in Hegel’s philosophy is the implementation of law and rules. Hegel argues that unlike the other social classes, which are particular because they look to their own private interest, bureaucracy as a class is universal because its orientation is the interest of the state. The state for Hegel is essentially rational and universal; it is the actualization of ‘objective Spirit’. Marx criticizes Hegel’s argument on the universality of the state and bureaucracy. For Marx, the state is equal to bureaucracy; it constitutes a social class based on the interest of the bourgeois class that dominates society and exploits the proletarian class. Therefore, the main disagreement between these political philosophers is whether the state (bureaucracy) is universal or particular. The growth of e-government in the modern state, as an important aspect of development, leads us to contemplate the particularity and universality of e-government. In this article, we argue that e-government is essentially universal. E-government, in itself, is impartial; therefore, it cannot be particular. The development of e-government eliminates many side effects of the private, personal, or particular interests of the individuals who work in the bureaucracy. Finally, we argue that the more a state is developed, the more universal it is. Therefore, the development of e-government makes the state more universal and affects the modern philosophical debate on the particularity or universality of bureaucracy and the state.

Keywords: particularity, universality, rational bureaucracy, impartiality

Procedia PDF Downloads 230
26350 Comparing Perceived Restorativeness in Natural and Urban Environment: A Meta-Analysis

Authors: Elisa Menardo, Margherita Pasini, Margherita Brondino

Abstract:

A growing body of empirical research from different areas of inquiry suggests that brief contact with natural environments restores mental resources. Attention Restoration Theory (ART) is the most widespread and empirically founded theory developed to explain why exposure to nature helps people recover cognitive resources. It assumes that contact with nature allows people to free (and then recover) voluntary attention resources and thus allows them to recover from a situation of cognitive fatigue. However, it has been suggested that some people could obtain more cognitive benefit after exposure to urban environments. The objective of this study is to report the results of a meta-analysis of studies (peer-reviewed articles) comparing the restorativeness (the quality of being restorative) perceived in natural environments with that perceived in urban environments. This meta-analysis intended to estimate how much natural environments (forests, parks, boulevards) are perceived to be more restorative than urban ones (i.e., the magnitude of the difference in perceived restorativeness). Moreover, given the methodological differences between studies, it studied the potential moderating role of variables such as participants (students or others), instrument used (Perceived Restorativeness Scale or other), and procedure (in laboratory or in situ). The PsycINFO, PsycARTICLES, Scopus, SpringerLINK, and Web of Science online databases were used to identify all peer-reviewed articles on restorativeness published to date (k = 167). The reference sections of the obtained papers were examined for additional studies. Only 22 independent studies (with a total of 1371 participants) met the inclusion criteria (direct exposure to the environment, comparison between one outdoor environment with natural elements and one without, and restorativeness measured by a self-report scale) and were included in the meta-analysis. To estimate the average effect size, a random-effects model (restricted maximum-likelihood estimator) was used, because the studies included in the meta-analysis were conducted independently, using different methods in different populations, so no common effect size was expected. The presence of publication bias was checked using the trim-and-fill approach. Univariate moderator analyses (mixed-effects model) were run to determine whether the coded variables moderated the difference in perceived restorativeness. Results show that natural environments are perceived to be more restorative than urban environments, confirming from an empirical point of view what is now considered established knowledge in environmental psychology. The relevant information emerging from this study is the magnitude of the estimated average effect size, which is particularly high (d = 1.99) compared to those commonly observed in psychology. Significant heterogeneity between studies was found (Q(19) = 503.16, p < 0.001), and the variability between studies was very high (I² [C.I.] = 96.97% [94.61–98.62]). Subsequent univariate moderator analyses were not significant: the methodological differences coded (participants, instrument, and procedure) did not explain the variability between studies. Other methodological differences (e.g., research design, environment characteristics, light conditions) could explain this variability. Alternatively, the variability between studies could be due not to methodological differences but to individual differences (age, gender, education level) and characteristics (connection to nature, environmental attitude). Further moderator analyses are in progress.
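
A minimal sketch of a random-effects pooling of standardized mean differences; the effect sizes and sample sizes are invented for illustration, and the closed-form DerSimonian-Laird estimator of tau-squared is used here instead of the REML estimator applied in the study.

```python
import numpy as np

# Illustrative per-study standardized mean differences (d) and sample sizes per arm;
# the 22 real effect sizes are not listed in the abstract.
d = np.array([2.4, 1.1, 0.8, 2.9, 1.7, 2.2, 0.5, 3.1])
n1 = n2 = np.array([30, 25, 40, 20, 35, 28, 50, 22])

# Variance of Cohen's d and inverse-variance weights.
var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
w = 1.0 / var_d

# DerSimonian-Laird tau^2 (closed form), heterogeneity Q and I^2.
fixed_mean = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - fixed_mean) ** 2)
df = len(d) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
I2 = max(0.0, (Q - df) / Q) * 100

# Random-effects pooled estimate and 95% confidence interval.
w_star = 1.0 / (var_d + tau2)
pooled = np.sum(w_star * d) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled d = {pooled:.2f} [{pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f}], "
      f"Q({df}) = {Q:.1f}, I^2 = {I2:.1f}%")
```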

Keywords: meta-analysis, natural environments, perceived restorativeness, urban environments

Procedia PDF Downloads 154
26349 Integrating Service Learning into a Business Analytics Course: A Comparative Investigation

Authors: Gokhan Egilmez, Erika Hatfield, Julie Turner

Abstract:

In this study, we investigated the impacts of service-learning integration on an undergraduate-level business analytics course from multiple perspectives, including academic proficiency, community awareness, engagement, social responsibility, and reflection. We assessed the impact of the service-learning experience using a survey developed primarily from the literature review and secondarily with input from an ad hoc group of researchers. We then implemented the survey in two sections, one of which served as a control group, and compared the results of the empirical survey visually and statistically.

Keywords: business analytics, service learning, experiential education, statistical analysis, survey research

Procedia PDF Downloads 95
26348 Antiviral Activity of Interleukin-11 in Response to Porcine Epidemic Diarrhea Virus Infection

Authors: Li Yuchen, Wu Qingxin, Jin Yuxing, Yang Qian

Abstract:

Interleukin-11 (IL-11), a well-known anti-inflammatory factor, helps to protect against intestinal epithelium damage caused by physical or chemical factors. However, little is known about the role of IL-11 during viral infection. Herein, high mRNA and protein levels of IL-11 were found in epithelial cells and the jejunum of piglets during porcine epidemic diarrhea virus (PEDV) infection, and IL-11 expression was positively correlated with the level of viral infection. Pretreatment with recombinant porcine IL-11 (pIL-11) suppressed PEDV replication in Vero E6 cells, while IL-11 knockdown promoted viral infection. Furthermore, pIL-11 inhibited viral infection by preventing PEDV-mediated apoptosis of cells through activating the IL-11/STAT3 signaling pathway. Conversely, application of a STAT3 phosphorylation inhibitor significantly antagonized the anti-apoptosis function of pIL-11 and counteracted its inhibition of PEDV. Our data suggest that IL-11 is a novel PEDV-inducible cytokine, and its production enhances the anti-apoptosis ability of epithelial cells against PEDV infection. The potential use of IL-11 as a novel therapeutic against devastating viral diarrhea in piglets deserves more attention and study.

Keywords: Interleukin-11, Porcine epidemic diarrhea virus, STAT3, anti-apoptosis

Procedia PDF Downloads 121
26347 Improvement of Data Transfer over Simple Object Access Protocol (SOAP)

Authors: Khaled Ahmed Kadouh, Kamal Ali Albashiri

Abstract:

This paper presents an algorithm designed to improve the transfer of data over the Simple Object Access Protocol (SOAP). The aim of this work is to establish whether using SOAP in exchanging XML messages has any added advantages or not. The results showed that XML messages without SOAP take longer and consume more memory, especially with binary data.
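
A minimal sketch of how the byte-level overhead of a SOAP envelope around an XML message carrying base64-encoded binary data can be measured; the element names and payload are illustrative, and the timing and memory measurements reported in the paper are not reproduced here.

```python
import base64

# A raw XML message and the same message wrapped in a SOAP 1.1 envelope,
# to compare payload sizes (element names and payload are illustrative).
binary_payload = base64.b64encode(b"\x00\x01" * 5000).decode()
raw_xml = f"<transfer><id>42</id><data>{binary_payload}</data></transfer>"

soap_envelope = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Header/>"
    f"<soap:Body>{raw_xml}</soap:Body>"
    "</soap:Envelope>"
)

overhead = len(soap_envelope) - len(raw_xml)
print(f"raw XML: {len(raw_xml)} bytes, SOAP: {len(soap_envelope)} bytes, "
      f"envelope overhead: {overhead} bytes ({overhead / len(raw_xml):.2%})")
```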

Keywords: JAX-WS, SMTP, SOAP, web service, XML

Procedia PDF Downloads 484
26346 A Study on Computational Fluid Dynamics (CFD)-Based Design Optimization Techniques Using Multi-Objective Evolutionary Algorithms (MOEA)

Authors: Ahmed E. Hodaib, Mohamed A. Hashem

Abstract:

In engineering applications, a design has to be as close to perfect as possible for some defined case. The designer has to overcome many challenges in order to reach the optimal solution to a specific problem. This process is called optimization. Generally, there is always a function called the “objective function” that is required to be maximized or minimized by choosing input parameters called “degrees of freedom” within an allowed domain called the “search space” and computing the values of the objective function for these input values. It becomes more complex when we have more than one objective for our design. As an example of a Multi-Objective Optimization Problem (MOP), consider a structural design that aims to minimize weight and maximize strength. In such a case, the Pareto Optimal Frontier (POF) is used, which is a curve plotting the two objective functions for the best cases. At this point, a designer should make a decision and choose a point on the curve. Engineers use algorithms or iterative methods for optimization. In this paper, we discuss Evolutionary Algorithms (EA), which are widely used with multi-objective optimization problems due to their robustness, simplicity, and suitability for being coupled and parallelized. Evolutionary algorithms are designed to converge towards an optimal solution. An EA uses mechanisms inspired by Darwinian evolution principles. Technically, they belong to the family of trial-and-error problem solvers and can be considered global optimization methods with a stochastic optimization character. The optimization is initialized by picking random solutions from the search space, and then the solution progresses towards the optimal point by using operators such as selection, combination, cross-over and/or mutation. These operators are applied to the old solutions, “parents”, so that new sets of design variables called “children” appear. The process is repeated until the optimal solution to the problem is reached. Reliable and robust computational fluid dynamics solvers are nowadays commonly utilized in the design and analyses of various engineering systems, such as aircraft, turbo-machinery, and automotive vehicles. The coupling of Computational Fluid Dynamics (CFD) and Multi-Objective Evolutionary Algorithms (MOEA) has become substantial in aerospace engineering applications, such as aerodynamic shape optimization and advanced turbo-machinery design.
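
A minimal sketch of the evolutionary loop described above (initialization, selection of non-dominated designs, blend crossover, Gaussian mutation) on a toy two-objective problem; the objective functions and parameters are assumptions for illustration, not a CFD-coupled MOEA.

```python
import random

random.seed(0)

# Two competing objectives on a single design variable x in [0, 1]:
# f1 = x^2 (e.g. "weight"), f2 = (x - 1)^2 (e.g. inverse "strength"). Both are minimized.
def objectives(x):
    return x**2, (x - 1.0)**2

def dominates(a, b):
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def pareto_front(pop):
    scored = [(x, objectives(x)) for x in pop]
    return [x for x, fx in scored if not any(dominates(fy, fx) for _, fy in scored)]

# Initialization: random solutions picked from the search space.
pop = [random.random() for _ in range(40)]

for _ in range(50):                                   # generations
    parents = pareto_front(pop) or pop                # selection: keep non-dominated designs
    children = []
    while len(children) < len(pop):
        p1, p2 = random.sample(parents, 2) if len(parents) > 1 else (parents[0], parents[0])
        child = 0.5 * (p1 + p2)                       # crossover (blend)
        child += random.gauss(0.0, 0.05)              # mutation
        children.append(min(max(child, 0.0), 1.0))    # keep inside the search space
    pop = children

front = sorted(pareto_front(pop))
print("approximate Pareto-optimal designs:", [round(x, 2) for x in front[:10]])
```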

Keywords: mathematical optimization, multi-objective evolutionary algorithms "MOEA", computational fluid dynamics "CFD", aerodynamic shape optimization

Procedia PDF Downloads 245
26345 Enhancing Healthcare Data Protection and Security

Authors: Joseph Udofia, Isaac Olufadewa

Abstract:

Every day, the size of Electronic Health Record data keeps increasing as new patients visit health practitioners and returning patients fulfil their appointments. As these data grow, so does their susceptibility to cyber-attacks from criminals waiting to exploit them. In the US, the damages from cyberattacks were estimated at $8 billion (2018), $11.5 billion (2019) and $20 billion (2021). These attacks usually involve the exposure of PII. Health data is considered PII, and its exposure carries significant impact. To this end, an enhancement of health policy and standards in relation to data security, especially among patients and their clinical providers, is critical to ensure ethical practices, confidentiality, and trust in the healthcare system. As clinical accelerators and applications that contain user data come into use, it is expedient to review and revamp policies like the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and the Fast Healthcare Interoperability Resources (FHIR) standard, all aimed at ensuring data protection and security in healthcare. FHIR caters to healthcare data interoperability, as data is shared across different systems, from customers to health insurers and care providers. The astronomical cost of implementation has deterred players in the space from ensuring compliance, leading to susceptibility to data exfiltration and data loss. Though HIPAA focuses on the security of protected health information (PHI) and PCI DSS on the security of payment card data, they intersect in the shared goal of protecting sensitive information in line with industry standards. With advancements in technology and the emergence of new technologies, it is necessary to revamp these policies to address their complexity and ambiguity, the cost barrier, and the ever-increasing threats in cyberspace. Healthcare data in the wrong hands is a recipe for disaster, and we must enhance its protection and security to protect the mental health of the current and future generations.

Keywords: cloud security, healthcare, cybersecurity, policy and standard

Procedia PDF Downloads 70
26344 The Effect of Particulate Matter on Cardiomyocyte Apoptosis Through Mitochondrial Fission

Authors: Tsai-chun Lai, Szu-ju Fu, Tzu-lin Lee, Yuh-Lien Chen

Abstract:

There is much evidence that exposure to fine particulate matter (PM) from air pollution increases the risk of cardiovascular morbidity and mortality. According to previous reports, PM in the air enters the respiratory tract, contacts the alveoli, and enters the blood circulation, leading to the progression of cardiovascular disease. PM pollution may also lead to cardiometabolic disturbances, increasing the risk of cardiovascular disease. The effects of PM on cardiac function and mitochondrial damage are currently unknown. We used mice and rat cardiomyocytes (H9c2) as animal and in vitro cell models, respectively, to simulate an air pollution environment using PM. The results indicate that the apoptosis-related factor PUMA, a regulator of apoptosis upregulated by p53, is increased in mice treated with PM. Apoptosis was aggravated in cardiomyocytes treated with PM, as measured by TUNEL assay and Annexin V/PI staining. Western blot results showed that CASPASE3 was significantly increased and BCL2 (B-cell lymphoma 2) was significantly decreased under PM treatment. Exposure to PM also increased mitochondrial reactive oxygen species (ROS) production, as shown by MitoSOX Red staining. Furthermore, Mitotracker staining showed that PM treatment significantly shortened mitochondrial length, indicating mitochondrial fission. The expression of the mitochondrial fission-related proteins p-DRP1 (phosphorylated dynamin-related protein 1) and FIS1 (mitochondrial fission 1 protein) was significantly increased. Based on these results, exposure to PM worsens mitochondrial function and leads to cardiomyocyte apoptosis.

Keywords: particulate matter, cardiomyocyte, apoptosis, mitochondria

Procedia PDF Downloads 87
26343 Public Service Ethics in Public Administration: An Empirical Investigation

Authors: Kalsoom Sumra

Abstract:

The increasing concern with public sector reforms brings new challenges to public service ethics in developing countries, not only at the central level but also at the local level. This paper aims to identify the perceptions of public officials regarding public service ethics and examines more generally the understanding of public servants in Pakistan towards public service ethics in local public organizations. The study uses an independently administered structured questionnaire to collect data on the extent of the recognition of public service ethics in local organizations. A total of 150 completed questionnaires, received from public servants working at the local level in Pakistan, were analyzed. The analysis explores how traditional and social patterns and cultural ethics can provide a rounded picture of the main antecedents and moderators of public service ethics in Pakistan. Moreover, the findings of this study contribute to the understanding of public service ethics within the ongoing political and administrative culture of Pakistan, which is at the core of the public organizational ethical climate. This study also has numerous implications for local public administration, and it highlights the importance of expanding the research agenda on public service ethics in developing settings with challenging institutional contexts and imperfect training and operating environments. This study may well be particularly important for the practice of public service ethics in public administration in developing countries. To the best of the author's knowledge, this study is the first of its kind to provide an initial step towards practical implications, emphasizing relevant public service ethics in public administration for developing transparent and accountable organizations.

Keywords: public service ethics, accountability and transparency, public service reforms, public administration, organizational ethical climate

Procedia PDF Downloads 333
26342 An Empirical Investigation of Uncertainty and the Lumpy Investment Channel of Monetary Policy

Authors: Min Fang, Jiaxi Yang

Abstract:

Monetary policy could be less effective at stimulating investment during periods of elevated volatility than during normal times. In this paper, we argue that elevated volatility leads to a decrease in the extensive-margin investment incentive, so that nominal stimulus generates less aggregate investment. To do this, we first empirically document that high volatility weakens firms' investment responses to monetary stimulus and that such effects depend on the lumpy nature of firm-level investment. The findings are that the channel exists for physical investment, innovation investment, and organizational investment alike.

Keywords: investment, irreversibility, volatility, uncertainty, firm heterogeneity, monetary policy

Procedia PDF Downloads 89