Search results for: distributed generators
1642 Internet of Assets: A Blockchain-Inspired Academic Program
Authors: Benjamin Arazi
Abstract:
Blockchain is the technology behind cryptocurrencies like Bitcoin. It revolutionizes the meaning of trust in the sense of offering total reliability without relying on any central entity that controls or supervises the system. The Wall Street Journal states: “Blockchain Marks the Next Step in the Internet’s Evolution”. Blockchain was listed as #1 in the LinkedIn Learning Blog’s “most in-demand hard skills needed in 2020”. As stated there: “Blockchain’s novel way to store, validate, authorize, and move data across the internet has evolved to securely store and send any digital asset”. GSMA, a leading Telco organization of mobile communications operators, declared that “Blockchain has the potential to be for value what the Internet has been for information”. Motivated by these seminal observations, this paper presents the foundations of a Blockchain-based “Internet of Assets” academic program that joins under one roof leading application areas that are characterized by the transfer of assets over communication lines. Two such areas, which are pillars of our economy, are Fintech – Financial Technology and mobile communications services. The next application in line is Healthcare. These challenges are met based on extensive available professional literature. Blockchain-based asset communication extends the principle of Bitcoin, starting with the basic question: if digital money that travels across the universe can ‘prove its own validity’, can this principle be applied to digital content? A groundbreaking positive answer here led to the concept of the “smart contract” and consequently to DLT - Distributed Ledger Technology, where the word ‘distributed’ refers to the non-existence of reliable central entities or trusted third parties. The terms Blockchain and DLT are frequently used interchangeably in various application areas. The World Bank Group has compiled comprehensive reports analyzing the contribution of DLT/Blockchain to Fintech.
The European Central Bank and Bank of Japan are engaged in Project Stella, “Balancing confidentiality and auditability in a distributed ledger environment”. 130 DLT/Blockchain-focused Fintech startups are now operating in Switzerland. The impact of Blockchain on mobile communications services is treated in detail by leading organizations. The TM Forum is a global industry association in telecommunications, with over 850 member companies, mainly mobile operators, that generate US$2 trillion in revenue and serve five billion customers across 180 countries. From their perspective: “Blockchain is considered one of the digital economy’s most disruptive technologies”. Samples of Blockchain contributions to Fintech (taken from a World Bank document): decentralization and disintermediation; greater transparency and easier auditability; automation and programmability; immutability and verifiability; gains in speed and efficiency; cost reductions; enhanced cyber security resilience. Samples of Blockchain contributions to the Telco industry: establishing identity verification; records of transactions for easy cost settlement; automatic triggering of roaming contracts, enabling near-instantaneous charging and a reduction in roaming fraud; decentralized roaming agreements; settling accounts for costs incurred in accordance with agreed tariffs. This clearly motivates an academic education structure where fundamental technologies are studied in classes together with these two application areas. Advanced courses treating specific implementations then follow separately. All are under the roof of the “Internet of Assets”.
Keywords: blockchain, education, financial technology, mobile telecommunications services
Procedia PDF Downloads 180
1641 Microstructure and Mechanical Properties of Low Alloy Steel with Double Austenitizing Tempering Heat Treatment
Authors: Jae-Ho Jang, Jung-Soo Kim, Byung-Jun Kim, Dae-Geun Nam, Uoo-Chang Jung, Yoon-Suk Choi
Abstract:
Low alloy steels are widely used for pressure vessels, spent fuel storage, and steam generators in nuclear power plants, components that must withstand internal pressure and prevent unexpected failure, and that may suffer embrittlement from high levels of radiation and heat over long periods. It is therefore important to improve the mechanical properties of low alloy steels at an early stage of fabrication to ensure the integrity of these structural materials. Recent work showed that a double austenitizing and tempering (DAT) process results in a significant improvement of strength and toughness through refinement of prior austenite grains. In this study, the mechanism by which the second full austenitizing temperature of the DAT process changes the microstructure and improves the mechanical properties of low alloy steel was investigated. Compared to the conventional single austenitizing and tempering (SAT) process, tensile elongation improved by about 5%, the ductile-brittle transition temperature (DBTT) was reduced by about 65℃, and grain size decreased by about 50% under the DAT process conditions. Grain refinement interferes with crack propagation by increasing the grain-boundary area and the amount of energy absorbed at low temperatures. The higher the first austenitizing temperature in the DAT process, the more spheroidized carbides form and the stronger the strengthening effect of fine precipitates in the ferrite grains. The area ratio of dimples in the transition region increased in proportion to the effect of spheroidized carbides. These may be the primary mechanisms that improve low-temperature toughness and elongation while maintaining similar hardness and strength.
Keywords: double austenitizing, ductile-brittle transition temperature, grain refinement, heat treatment, low alloy steel, low-temperature toughness
Procedia PDF Downloads 510
1640 The Postcognitivist Era in Cognitive Psychology
Authors: C. Jameke
Abstract:
During the cognitivist era in cognitive psychology, a theory of internal rules and symbolic representations was posited as an account of human cognition. This type of cognitive architecture had its heyday during the 1970s and 80s, but it has now been largely abandoned in favour of subsymbolic architectures (e.g. connectionism), non-representational frameworks (e.g. dynamical systems theory), and statistical approaches such as Bayesian theory. In this presentation I describe this changing landscape of research and comment on the increasing influence of neuroscience on cognitive psychology. I then briefly review a few recent developments in connectionism and neurocomputation relevant to cognitive psychology, and critically discuss the assumption made by some researchers in these frameworks that higher-level aspects of human cognition are simply emergent properties of massively large distributed neural networks.
Keywords: connectionism, emergentism, postcognitivist, representations, subsymbolic architecture
Procedia PDF Downloads 578
1639 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Authors: Ece Cigdem Mutlu, Burak Alakent
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of process dispersion and location parameters and detection of perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor the quality of products and control process mean and variability. In the application of Xbar control charts, sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location parameters, respectively, based on the assumption of independent and normally distributed datasets. On the other hand, there is no guarantee that real-world data would be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g. occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, it is required to use estimators that are robust against contaminations, which may exist in Phase I. In the current study, we present a simple approach to construct robust Xbar control charts using average distance to the median, the Qn-estimator of scale, and the M-estimator of scale with logistic psi-function in the estimation of the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator and the M-estimator of location with Huber and logistic psi-functions in the estimation of the process location parameter.
Phase I efficiency of proposed estimators and Phase II performance of Xbar charts constructed from these estimators are compared with the conventional mean and standard deviation statistics both under normality and against diffuse-localized and symmetric-asymmetric contaminations using 50,000 Monte Carlo simulations on MATLAB. Consequently, it is found that robust estimators yield parameter estimates with higher efficiency against all types of contaminations, and Xbar charts constructed using robust estimators have higher power in detecting disturbances, compared to conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combination of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
Keywords: average run length, M-estimators, quality control, robust estimators
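The contrast between conventional and robust chart construction can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' method: it uses the median of subgroup medians and a scaled MAD as simple robust stand-ins for the Qn- and M-type estimators studied in the paper, and the Phase I data are made up for the example.

```python
import statistics

def xbar_limits_conventional(subgroups, L=3):
    """Center line and control limits from the pooled subgroup mean and the
    average subgroup standard deviation (the conventional estimators)."""
    means = [statistics.mean(s) for s in subgroups]
    center = statistics.mean(means)
    sbar = statistics.mean(statistics.stdev(s) for s in subgroups)
    sigma_xbar = sbar / len(subgroups[0]) ** 0.5  # c4 bias correction omitted for brevity
    return center, center - L * sigma_xbar, center + L * sigma_xbar

def xbar_limits_robust(subgroups, L=3):
    """Robust variant: median of subgroup medians, with a scaled MAD standing
    in for the Qn- and M-type scale estimators studied in the paper."""
    medians = [statistics.median(s) for s in subgroups]
    center = statistics.median(medians)
    sigma_xbar = 1.4826 * statistics.median(abs(m - center) for m in medians)
    return center, center - L * sigma_xbar, center + L * sigma_xbar

# Hypothetical Phase I data: nine clean subgroups and one gross outlier subgroup
clean = [[x + 0.01 * i for x in (9.9, 10.1, 10.0, 10.2, 9.8)] for i in range(9)]
data = clean + [[30.0, 31.0, 29.0, 30.0, 30.0]]

c_center, _, _ = xbar_limits_conventional(data)
r_center, _, _ = xbar_limits_robust(data)
```

On this contaminated data set the conventional center line is pulled up to about 12.04, while the robust center line stays near the true process level of 10, which is the qualitative effect the study quantifies.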
Procedia PDF Downloads 190
1638 Working Mode and Key Technology of Thermal Vacuum Test Software for Spacecraft Test
Authors: Zhang Lei, Zhan Haiyang, Gu Miao
Abstract:
A universal software platform was developed to remedy the defects of the existing, purpose-built software. This software platform has distinct advantages in modularization, information management, and its interfaces. Several technologies, such as computer technology, virtualization technology, and network technology, are combined in this software platform, and four working modes are introduced in this article: single mode, distributed mode, cloud mode, and centralized mode. The application area of the software platform is extended through switching between these working modes. The software platform can arrange the thermal vacuum test process automatically, which improves the reliability of thermal vacuum tests.
Keywords: software platform, thermal vacuum test, control and measurement, work mode
Procedia PDF Downloads 414
1637 Volunteered Geographic Information Coupled with Wildfire Progression Maps: A Spatial and Temporal Tool for Incident Storytelling
Authors: Cassandra Hansen, Paul Doherty, Chris Ferner, German Whitley, Holly Torpey
Abstract:
Wildfire is a natural and inevitable occurrence, yet changing climatic conditions have increased the severity, frequency, and risk to human populations in the wildland/urban interface (WUI) of the Western United States. Rapid dissemination of accurate wildfire information is critical to both the Incident Management Team (IMT) and the affected community. With the advent of increasingly sophisticated information systems, GIS can now be used as a web platform for sharing geographic information in new and innovative ways, such as virtual story map applications. Crowdsourced information can be extraordinarily useful when coupled with authoritative information. Information abounds in the form of social media, emergency alerts, radio, and news outlets, yet many of these resources lack a spatial component when first distributed. In this study, we describe how twenty-eight volunteer GIS professionals across nine Geographic Area Coordination Centers (GACC) sourced, curated, and distributed Volunteered Geographic Information (VGI) from authoritative social media accounts focused on disseminating information about wildfires and public safety. The combination of fire progression maps with VGI incident information helps answer three critical questions about an incident: where the fire started, how and why it behaved in an extreme manner, and how we can learn from the incident's story to respond and prepare for future fires in the area. By adding a spatial component to that shared information, this team has been able to visualize shared information about wildfire starts in an interactive map that answers these questions in a more intuitive way. Additionally, long-term social and technical impacts on communities are examined in relation to situational awareness of the disaster through map layers and agency links, the number of views in a particular region of a disaster, community involvement and sharing of this critical resource.
Combined with a GIS platform and disaster VGI applications, this workflow and information become invaluable to communities within the WUI and bring spatial awareness for disaster preparedness, response, mitigation, and recovery. This study highlights progression maps as the ultimate storytelling mechanism through incident case studies and demonstrates how VGI and sophisticated applied cartographic methodology make this an indispensable resource for authoritative information sharing.
Keywords: storytelling, wildfire progression maps, volunteered geographic information, spatial and temporal
Procedia PDF Downloads 176
1636 Sampling Effects on Secondary Voltage Control of Microgrids Based on Network of Multiagent
Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon
Abstract:
This paper studies a consensus-based secondary voltage control framework for microgrids over a multiagent communication network. The proposed control is designed for a communication network with one-way links, modeled by a directed graph. Here, sampling is considered as the communication constraint among the distributed generators in the microgrid. To analyze the sampling effects on the secondary voltage control of the microgrids, a sufficient condition for the problem is established in terms of linear matrix inequalities (LMIs), using Lyapunov theory and some mathematical techniques. Finally, simulation results are given to illustrate the necessity of considering sampling effects in the secondary voltage control of microgrids.
Keywords: microgrids, secondary control, multiagent, sampling, LMI
Procedia PDF Downloads 333
1635 Security of Database Using Chaotic Systems
Authors: Eman W. Boghdady, A. R. Shehata, M. A. Azem
Abstract:
Database (DB) security demands permitting authorized users' actions and prohibiting the actions of non-authorized users and intruders on the DB and the objects inside it. Organizations that are running successfully demand the confidentiality of their DBs. They do not allow unauthorized access to their data/information, and they demand assurance that their data is protected against any malicious or accidental modification. DB protection and confidentiality are the security concerns. There are four types of controls to obtain DB protection: access control, information flow control, inference control, and cryptographic control. The cryptographic control is considered the backbone of DB security; it secures the DB by encryption during storage and communications. Current cryptographic techniques are classified into two types: traditional classical cryptography using standard algorithms (DES, AES, IDEA, etc.) and chaos cryptography using continuous (Chua, Rössler, Lorenz, etc.) or discrete (logistic, Hénon, etc.) systems. The important characteristic of chaos is its extreme sensitivity to the initial conditions of the system. In this paper, DB-security systems based on chaotic algorithms are described. The pseudo-random number generators (PRNGs) derived from the different chaotic algorithms are implemented using Matlab, and their statistical properties are evaluated using NIST and other statistical test suites. Then, these algorithms are used to secure a conventional DB (plaintext), where the statistical properties of the ciphertext are also tested. To increase the complexity of the PRNGs and to pass all the NIST statistical tests, we propose two hybrid PRNGs: one based on two chaotic logistic maps and another based on two chaotic Hénon maps, where each chaotic algorithm runs side-by-side, starting from random independent initial conditions and parameters (the encryption keys).
The resulting hybrid PRNGs passed the NIST statistical test suite.
Keywords: algorithms and data structure, DB security, encryption, chaotic algorithms, Matlab, NIST
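The hybrid two-map construction can be sketched as follows. This is an illustrative toy, not the paper's implementation: it uses the logistic map x → rx(1−x) with r = 4, extracts one bit per iterate by thresholding at 0.5, and XORs two streams started from independent seeds; a real design would be validated with the NIST test suite as described above.

```python
def logistic_bits(x0, r=4.0, n=10000, burn_in=100):
    """Bits from the chaotic logistic map x -> r*x*(1-x), thresholded at 0.5."""
    x = x0
    for _ in range(burn_in):               # discard the transient
        x = r * x * (1.0 - x)
    bits = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

def hybrid_bits(key1, key2, n=10000):
    """Hybrid PRNG: XOR the streams of two logistic maps running side by side
    from independent initial conditions (the encryption keys)."""
    a = logistic_bits(key1, n=n)
    b = logistic_bits(key2, n=n)
    return [p ^ q for p, q in zip(a, b)]

stream = hybrid_bits(0.1234567, 0.7654321)
```

The extreme sensitivity to initial conditions means the seeds act as keys: a change in the fifteenth decimal place of either seed produces an entirely different bit stream.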
Procedia PDF Downloads 265
1634 Estimation of Train Operation Using an Exponential Smoothing Method
Authors: Taiyo Matsumura, Kuninori Takahashi, Takashi Ono
Abstract:
The purpose of this research is to improve the convenience of waiting for trains at level crossings and stations and to prevent accidents resulting from forcible entry into level crossings, by providing level crossing users and passengers with information that tells them when the next train will pass through or arrive. For this paper, we proposed methods for estimating operation by means of an average value method, variable response smoothing method, and exponential smoothing method, on the basis of open data, which has low accuracy, but for which performance schedules are distributed in real time. We then examined the accuracy of the estimations. The results showed that the application of an exponential smoothing method is valid.
Keywords: exponential smoothing method, open data, operation estimation, train schedule
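The exponential smoothing estimator at the heart of the preferred method can be sketched in a few lines; the travel-time data and smoothing constant below are illustrative, not from the study.

```python
def exponential_smoothing(observations, alpha=0.5):
    """Smoothed estimate s_t = alpha*x_t + (1-alpha)*s_{t-1},
    discounting older observations geometrically."""
    estimate = observations[0]
    for x in observations[1:]:
        estimate = alpha * x + (1.0 - alpha) * estimate
    return estimate

# Hypothetical inter-station travel times (minutes) from recent open data
travel_times = [2.0, 3.0, 5.0, 4.0]
next_estimate = exponential_smoothing(travel_times, alpha=0.5)  # 3.875
```

Larger values of alpha make the estimate react faster to recent delays, which is the trade-off such a method must tune against the noisy real-time open data.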
Procedia PDF Downloads 388
1633 Facilitating Familial Support of Saudi Arabians Living with HIV/AIDS
Authors: Noor Attar
Abstract:
This paper provides an overview of the current situation of HIV/AIDS patients in the Kingdom of Saudi Arabia (KSA) and a literature review of the concepts of stigma communication and the communication of social support. These concepts provide the basis for the proposed methods, which will include conducting a textual analysis of materials that are currently distributed to family members of people living with HIV/AIDS (PLWHIV/A) in KSA and creating an educational brochure. The brochure will aim to help families of PLWHIV/A in KSA (1) understand how stigma shapes the experience of PLWHIV/A, (2) realize the role of positive communication as a helpful form of social support, and (3) develop the ability to provide positive social support for their loved ones.
Keywords: HIV/AIDS, Saudi Arabia, social support, stigma communication
Procedia PDF Downloads 285
1632 Public Spending and Economic Growth: An Empirical Analysis of Developed Countries
Authors: Bernur Acikgoz
Abstract:
The purpose of this paper is to investigate the effects of public spending on economic growth and examine the sources of economic growth in developed countries since the 1990s. The paper analyses the effect of public spending on economic growth based on a Cobb-Douglas production function, using two econometric models, Autoregressive Distributed Lag (ARDL) and Dynamic Fixed Effects (DFE), for 21 developed countries (high-income OECD countries) over the period 1990-2013. The results of the two models are consistent with each other, and both support the conclusion that public spending plays an important role in economic growth. This result is in line with theory and previous empirical studies.
Keywords: public spending, economic growth, panel data, ARDL models
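The Cobb-Douglas starting point can be illustrated with synthetic data: taking logs linearizes the production function, so the output elasticities, including that of public spending, can be recovered by least squares. This sketch substitutes plain OLS on noiseless made-up data for the ARDL and DFE panel estimators actually used in the paper; all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from Y = A * K^a * L^b * G^g, where G is public spending
A, a, b, g = 2.0, 0.3, 0.5, 0.15
K = rng.uniform(50, 150, size=200)   # capital
L = rng.uniform(20, 80, size=200)    # labour
G = rng.uniform(5, 40, size=200)     # public spending
Y = A * K**a * L**b * G**g

# Taking logs linearizes the production function, so the output
# elasticities can be recovered by ordinary least squares.
X = np.column_stack([np.ones_like(K), np.log(K), np.log(L), np.log(G)])
coef, *_ = np.linalg.lstsq(X, np.log(Y), rcond=None)
lnA_hat, a_hat, b_hat, g_hat = coef
```

With noiseless data the true elasticities are recovered exactly; the ARDL and DFE specifications in the paper additionally handle dynamics and the panel structure of real country data.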
Procedia PDF Downloads 370
1631 Reliability Based Topology Optimization: An Efficient Method for Material Uncertainty
Authors: Mehdi Jalalpour, Mazdak Tootkaboni
Abstract:
We present a computationally efficient method for reliability-based topology optimization under material property uncertainty, where the material properties are assumed to be lognormally distributed and correlated within the domain. Computational efficiency is achieved by estimating the response statistics with second-order stochastic perturbation, using these statistics to fit an appropriate distribution that follows the empirical distribution of the response, and employing an efficient gradient-based optimizer. The proposed algorithm is utilized for the design of new structures, and the changes in the optimized topology are discussed for various levels of target reliability and correlation strength. Predictions were verified through comparison with results obtained using Monte Carlo simulation.
Keywords: material uncertainty, stochastic perturbation, structural reliability, topology optimization
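The idea behind second-order stochastic perturbation, and its Monte Carlo verification, can be shown on a scalar toy problem: a compliance-like response g(x) = 1/x of a lognormal property. The structural problem in the paper is far richer; this sketch is only illustrative.

```python
import math
import random

# Toy response: g(x) = 1/x, e.g. a compliance-like quantity that varies
# inversely with a stiffness-like lognormal material property x.
g = lambda x: 1.0 / x
g2 = lambda x: 2.0 / x**3            # second derivative of g

mean_x, std_x = 1.0, 0.1             # moments of the lognormal property
var_ln = math.log(1.0 + (std_x / mean_x) ** 2)
mu_ln = math.log(mean_x) - 0.5 * var_ln

# Second-order stochastic perturbation: E[g(X)] ~ g(mu) + 0.5*g''(mu)*Var(X)
e_perturb = g(mean_x) + 0.5 * g2(mean_x) * std_x**2

# Monte Carlo verification
random.seed(42)
e_mc = sum(g(random.lognormvariate(mu_ln, math.sqrt(var_ln)))
           for _ in range(200_000)) / 200_000
```

The perturbation estimate costs two derivative evaluations, while the Monte Carlo check needs hundreds of thousands of samples; that cost gap is the source of the computational efficiency claimed above.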
Procedia PDF Downloads 605
1630 Factors Affecting Mobile Internet Adoption in an Emerging Market
Authors: Maha Mourad, Fady Todros
Abstract:
The objective of this research is to find an explanatory model that defines the most important variables and factors affecting the acceptance of mobile Internet in the Egyptian market. A qualitative exploratory study was conducted to support the conceptual framework, followed by quantitative research in the form of a survey distributed among 411 respondents. It was clear that relative advantage, complexity, compatibility, perceived price level and perceived playfulness have a dominant role in influencing consumers to adopt mobile internet, while observability is correlated with adoption but loses its value when measured with the other factors. Perceived price level, as well as compatibility, has a negative relationship with adoption.
Keywords: innovation, Egypt, communication technologies, diffusion, innovation adoption, emerging market
Procedia PDF Downloads 452
1629 Acceptance of Big Data Technologies and Its Influence towards Employee’s Perception on Job Performance
Authors: Jia Yi Yap, Angela S. H. Lee
Abstract:
With the use of big data technologies, organizations can get the results that they are interested in. Big data technologies simply load all the data that is useful for the organization and provide a better way of analysing it. The purpose of this research is to gather employees' opinions from firms in Malaysia in order to explore the use of big data technologies in their organizations and how it may affect employees' perception of job performance. Therefore, in order to identify whether accepting big data technologies in the organization affects this perception, a questionnaire will be distributed to employees from different small and medium-sized enterprises (SMEs) listed in Malaysia. The proposed conceptual model will be tested with other variables in order to see the relationships between them.
Keywords: big data technologies, employee, job performance, questionnaire
Procedia PDF Downloads 298
1628 Investigation of Steel Infill Panels under Blast Impulsive Loading
Authors: Seyed M. Zahrai, Saeid Lotfi
Abstract:
If an infill panel does not have enough ductility against the loading, it breaks and is damaged before it can dissipate energy and transfer load. As steel infill panels have appropriate ductility before fracture, they can be used as an alternative to typical infill panels under blast loading. Given sufficient ductility in the out-of-plane behavior of the infill panel, the impact force enters the horizontal diaphragm and is distributed among the lateral elements, which can be made from steel infill panels. This article investigates the behavior of steel infill panels with different thicknesses and stiffeners using finite element analysis with geometric and material nonlinearities, optimizing the steel plate thickness and stiffener arrangement to obtain a more efficient design for out-of-plane behavior.
Keywords: blast loading, ductility, maximum displacement, steel infill panel
Procedia PDF Downloads 277
1627 Mechanochemical Synthesis of Al2O3/Mo Nanocomposite Powders from Molybdenum Oxide
Authors: Behrooz Ghasemi, Bahram Sharijian
Abstract:
Al2O3/Mo nanocomposite powders were successfully synthesized by mechanical milling through a mechanochemical reaction between MoO3 and Al. The structural evolution of the powder particles during mechanical milling was studied by X-ray diffractometry (XRD), energy dispersive X-ray spectroscopy (EDX) and scanning electron microscopy (SEM). Results show that Al2O3-Mo was completely formed after 5 hr of milling. The crystallite sizes of Al2O3 and Mo after milling for 20 hr were about 45 nm and 23 nm, respectively. With longer milling time, the intensities of the Al2O3 and Mo peaks decreased and became broad due to the decrease in crystallite size. Morphological features of the powders were influenced by the milling time. The resulting Al2O3-Mo nanocomposite powder exhibited an average particle size of 200 nm after 20 hr of milling. Also, the nanocomposite powder milled for 10 hr had a relatively equiaxed shape with a uniformly distributed Mo phase in the Al2O3 matrix.
Keywords: Al2O3/Mo, nanocomposites, mechanochemical, mechanical milling
Procedia PDF Downloads 368
1626 Applicability of Linearized Model of Synchronous Generator for Power System Stability Analysis
Authors: J. Ritonja, B. Grcar
Abstract:
For synchronous generator simulation and analysis, and for power system stabilizer design and synthesis, a mathematical model of the synchronous generator is needed. The model has to describe the dynamics of oscillations accurately, while at the same time being transparent enough for analysis and sufficiently simplified for control system design. To study the oscillations of the synchronous generator against the rest of the power system, a model of the synchronous machine connected to an infinite bus through a transmission line having resistance and inductance is needed. In this paper, the linearized reduced-order dynamic model of the synchronous generator connected to the infinite bus is presented and analysed in detail. This model accurately describes the dynamics of the synchronous generator only in a small vicinity of an equilibrium state. With digression from the selected equilibrium point, the accuracy of this model decreases considerably. In this paper, the equations and parameter determination for the linearized reduced-order mathematical model of the synchronous generator are explained and summarized, and represent a useful origin for work in the areas of synchronous generator dynamic behaviour analysis and control system design and synthesis. The main contribution of this paper is the detailed analysis of the accuracy of the linearized reduced-order dynamic model over the entire operating range of the synchronous generator. The borders of the areas where the linearized reduced-order mathematical model represents an accurate description of the synchronous generator's dynamics are determined by systematic numerical analysis. A thorough eigenvalue analysis of the linearized models over the entire operating range is performed. In the paper, the parameters of the linearized reduced-order dynamic model of a laboratory salient-pole synchronous generator were determined and used for the analysis.
The theoretical conclusions were confirmed by the agreement between experimental and simulation results.
Keywords: eigenvalue analysis, mathematical model, power system stability, synchronous generator
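The kind of eigenvalue analysis described can be sketched for the classical linearized swing dynamics of a generator against an infinite bus. The parameters below are illustrative per-unit values, not those of the laboratory machine in the paper.

```python
import numpy as np

# Linearized swing dynamics of a generator against an infinite bus:
# M*delta'' + D*delta' + K*delta = 0, with K the synchronizing-torque
# coefficient at the chosen equilibrium point.
M, D, K = 4.0, 0.2, 1.5   # illustrative per-unit values

# State-space form, states = [rotor angle deviation, speed deviation]
A = np.array([[0.0, 1.0],
              [-K / M, -D / M]])

eigs = np.linalg.eigvals(A)  # a stable, lightly damped oscillatory pair
```

Because K varies with the operating point, repeating this eigenvalue computation across the operating range is exactly how the borders of validity and damping margins discussed above are mapped out.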
Procedia PDF Downloads 245
1625 Block Mining: Block Chain Enabled Process Mining Database
Authors: James Newman
Abstract:
Process mining is an emerging technology that serializes enterprise data into time-series data. It has been used by many companies and has been the subject of a variety of research papers. However, the majority of current efforts have looked at how best to create process mining from standard relational databases. This paper is a first pass at outlining a database custom-built for a minimal viable product of process mining. We present Block Miner, a blockchain protocol to store process mining data across a distributed network, and demonstrate the feasibility of storing process mining data on the blockchain. We present a proof of concept and show how the intersection of these two technologies helps to solve a variety of issues, including but not limited to ransomware attacks, tax documentation, and conflict resolution.
Keywords: blockchain, process mining, memory optimization, protocol
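The tamper-evidence property that makes a blockchain attractive for storing process-mining event logs can be sketched with a minimal hash chain. This is a simplification that omits consensus and distribution, and the field names are hypothetical, not Block Miner's actual schema.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, events):
    """Append a block of process-mining events, linked by the predecessor's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "prev_hash": prev, "events": events})

def verify_chain(chain):
    """Valid iff every block records the hash of the block before it."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, [{"case": 7, "activity": "create_order", "ts": "09:00"}])
append_block(chain, [{"case": 7, "activity": "approve", "ts": "10:30"}])

ok_before = verify_chain(chain)                       # True
chain[0]["events"][0]["activity"] = "delete_order"    # simulated tampering
ok_after = verify_chain(chain)                        # False
```

Rewriting any historical event breaks every subsequent link, which is what makes such a log resistant to the ransomware and dispute scenarios mentioned above.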
Procedia PDF Downloads 102
1624 Assessment of Environmental and Socio-Economic Impact of Quarrying in Ebonyi State South East Nigeria: A Case Study of Umuoghara Quarry Community
Authors: G. Aloh Obianuju, C. Chukwu Kelvin, Henry Aloh
Abstract:
The study was undertaken to assess the environmental and socio-economic impact of quarrying in the Umuoghara quarrying community of Ebonyi State, South East Nigeria. Questionnaires were distributed targeting quarry workers and people living within the community, and personal interviews with other key informants were also conducted; all of these were used as data-gathering instruments. The study reveals that quarrying activities have brought some benefits to the community as well as marked environmental impacts. Recommendations that can assist in mitigating these adverse impacts are suggested.
Keywords: environment, quarrying, environmental degradation, mitigation
Procedia PDF Downloads 308
1623 Influence of Probiotics on Dairy Cows Diet
Authors: V. A. Vieira, M. P. Sforcini, V. Endo, G. C. Magioni, M. D. S. Oliveira
Abstract:
The main goal of this paper was to evaluate the effect of diets containing different levels of probiotic on the performance and milk composition of lactating cows. Eight Holstein cows were distributed in two 4x4 Latin squares. The diets were based on corn silage, concentrate and the treatment (0, 3, 6 or 9 grams of probiotic/animal/day). Dry matter intake of nutrients, milk yield and milk composition were evaluated. The use of probiotics did not affect nutrient intake (p>0.05) nor daily milk production, either as measured or corrected to 4% fat (p>0.05). However, a significant decline in milk composition was observed at higher levels of probiotic supplementation. These results emphasize the need for further studies with different experimental designs, or a greater number of Latin squares with longer adaptation periods.
Keywords: dairy cow, milk composition, probiotics, daily milk production
Procedia PDF Downloads 261
1622 New Segmentation of Piecewise Moving-Average Model by Using Reversible Jump MCMC Algorithm
Authors: Suparman
Abstract:
This paper addresses the problem of signal segmentation within a Bayesian framework by using the reversible jump MCMC algorithm. The signal is modelled by a piecewise constant Moving-Average (MA) model where the number of segments, the positions of the change-points, and the order and coefficients of the MA model for each segment are unknown. The reversible jump MCMC algorithm is then used to generate samples distributed according to the joint posterior distribution of the unknown parameters. These samples allow calculating some interesting features of the posterior distribution. The performance of the methodology is illustrated via several simulation results.
Keywords: piecewise, moving-average model, reversible jump MCMC, signal segmentation
Procedia PDF Downloads 227
1621 Subjective Temporal Resources: On the Relationship Between Time Perspective and Chronic Time Pressure to Burnout
Authors: Diamant Irene, Dar Tamar
Abstract:
Burnout, conceptualized within the framework of stress research, is to a large extent a result of a threat on resources of time or a feeling of time shortage. In reaction to numerous tasks, deadlines, high output, management of different duties encompassing work-home conflicts, many individuals experience ‘time pressure’. Time pressure is characterized as the perception of a lack of available time in relation to the amount of workload. It can be a result of local objective constraints, but it can also be a chronic attribute in coping with life. As such, time pressure is associated in the literature with general stress experience and can therefore be a direct, contributory burnout factor. The present study examines the relation of chronic time pressure – feeling of time shortage and of being rushed, with another central aspect in subjective temporal experience - time perspective. Time perspective is a stable personal disposition, capturing the extent to which people subjectively remember the past, live the present and\or anticipate the future. Based on Hobfoll’s Conservation of Resources Theory, it was hypothesized that individuals with chronic time pressure would experience a permanent threat on their time resources resulting in relatively increased burnout. In addition, it was hypothesized that different time perspective profiles, based on Zimbardo’s typology of five dimensions – Past Positive, Past Negative, Present Hedonistic, Present Fatalistic, and Future, would be related to different magnitudes of chronic time pressure and of burnout. We expected that individuals with ‘Past Negative’ or ‘Present Fatalist’ time perspectives would experience more burnout, with chronic time pressure being a moderator variable. Conversely, individuals with a ‘Present Hedonistic’ - with little concern with the future consequences of actions, would experience less chronic time pressure and less burnout. 
Another angle of temporal experience examined in this study is the difference between the actual distribution of time (as in a typical day) and the desired distribution of time (how time would optimally be distributed during a day). It was hypothesized that there would be a positive correlation between the gap between these two time distributions and both chronic time pressure and burnout. Data were collected through an online self-report survey distributed on social networks, with 240 participants (aged 21-65) recruited through convenience and snowball sampling from various organizational sectors. The results of the present study support the hypotheses and constitute a basis for future debate regarding the elements of burnout in the modern work environment, with an emphasis on subjective temporal experience. Our findings point to the importance of chronic and stable temporal experiences, such as time pressure and time perspective, in occupational experience. The findings are also discussed with a view to the development of practical methods of burnout prevention.
Keywords: conservation of resources, burnout, time pressure, time perspective
Procedia PDF Downloads 176
1620 Determination of the Cooling Rate Dependency of High Entropy Alloys Using a High-Temperature Drop-on-Demand Droplet Generator
Authors: Saeedeh Imani Moqadam, Ilya Bobrov, Jérémy Epp, Nils Ellendt, Lutz Mädler
Abstract:
High entropy alloys (HEAs), having adjustable properties and enhanced stability compared with intermetallic compounds, are solid-solution alloys that contain five or more principal elements in almost equal atomic percentages. The concept of producing such alloys paves the way for developing advanced materials with unique properties. However, the synthesis of such alloys may require advanced processes with high cooling rates, depending on which alloying elements are used. In this study, HEA microspheres of different diameters were generated via a drop-on-demand droplet generator and subsequently solidified during free fall in an argon atmosphere. Such droplet generators can produce individual droplets with high reproducibility regarding droplet diameter, trajectory, and cooling, while avoiding any interparticle momentum or thermal coupling. Metallography as well as X-ray diffraction investigations for each diameter of the generated metallic droplets were then carried out to obtain information about the microstructural state. To calculate the cooling rate of the droplets, a droplet cooling model was developed and validated using model alloys such as CuSn6 and AlCu4.5, for which the correlation of secondary dendrite arm spacing (SDAS) and cooling rate is well known. Droplets were generated from these alloys and their SDAS was determined using quantitative metallography. The cooling rate was then determined from the SDAS and used to validate the cooling rates obtained from the droplet cooling model. Applying the model to the HEAs then yields the cooling rate dependency and hence the identification of process windows for the synthesis of these alloys.
These process windows were then compared with the cooling rates obtained in processes such as powder production, spray forming, selective laser melting, and casting to predict whether synthesis is possible with these processes.
Keywords: cooling rate, drop-on-demand, high entropy alloys, microstructure, single droplet generation, X-ray diffractometry
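The SDAS-based validation step rests on the well-known empirical power law linking secondary dendrite arm spacing to cooling rate, roughly λ₂ = a·R⁻ⁿ. A minimal sketch of inverting that relation, with purely illustrative constants (the true coefficients are alloy-specific and must be fitted experimentally), might look like:

```python
def cooling_rate_from_sdas(sdas_um, a=50.0, n=1.0 / 3.0):
    """Invert the empirical correlation lambda2 = a * R**(-n) to estimate
    the cooling rate R (K/s) from a measured secondary dendrite arm
    spacing (micrometres). The constants a and n are illustrative only."""
    return (a / sdas_um) ** (1.0 / n)

# Halving the arm spacing implies an eightfold higher cooling rate
# for the classic exponent n = 1/3.
coarse = cooling_rate_from_sdas(50.0)  # -> 1.0 K/s with these constants
fine = cooling_rate_from_sdas(25.0)    # -> 8.0 K/s with these constants
```

The exponent n ≈ 1/3 is a common literature value for dendritic solidification; the paper's fitted coefficients may differ.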
Procedia PDF Downloads 211
1619 Effect of MPPT and THD in Grid-Connected Photovoltaic System
Authors: Sajjad Yahaghifar
Abstract:
Since the end of the last century, renewable energy sources have gained prominence, not only because they reduce dependence on fossil fuels, but mainly for environmental reasons related to climate change and its effects on humanity. Consequently, solar energy has been arousing interest in several countries as a clean technology with reduced environmental impact. The output power of photovoltaic (PV) arrays is always changing with weather conditions, i.e., solar irradiation and atmospheric temperature. Therefore, maximum power point tracking (MPPT) control, which extracts maximum power from the PV arrays in real time, becomes indispensable in PV generation systems. This paper studies MPPT and total harmonic distortion (THD) for a grid-connected PV system operating as distributed generation in the city of Tabriz, Iran.
Keywords: MPPT, THD, grid-connected, PV system
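As an illustration of the MPPT control the abstract refers to, here is a minimal sketch of the classic perturb-and-observe rule, one of several common MPPT algorithms (the abstract does not specify which one the paper uses, and the step size here is illustrative):

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
    """One iteration of perturb-and-observe MPPT: keep moving the
    operating voltage in the direction that increased power, and
    reverse when power fell. Returns the next voltage setpoint."""
    dp = p - p_prev
    dv = v - v_prev
    if dp == 0:
        return v                  # at (or oscillating around) the MPP
    if (dp > 0) == (dv > 0):
        return v + step           # last perturbation helped: continue
    return v - step               # power dropped: reverse direction
```

In a real controller this runs every sampling period, with v and p measured at the array terminals; smaller steps reduce steady-state ripple at the cost of slower tracking.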
Procedia PDF Downloads 398
1618 Transmission Line Protection Challenges under High Penetration of Renewable Energy Sources and Proposed Solutions: A Review
Authors: Melake Kuflom
Abstract:
European power networks use multiple overhead transmission lines to construct a highly redundant system that delivers reliable and stable electrical energy to the distribution level. The transmission line protection schemes applied in the existing GB transmission network are normally independent unit differential and time-stepped distance protection, referred to as main-1 and main-2 respectively, with overcurrent protection as a backup. The increasing penetration of renewable energy sources, commonly referred to as 'weak sources', into the power network has resulted in a decline in fault levels. Traditionally, the fault level of the GB transmission network has been strong; hence the fault current contribution is more than sufficient to ensure the correct operation of the protection schemes. However, numerous conventional coal and nuclear generators have been, or are about to be, shut down due to the societal requirement for CO2 emission reduction, and this has reduced the fault level on some transmission lines; adaptive transmission line protection is therefore required. Generally, greater utilization of renewable energy sources generated from wind or direct solar energy reduces CO2 emissions and can increase system security and reliability, but it lowers the fault level, which has an adverse effect on protection. Consequently, the effectiveness of conventional protection schemes under low fault levels needs to be reviewed, particularly for future GB transmission network operating scenarios. The paper evaluates the transmission line protection challenges under high penetration of renewable energy sources and provides viable alternative protection solutions based on the problems observed. The paper considers the assessment of renewable energy sources (RES) based on fully rated converter technology.
The DIgSILENT PowerFactory software tool will be used to model the network.
Keywords: fault level, protection schemes, relay settings, relay coordination, renewable energy sources
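The fault-level decline described above can be illustrated with the standard three-phase fault current relation I_f = V_LL / (√3 · Z_s). A toy calculation, with purely illustrative voltage and impedance values (not taken from the paper), shows how a weak, converter-dominated source shrinks the current available for the relays to detect:

```python
import math

def fault_current_ka(v_kv_ll, z_source_ohm):
    """Three-phase bolted fault current (kA) at a bus, given the
    line-to-line voltage (kV) and the Thevenin source impedance (ohm)."""
    return v_kv_ll / (math.sqrt(3) * z_source_ohm)

# A converter-dominated ('weak') source presents a higher effective
# impedance than a conventional synchronous plant, so the same fault
# draws far less current -- values below are illustrative only.
strong = fault_current_ka(400.0, 10.0)  # conventional generation
weak = fault_current_ka(400.0, 40.0)    # high-RES scenario
```

Distance and overcurrent elements set for the strong-source current may then fail to pick up under the weak-source condition, which is the coordination problem the review addresses.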
Procedia PDF Downloads 206
1617 Evaluation of Sustainable Business Model Innovation in Increasing the Penetration of Renewable Energy in the Ghana Power Sector
Authors: Victor Birikorang Danquah
Abstract:
Ghana's primary energy supply is heavily reliant on petroleum, biomass, and hydropower. Currently, Ghana gets its electricity from hydropower (Akosombo and Bui), thermal power plants fuelled by crude oil, natural gas, and diesel, solar power, and imports from La Cote d'Ivoire. Until the early 2000s, large hydroelectric dams dominated Ghana's electricity generation. Due to unreliable weather patterns, Ghana increased its reliance on thermal power. Thermal power now contributes the highest percentage of electricity generation in Ghana and is predominantly supplied by Independent Power Producers (IPPs). Ghana's electricity industry operates the corporate utility model as its business model. This model is typically 'vertically integrated', with a single corporation selling the majority of power generated by its generation assets to its retail business, which then sells the electricity to retail market consumers. The corporate utility model has a straightforward value proposition based on increasing the number of energy units sold. This unit-volume business model drives the entire energy value chain to increase throughput, locking system users into unsustainable practices. This study uses a qualitative research approach to explore the electricity industry in Ghana. There is a need to increase renewable energy, such as wind and solar, in electricity generation. The research recommends two critical business models for the penetration of renewable energy in Ghana's power sector. The first is the peer-to-peer electricity trading model, which relies on a software platform to connect consumers and generators so that they can trade energy directly with one another.
The second model encourages local energy generation, incentivizes optimal time-of-use behaviour, and allows any financial gains to be shared among community members.
Keywords: business model innovation, electricity generation, renewable energy, solar energy, sustainability, wind energy
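A hypothetical sketch of the matching step such a peer-to-peer trading platform might perform. Greedy merit-order matching with a mid-price split is one common textbook choice; the data shapes, names, and pricing rule here are illustrative assumptions, not taken from the study:

```python
def match_p2p(offers, bids):
    """Greedy merit-order matching: cheapest generation offers are paired
    with the highest-paying consumer bids until prices cross. Both inputs
    are lists of (price_per_kwh, kwh) tuples; returns executed trades as
    (clearing_price, kwh), splitting the surplus at the mid-price."""
    offers = sorted(offers)                 # cheapest sellers first
    bids = sorted(bids, reverse=True)       # highest-paying buyers first
    trades = []
    while offers and bids and offers[0][0] <= bids[0][0]:
        (op, oq), (bp, bq) = offers[0], bids[0]
        q = min(oq, bq)
        trades.append(((op + bp) / 2, q))   # mid-price splits the surplus
        offers[0] = (op, oq - q)
        bids[0] = (bp, bq - q)
        if offers[0][1] == 0:
            offers.pop(0)
        if bids[0][1] == 0:
            bids.pop(0)
    return trades
```

Real platforms layer settlement, network charges, and forecasting on top of this core matching loop.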
Procedia PDF Downloads 181
1616 Counterfeit Product Detection Using Block Chain
Authors: Sharanya C. H., Pragathi M., Vathsala R. S., Theja K. V., Yashaswini S.
Abstract:
Identifying counterfeit products has become increasingly important to product manufacturing industries in recent decades, as this ongoing counterfeiting problem impacts company sales and profits. To address this issue, a functional blockchain technology was implemented that effectively prevents products from being counterfeited. With blockchain technology, consumers no longer need to rely on third parties to determine the authenticity of a purchased product. A blockchain is a distributed database that stores data records known as blocks, linked together in chains and replicated across various network nodes. Counterfeit products are identified using a QR code reader: the product's QR code is linked to the blockchain management system, which compares the unique code obtained from the customer with the stored unique code to determine whether or not the product is original.
Keywords: blockchain, ethereum, QR code
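A toy stand-in for the check described above, replacing the on-chain lookup with an in-memory set of hashed codes. The class and method names are hypothetical; the actual system stores the codes via an Ethereum-backed contract rather than a local set:

```python
import hashlib

class ProductRegistry:
    """Minimal sketch of the register-then-verify flow: the manufacturer
    registers each product's unique code, and a scanned QR payload is
    later checked against the stored hashes."""

    def __init__(self):
        self._hashes = set()

    def register(self, unique_code: str) -> None:
        # Store only a digest, so the registry never exposes raw codes.
        self._hashes.add(hashlib.sha256(unique_code.encode()).hexdigest())

    def is_genuine(self, scanned_code: str) -> bool:
        return hashlib.sha256(scanned_code.encode()).hexdigest() in self._hashes

registry = ProductRegistry()
registry.register("SN-0001")          # manufacturer side
registry.is_genuine("SN-0001")        # customer scan -> True
registry.is_genuine("SN-9999")        # unknown code -> False
```

Putting the digest set on-chain is what removes the trusted third party: anyone can verify a scan against the public ledger.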
Procedia PDF Downloads 177
1615 Multiscale Edge Detection Based on Nonsubsampled Contourlet Transform
Authors: Enqing Chen, Jianbo Wang
Abstract:
It is well known that the wavelet transform provides a very effective framework for multiscale edge analysis. However, wavelets are not very effective at representing images containing distributed discontinuities such as edges. In this paper, we propose a novel multiscale edge detection method in the nonsubsampled contourlet transform (NSCT) domain, which exploits the dominant multiscale, multidirectional edge representation and outstanding edge localization of the NSCT. Experiments on real images demonstrate that the proposed method outperforms edge detection methods based on the Canny operator, wavelets, and contourlets. Additionally, the proposed method also works well for noisy images.
Keywords: edge detection, NSCT, shift invariant, modulus maxima
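While the NSCT itself is far more elaborate, the modulus-maxima idea in the keywords rests on a simple quantity: the per-pixel gradient modulus, whose local maxima across scales mark edge locations. A minimal finite-difference sketch of that quantity (illustrative only, not the paper's NSCT-domain computation):

```python
def gradient_modulus(img):
    """Per-pixel gradient modulus |grad I| via forward finite
    differences on a 2-D list of grey values. In modulus-maxima edge
    detection, edges are the local maxima of this map along the
    gradient direction, tracked across scales."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]   # horizontal difference
            gy = img[y + 1][x] - img[y][x]   # vertical difference
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

The NSCT replaces these crude differences with shift-invariant, directional band-pass responses, which is what makes its maxima better localized on curved edges.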
Procedia PDF Downloads 488
1614 A Contribution to Blockchain Privacy
Authors: Malika Yaici, Feriel Lalaoui, Lydia Belhoul
Abstract:
As a new distributed peer-to-peer (P2P) technology, blockchain has become a very broad field of research addressing various challenges, including privacy preservation, as in all other technologies. In this work, we study existing solutions to privacy problems in general and in blockchains in particular. User anonymity and transaction confidentiality are the two main challenges in protecting privacy in blockchains. Mixing mechanisms and cryptographic solutions address this problem but remain subject to attacks and suffer from shortcomings. Taking these imperfections and the synthesis of our study into account, we present a mixing model without trusted third parties, based on group signatures, that reinforces user anonymity and transaction confidentiality with minimal turnaround time and no mixing fees.
Keywords: anonymity, blockchain, mixing coins, privacy
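A toy illustration of the unlinkability property a coin mixer targets: equal-value deposits go in, and payouts come back in a shuffled order, so an outside observer cannot pair inputs with outputs. This omits the group signatures and all on-chain logic the abstract actually proposes; names and structure are illustrative assumptions:

```python
import random

def mix_round(deposits, seed=None):
    """Toy mixing round. `deposits` is a list of (address, value)
    tuples with a uniform denomination (mixers require equal values,
    otherwise amounts would re-link inputs to outputs). Returns the
    payouts in a shuffled order."""
    rng = random.Random(seed)
    payout_addrs = [addr for addr, _ in deposits]
    rng.shuffle(payout_addrs)            # breaks the input/output pairing
    value = deposits[0][1]
    return [(addr, value) for addr in payout_addrs]
```

In the proposed scheme, group signatures let each participant prove membership in the mixing group without revealing which deposit was theirs, removing the trusted mixer this toy version implicitly relies on.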
Procedia PDF Downloads 12
1613 Cluster Analysis of Customer Churn in Telecom Industry
Authors: Abbas Al-Refaie
Abstract:
The research examines the factors that affect customer churn (CC) in the Jordanian telecom industry. A total of 700 surveys were distributed. Cluster analysis revealed three main clusters. Results showed that CC and customer satisfaction (CS) were the key determinants in forming the three clusters. In two clusters, the center values of CC were high, indicating that the customers were loyal and that the switching cost (SC) was expensive as well as time- and energy-consuming. Still, the mobile service provider (MSP) should enhance its communication (COM) and value-added services (VASs), as well as its customer complaint management system (CCMS). Finally, for the third cluster, the center value of CC indicates a poor level of loyalty, which facilitates customer churn to another MSP. The results of this study provide valuable feedback for MSP decision makers regarding approaches to improving their performance and reducing CC.
Keywords: cluster analysis, telecom industry, switching cost, customer churn
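The clustering step can be illustrated with a minimal one-dimensional k-means. This is a sketch only: the study's actual analysis grouped respondents over several survey variables at once, and the values below are illustrative:

```python
def kmeans_1d(values, centers, iters=10):
    """Minimal 1-D k-means: repeatedly assign each value to its
    nearest center, then move each center to the mean of its group.
    Returns the final centers and groupings."""
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            groups[i].append(v)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

# Two well-separated score groups converge to their means.
centers, groups = kmeans_1d([1, 2, 9, 10], [0.0, 5.0])
```

With survey data, each cluster center (here a single score, in the study a vector of CC, CS, and related variables) is what gets interpreted as a customer segment.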
Procedia PDF Downloads 323