Search results for: data mining techniques
26608 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches
Authors: Wuttigrai Ngamsirijit
Abstract:
Talent management in today’s modern organizations has become data-driven due to the demand for objective human resource decision making and the development of analytics technologies. HR managers face several obstacles in exploiting data and information to reach effective talent management decisions. These include process-based data and records; insufficient human capital-related measures and metrics; a lack of capability in strategic data modeling; and the time consumed in aggregating numbers and making decisions. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures, and metrics regarding strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, gaps in managing talent and the organization, and ways to develop optimized talent strategies.
Keywords: decision making, human capital analytics, talent management, talent value chain
Procedia PDF Downloads 187
26607 Nondestructive Evaluation of Hidden Delamination in Glass Fiber Composite Using Terahertz Spectroscopy
Authors: Chung-Hyeon Ryu, Do-Hyoung Kim, Hak-Sung Kim
Abstract:
As the use of composites has increased, methods for detecting hidden damage that affects composite performance have become important. Terahertz (THz) spectroscopy has been assessed as one of the new and powerful nondestructive evaluation (NDE) techniques for fiber reinforced composite structures because it has many advantages that can overcome the limitations of conventional NDE techniques such as X-rays or ultrasound. The THz wave offers a noninvasive, noncontact, and nonionizing means of evaluating composite damage, and it also gives a broad range of information about the material properties. In addition, it enables the detection of multiple delaminations in various nonmetallic materials. In this study, a pulsed THz spectroscopy imaging system was devised and used for detecting and evaluating hidden delamination in glass fiber reinforced plastic (GFRP) composite laminates. The interaction between THz waves and the GFRP composite was analyzed with respect to the type of delamination, including thickness, size, and the number of overlaps among multiple delaminations in the through-thickness direction. Both transmission and reflection configurations were used for the evaluation of hidden delaminations, and THz wave propagation through the delaminations is also discussed. Various hidden delaminations inside the GFRP composite were successfully detected using the time-domain THz spectroscopy imaging system, and the results were compared to those of C-scan inspection. It is expected that the THz NDE technique will be widely used to evaluate the reliability of composite structures.
Keywords: terahertz, delamination, glass fiber reinforced plastic composites, terahertz spectroscopy
Procedia PDF Downloads 592
26606 Geographic Information System Based Multi-Criteria Subsea Pipeline Route Optimisation
Authors: James Brown, Stella Kortekaas, Ian Finnie, George Zhang, Christine Devine, Neil Healy
Abstract:
The use of GIS as an analysis tool for engineering decision making is now best practice in the offshore industry. GIS enables multidisciplinary data integration, analysis and visualisation, which allows the presentation of large and intricate datasets in a simple map interface accessible to all project stakeholders. Presenting integrated geoscience and geotechnical data in GIS enables decision makers to be well-informed. This paper is a successful case study of how GIS spatial analysis techniques were applied to help select the most favourable pipeline route. Routing a pipeline through any natural environment faces numerous obstacles, whether topographical, geological, engineering or financial. Where the pipeline is subjected to external hydrostatic water pressure and is carrying pressurised hydrocarbons, the requirement to safely route the pipeline through hazardous terrain becomes absolutely paramount. This study illustrates how the application of modern, GIS-based pipeline routing techniques enabled the identification of a single most favourable pipeline route crossing of a challenging seabed terrain. Conventional approaches to pipeline route determination focus on manual avoidance of primary constraints whilst endeavouring to minimise route length. Such an approach is qualitative, subjective and liable to bias towards the discipline and expertise involved in the routing process. For very short routes traversing benign seabed topography in shallow water this approach may be sufficient, but for deepwater geohazardous sites an automated, multi-criteria, quantitative approach is essential. This study combined multiple routing constraints using modern least-cost-routing algorithms deployed in GIS, hitherto unachievable with conventional approaches. The least-cost-routing procedure begins with the assignment of geocost across the study area. Geocost is defined as a numerical penalty score representing the hazard posed by each routing constraint (e.g. slope angle, rugosity, vulnerability to debris flows) to the pipeline. All geocosted routing constraints are combined to generate a composite geocost map that is used to compute the least-geocost route between two defined terminals. The analyses were applied to select the most favourable pipeline route for a potential gas development in deep water. The study area is geologically complex, with a series of incised, potentially active canyons carved into a steep escarpment and evidence of extensive debris flows. A similar debris flow in the future could cause significant damage to a poorly placed pipeline. Protruding inter-canyon spurs offer lower-gradient options for ascending the escarpment, but their vulnerability to periodic failure is not well understood. Close collaboration between geoscientists, pipeline engineers, geotechnical engineers and the gas export pipeline operator guided the analyses and the assignment of geocosts. Shorter route length, less severe slope angles, and geohazard avoidance were the primary drivers in identifying the most favourable route.
Keywords: geocost, geohazard, pipeline route determination, pipeline route optimisation, spatial analysis
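To make the least-cost-routing procedure concrete, the sketch below builds a composite geocost surface from individual constraint rasters and computes a least-cost path between two terminals with scikit-image. The layers, weights, and grid values are illustrative assumptions, not the study's data.

```python
import numpy as np
from skimage.graph import route_through_array  # MCP-based least-cost path

# Illustrative geocost layers (values are hypothetical penalty scores).
slope_cost = np.random.rand(200, 300) * 10      # e.g. derived from slope angle
rugosity_cost = np.random.rand(200, 300) * 5    # e.g. derived from seabed rugosity
debris_cost = np.random.rand(200, 300) * 20     # e.g. vulnerability to debris flows

# Composite geocost map: a weighted sum of individual constraint layers.
# The weights here are assumptions for illustration, not the study's values.
composite = 0.4 * slope_cost + 0.2 * rugosity_cost + 0.4 * debris_cost

# Least-geocost route between two defined terminals (row, col indices).
start, end = (10, 5), (190, 290)
path, total_geocost = route_through_array(
    composite, start, end, fully_connected=True, geometric=True
)
print(f"Route has {len(path)} cells, total geocost {total_geocost:.1f}")
```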
Procedia PDF Downloads 406
26605 Impact of Infrastructural Development on Socio-Economic Growth: An Empirical Investigation in India
Authors: Jonardan Koner
Abstract:
The study attempts to find out the impact of infrastructural investment on state economic growth in India. It further tries to determine the magnitude of the impact of infrastructural investment on an economic indicator, i.e., per-capita income (PCI), in Indian states. The study uses the panel regression technique to measure this impact, since panel regression incorporates both the cross-section and time-series aspects of the dataset. In order to analyze the difference in the impact of the explanatory variables on the explained variable across states, the study uses a fixed effect panel regression model. We analyze annual time series data ranging from 1991 to 2010. The conclusions of the study are that infrastructural investment has a desirable impact on economic development and that the impact differs across states in India. The study reveals that infrastructural investment significantly explains the variation in economic indicators.
Keywords: infrastructural investment, multiple regression, panel regression techniques, economic development, fixed effect dummy variable model
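For readers who want to see the modelling idea in code, the sketch below fits a fixed effect (least-squares dummy variable) panel regression on synthetic data with statsmodels. The variable names (pci, infra) and all numbers are hypothetical stand-ins for the study's state-level panel.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
states = [s for s in "ABCDE" for _ in range(20)]   # 5 states x 20 years
years = list(range(1991, 2011)) * 5
infra = rng.gamma(2.0, 50.0, size=100)             # infrastructural investment
state_effect = dict(zip("ABCDE", rng.normal(0, 200, 5)))
pci = (500 + 3.5 * infra                            # per-capita income
       + np.array([state_effect[s] for s in states])
       + rng.normal(0, 50, 100))

df = pd.DataFrame({"state": states, "year": years, "infra": infra, "pci": pci})

# Least-squares dummy variable (LSDV) form of the fixed-effects model:
# C(state) adds one intercept dummy per state.
fe_model = smf.ols("pci ~ infra + C(state)", data=df).fit()
print(fe_model.params["infra"])  # estimated impact of investment on PCI
```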
Procedia PDF Downloads 371
26604 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structures, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object’s memberships in all clusters must be equal to one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has been applied to the fuzzy c-means clustering technique; it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where in our optimization problem we aim to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that our proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
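A minimal sketch of relative-entropy-regularized fuzzy c-means follows, assuming the common formulation in which the entropy term turns the membership update into a normalized exponential of negative squared distances with a regularization parameter gamma; the paper's exact objective may differ.

```python
import numpy as np

def entropy_fcm(X, k, gamma=1.0, n_iter=100, seed=0):
    """Entropy-regularized fuzzy c-means (illustrative sketch).

    The entropy term replaces the usual fuzzifier m: memberships become
    a softmax over negative squared distances, scaled by gamma.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Squared distances between every point and every center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        # Entropy-regularized membership update (each row sums to one).
        u = np.exp(-d2 / gamma)
        u /= u.sum(axis=1, keepdims=True)
        # Center update: membership-weighted means.
        centers = (u.T @ X) / u.sum(axis=0)[:, None]
    return u, centers

# Two synthetic Gaussian clusters.
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 4])
u, centers = entropy_fcm(X, k=2, gamma=2.0)
```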
Procedia PDF Downloads 259
26603 Fabrication of Textile-Based Radio Frequency Metasurfaces
Authors: Adria Kajenski, Guinevere Strack, Edward Kingsley, Shahriar Khushrushahi, Alkim Akyurtlu
Abstract:
Radio frequency (RF) metasurfaces are arrangements of subwavelength elements interacting with electromagnetic radiation. These arrangements affect the polarization state, amplitude, and phase of impinging radio waves; for example, metasurface designs are used to produce functional passband and stopband filters. Recent advances in additive manufacturing techniques have enabled the low-cost, rapid fabrication of ultra-thin metasurface elements on flexible substrates such as plastic films, paper, and textiles. Furthermore, scalable manufacturing processes promote the integration of fabric-based RF metasurfaces into the market of sensors and devices within the Internet of Things (IoT). The design and fabrication of metasurfaces on textiles require a multidisciplinary team with expertise in i) textile and materials science, ii) metasurface design and simulation, and iii) metasurface fabrication and testing. In this presentation, we discuss RF metasurfaces on fabric with an emphasis on how the materials, including fabrics and inks, along with fabrication techniques, affect the RF performance. We printed metasurfaces using a direct-write approach onto various woven and non-woven fabrics, as well as onto fabrics coated with either thermoplastic or thermoset coatings. Our team also performed a range of tests on the printed structures, including different inks and their curing parameters, wash durability, abrasion resistance, and RF performance over time.
Keywords: electronic textiles, metasurface, printed electronics, flexible
Procedia PDF Downloads 195
26602 Environmental Education for Sustainable Development in Bangladesh and Its Challenges
Authors: Md. Kamal Uddin
Abstract:
Bangladesh is trying to achieve the Sustainable Development Goals (SDGs) by 2030. Environmental education (EE) is vital to reaching the agenda of the SDGs. However, a lack of environmental awareness and gaps between theoretical knowledge and its practice still exist in Bangladesh. Therefore, this research aims to understand students’ perceptions of whether and how their behaviour is environment-friendly in the context of achieving the SDGs. It also addresses teachers’ perceptions of the shortcomings of environmental education in Bangladesh. It uses qualitative and quantitative techniques of data collection and analysis based on in-depth interviews, surveys among different categories of participants, and classroom observation. The paper finds that the level of EE and students’ awareness of the environment is inadequate. Some teachers believe that EE is weak in Bangladesh due to the absence of practical learning in EE, lack of motivation and action, institutional weakness, inadequate policies, poor implementation, and cultural and traditional beliefs. Thus, this paper argues that Bangladeshi EE is not adequate to change the behaviour of students towards the environment, which makes it difficult for the country to ensure sustainable development. This research therefore suggests revising the environmental education policy to change the behaviour and structure of the country for sustainable development.
Keywords: environmental education, sustainable development, environmental practice, environmental behaviour, Bangladesh
Procedia PDF Downloads 198
26601 Sampled-Data Model Predictive Tracking Control for Mobile Robot
Authors: Wookyong Kwon, Sangmoon Lee
Abstract:
In this paper, a sampled-data model predictive tracking control method is presented for mobile robots, which are modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality (LMI) approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.
Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV
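As a pointer to the kind of condition an LMI-based design verifies, the sketch below checks the basic discrete-time Lyapunov inequality A'PA - P < 0 by solving the corresponding Lyapunov equation with SciPy. The dynamics matrix is hypothetical, and the paper's sampled-data LPV and input-delay conditions are considerably more involved.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Stability certificate: find P > 0 with A'PA - P < 0. Solving the
# discrete Lyapunov equation A'PA - P = -Q for a chosen Q > 0 and
# checking P > 0 is the analytic counterpart of that LMI condition.
A = np.array([[0.9, 0.2],
              [0.0, 0.8]])   # hypothetical sampled (discretized) dynamics

P = solve_discrete_lyapunov(A.T, np.eye(2))  # solves A'PA - P + Q = 0
stable = np.all(np.linalg.eigvalsh(P) > 0)   # P positive definite?
print("Lyapunov certificate found:", stable)
```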
Procedia PDF Downloads 309
26600 Risk Assessment of Natural Gas Pipelines in Coal Mined Gobs Based on Bow-Tie Model and Cloud Inference
Authors: Xiaobin Liang, Wei Liang, Laibin Zhang, Xiaoyan Guo
Abstract:
Pipelines inevitably pass through coal mined gobs in mining areas, and the stability of these gobs has a great influence on pipeline safety. After an extensive literature study and field research, it was found that few risk assessment methods exist for coal mined gob pipelines and that data on the gob sites are lacking. Therefore, the fuzzy comprehensive evaluation method based on expert opinions is widely used. However, the subjective opinions or lack of experience of individual experts may lead to inaccurate evaluation results, so the accuracy of the results needs to be further improved. This paper presents a comprehensive approach to achieve this purpose by combining the bow-tie model and cloud inference. The specific evaluation process is as follows. First, a bow-tie model composed of a fault tree and an event tree is established to graphically illustrate the probability and consequence indicators of pipeline failure. Second, expert judgments are elicited as interval estimates to improve the accuracy of the results, and the censored mean algorithm is used to remove the maximum and minimum scores to improve the stability of the results; the golden section method is used to determine the weights of the indicators and reduce the subjectivity of the index weights. Third, the failure probability and failure consequence scores of the pipeline are converted into three numerical features by using cloud inference, which better describes the ambiguity and volatility of the risk level. Finally, cloud drop graphs of failure probability and failure consequences can be drawn, intuitively and accurately illustrating the ambiguity and randomness of the results. A case study of a coal mine gob pipeline carrying natural gas was investigated to validate the utility of the proposed method. The evaluation results of this case show that the probability of pipeline failure is very low while the consequences of failure are serious, which is consistent with reality.
Keywords: bow-tie model, natural gas pipeline, coal mine gob, cloud inference
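The cloud drop graphs mentioned above come from the normal cloud model, which characterizes a score by three numerical features: expectation (Ex), entropy (En), and hyper-entropy (He). A minimal sketch of the forward cloud generator follows; the parameter values are hypothetical.

```python
import numpy as np

def cloud_drops(Ex, En, He, n=2000, seed=0):
    """Generate drops of a normal cloud model (Ex, En, He).

    Ex: expectation; En: entropy (ambiguity); He: hyper-entropy (randomness).
    Each drop is a sample x with an associated membership degree mu(x).
    """
    rng = np.random.default_rng(seed)
    En_prime = rng.normal(En, He, n)          # perturbed entropy per drop
    x = rng.normal(Ex, np.abs(En_prime))      # drop positions
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))  # membership degrees
    return x, mu

# Hypothetical failure-probability score expressed as a cloud (Ex, En, He).
x, mu = cloud_drops(Ex=2.1, En=0.4, He=0.05)
```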
Procedia PDF Downloads 250
26599 Isolation, Preparation and Biological Properties of Soybean-Flaxseed Protein Co-Precipitates
Authors: Muhammad H. Alu’datt, Inteaz Alli
Abstract:
This study was conducted to prepare and evaluate the biological properties of protein co-precipitates from flaxseed and soybean. Protein was prepared by NaOH extraction through the mixing of soybean flour (Sf) and flaxseed flour (Ff), or of mixtures of soybean extract (Se) and flaxseed extract (Fe). The protein co-precipitates were precipitated by isoelectric (IEP) and isoelectric-heating (IEPH) co-precipitation techniques. The effects of the extraction and co-precipitation techniques on co-precipitate yield were investigated. Native-PAGE and SDS-PAGE were used for molecular characterization. The content and antioxidant activity of extracted free and bound phenolic compounds were evaluated for the protein co-precipitates. Removal of free and bound phenolic compounds from the protein co-precipitates showed little effect on the electrophoretic behavior of the proteins or protein subunits. The results showed that the highest protein content and yield were obtained for the Sf-Ff/IEP co-precipitate, with values of 53.28% and 25.58%, respectively, compared to the protein isolates and other co-precipitates. The results also revealed that Sf-Ff/IEP had a higher content of bound phenolic compounds (53.49% of the total phenolic content) than of free phenolic compounds (46.51% of the total phenolic content). The antioxidant activities of bound phenolic compounds extracted with and without heat treatment from Sf-Ff/IEPH were higher than those of free phenolic compounds extracted from the other protein co-precipitates (29.68% and 22.84%, respectively).
Keywords: antioxidant, phenol, protein co-precipitate, yield
Procedia PDF Downloads 240
26598 A State-Of-The-Art Review on Web Services Adaptation
Authors: M. Velasco, D. While, P. Raju, J. Krasniewicz, A. Amini, L. Hernandez-Munoz
Abstract:
Web service adaptation involves the creation of adapters that solve Web service incompatibilities known as mismatches. Since the importance of Web service adaptation is increasing because of the frequent implementation and use of online Web services, this paper presents a literature review investigating the main methods of adaptation, their theoretical underpinnings, and the metrics used to measure adapter performance. Eighteen publications were reviewed independently by two researchers. We found that adaptation techniques are needed to solve different types of problems that may arise due to incompatibilities in Web service interfaces, including protocols, messages, data, and semantics, which affect the interoperability of the services. Although adapters are non-invasive methods that can improve Web service interoperability, and there are current approaches for service adaptation, there is not yet one solution that fits all types of mismatches. Our results also show that only a few research projects incorporate theoretical frameworks and that metrics to measure adapter performance are very limited. We conclude that further research on software adaptation should improve current adaptation methods in the different layers of service interoperability and that an adaptation framework incorporating a theoretical underpinning and measures of qualitative and quantitative performance needs to be created.
Keywords: Web services adapters, software adaptation, web services mismatches, web services interoperability
Procedia PDF Downloads 294
26597 Imputation of Urban Movement Patterns Using Big Data
Authors: Eusebio Odiari, Mark Birkin, Susan Grant-Muller, Nicolas Malleson
Abstract:
Big data typically refers to consumer datasets revealing detailed heterogeneity in human behavior, which, if harnessed appropriately, could potentially revolutionize our understanding of the collective phenomena of the physical world. Inadvertent missing values skew these datasets and compromise the validity of analyses built on them. Here we discuss a conceptually consistent strategy for identifying other relevant datasets to combine with available big data, to plug the gaps and to create the rich, comprehensive dataset required for subsequent analysis. Specifically, the emphasis is on how these methodologies can, for the first time, enable the construction of more detailed pictures of passenger demand and the drivers of mobility on the railways. These methodologies can predict the influence of changes within the network (such as a change of timetable or the impact of a new station), explain local phenomena outside the network (such as rail-heading), and capture other impacts of urban morphology. Our analysis also reveals that our new imputation data model provides for more equitable revenue sharing amongst the network operators who manage different parts of the integrated UK railways.
Keywords: big data, micro-simulation, mobility, ticketing data, commuters, transport, synthetic, population
Procedia PDF Downloads 231
26596 Electrodeposition and Selenization of CuIn Alloys for the Synthesis of Photoactive CuIn1-xGaxSe2 (CIGS) Thin Films
Authors: Mohamed Benaicha, Mahdi Allam
Abstract:
A new two-stage electrochemical process is studied as a safe, large-area, and low-cost technique for the production of semiconducting CuInSe2 (CIS) thin films. CuIn precursors were first potentiostatically electrodeposited onto molybdenum substrates from an acidic thiocyanate electrolyte. In a second stage, the prepared metallic CuIn layers were used as substrates in the selenium electrochemical deposition system and subjected to a thermal treatment in a vacuum atmosphere to eliminate binary phases through the reaction of the Cu2-xSe and InxSey selenides, leading to the formation of a CuInSe2 thin film. Electrochemical selenization from an aqueous electrolyte is introduced as an alternative to the toxic and hazardous H2Se or Se vapor-phase selenization used in physical techniques. In this study, the influence of film deposition parameters such as bath composition, temperature, and potential on film properties was studied. The electrochemical, morphological, structural, and compositional properties of the electrodeposited thin films were characterized using various techniques. Results of cyclic and stripping cyclic voltammetry (CV, SCV), scanning electron microscopy (SEM), and energy dispersive X-ray microanalysis (EDX) investigations revealed good reproducibility and homogeneity of the film composition. Thereby, optimal technological parameters for the electrochemical production of CuIn and Se as precursors for CuInSe2 thin layers were determined.
Keywords: photovoltaic, CIGS, copper alloys, electrodeposition, thin films
Procedia PDF Downloads 464
26595 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling raw data from data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for the different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder, which aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, granting data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the “bundle of rights” theory, it establishes specific three-level data rights and analyzes the cases Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN, and Imerman v Tchenquiz. The paper concludes that recognizing property rights over personal data and protecting data within the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information.
Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 39
26594 Downscaling GRACE Gravity Models Using Spectral Combination Techniques for Terrestrial Water Storage and Groundwater Storage Estimation
Authors: Farzam Fatolazadeh, Kalifa Goita, Mehdi Eshagh, Shusen Wang
Abstract:
The Gravity Recovery and Climate Experiment (GRACE) is a satellite mission with twin satellites for the precise determination of spatial and temporal variations in the Earth’s gravity field. The products of this mission are monthly global gravity models containing spherical harmonic coefficients and their errors. These GRACE models can be used to estimate terrestrial water storage (TWS) variations across the globe at large scales, thereby offering an opportunity for surface water and groundwater storage (GWS) assessments. Yet the ability of GRACE to monitor changes at smaller scales is too limited for local water management authorities. This is largely due to the low spatial and temporal resolutions of its models (~200,000 km² and one month, respectively). High-resolution GRACE data products would substantially enrich the information needed by local-scale decision-makers while providing data for regions that lack adequate in situ monitoring networks, including the northern parts of Canada. Such products could eventually be obtained through downscaling. In this study, we extended spectral combination theory to simultaneously downscale GRACE in space and time, from the coarse 3° spatial resolution to 0.25° and from the coarse monthly resolution to daily resolution. This method combines the monthly gravity field solutions of GRACE and daily hydrological model products, in the form of both low- and high-frequency signals, to produce high spatiotemporal resolution TWSA and GWSA products. The main contribution and originality of this study is to comprehensively and simultaneously consider GRACE and hydrological variables and their uncertainties in forming the estimator in the spectral domain. It is therefore expected that the downscaled products will reach an acceptable accuracy.
Keywords: GRACE satellite, groundwater storage, spectral combination, terrestrial water storage
Procedia PDF Downloads 83
26593 The Influence of Housing Choice Vouchers on the Private Rental Market
Authors: Randy D. Colon
Abstract:
Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households were obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records found through the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected at community meetings in the Chicago Englewood neighborhood through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, this qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings.
Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market
Procedia PDF Downloads 118
26592 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach, the dynamic metadata schema, within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique for unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards enhances data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
Keywords: metadata, FAIR, data analysis, XPCS, IoT
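As an illustration of what a dynamic metadata schema might look like in practice, the sketch below extends a base record definition with XPCS-specific fields at run time and validates a record against the combined schema. All field names and values are hypothetical, not a published standard.

```python
# A base schema that an experiment can extend dynamically at acquisition time.
base_schema = {
    "sample_id": str,
    "facility": str,
    "technique": str,
    "acquisition_time": str,
}

xpcs_extension = {                # fields added dynamically for an XPCS run
    "detector_frame_rate_hz": float,
    "q_range_inv_nm": list,
    "correlation_scheme": str,
}

def validate(record, schema):
    # Check that every declared field is present with the declared type.
    return all(isinstance(record.get(field), ftype)
               for field, ftype in schema.items())

run = {
    "sample_id": "S-042", "facility": "PETRA III", "technique": "XPCS",
    "acquisition_time": "2023-05-01T12:00:00",
    "detector_frame_rate_hz": 2000.0,
    "q_range_inv_nm": [0.01, 0.1],
    "correlation_scheme": "multi-tau",
}
print(validate(run, {**base_schema, **xpcs_extension}))  # True
```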
Procedia PDF Downloads 62
26591 Exploring the Relationship among Job Stress, Travel Constraints, and Job Satisfaction of the Employees in Casino Hotels: The Case of Macau
Authors: Tao Zhang
Abstract:
Job stress appears nearly everywhere, especially in the hospitality industry, because employees in this industry usually have to work long hours and meet the conflicting demands of their customers, managers, and company. To reduce job stress, employees of casino hotels try to perform leisure activities or travel. However, casino employees often meet many obstacles or constraints when they plan to travel. Until now, there has been little understanding of why casino hotel employees often face many travel constraints or leisure barriers. Moreover, few studies explore the relationship between the travel constraints and job stress of casino employees. Therefore, this study explores the construct of casino hotel employees' travel constraints and the relationships among job stress, travel constraints, and job satisfaction. Using a convenience sampling method, this study investigated 500 front-line employees and managers of ten casino hotels in Macau. A total of 500 questionnaires were distributed, and 414 valid questionnaires were received, a valid return rate of 82.8%. Several statistical techniques, such as factor analysis, t-tests, one-way ANOVA, and regression analysis, were applied to analyze the collected data. The findings of this study are as follows. First, using factor analysis, this study found that the travel constraints of casino employees include intrapersonal constraints, interpersonal constraints, and structural constraints. Second, using regression analysis, the study found that travel constraints are positively related to job stress and negatively related to job satisfaction. This means that reducing travel constraints may create chances for casino employees to travel, reducing their job stress and therefore raising their job satisfaction. Third, this research divided the samples into three groups by degree of job satisfaction: a low satisfaction group, a medium satisfaction group, and a high satisfaction group. The mean values of these groups were compared by t-test. Results showed significant differences in the mean values of interpersonal constraints between the low satisfaction group and the high satisfaction group. This suggests that positive interpersonal relationships, especially good family relationships, reduce not only the job stress but also the travel constraints of casino employees. Interestingly, the t-test results showed no significant difference in the mean values of structural constraints between the low satisfaction group and the high satisfaction group. This suggests that structural constraints are external variables that may be related to tourism destination marketing. Destination marketing organizations (DMOs) need to use all kinds of tools and techniques to promote their tourism destinations so as to reduce the structural constraints of casino employees. This research is significant for both theoretical and practical fields. From the theoretical perspective, the study found the internal relationships between travel constraints, job stress, and job satisfaction, and the different roles of the three dimensions of travel constraints. From the practical perspective, the study provides useful methods to reduce travel constraints and job stress and, therefore, raise the job satisfaction of casino employees.
Keywords: hotel, job satisfaction, job stress, travel constraints
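A minimal sketch of the group-comparison step follows: an independent-samples t-test on interpersonal-constraint scores for the low- and high-satisfaction groups, using SciPy. The data are synthetic; the study's actual scores and group sizes differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical interpersonal-constraint scores (Likert-style means per
# respondent) for the low- and high-satisfaction groups.
low_sat = rng.normal(3.4, 0.8, 140)
high_sat = rng.normal(3.0, 0.8, 140)

t, p = stats.ttest_ind(low_sat, high_sat)
print(f"t = {t:.2f}, p = {p:.4f}")  # significant if p < 0.05
```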
Procedia PDF Downloads 251
26590 Optimization of Hot Metal Charging Circuit in a Steel Melting Shop Using Industrial Engineering Techniques for Achieving Manufacturing Excellence
Authors: N. Singh, A. Khullar, R. Shrivastava, I. Singh, A. S. Kumar
Abstract:
Steel forms the basis of any modern society and is essential to economic growth. India’s annual crude steel production has seen a consistent increase over the past years and is poised to grow to 300 million tons per annum by 2030-31 from the current level of 110-120 million tons per annum. The steel industry is highly capital-intensive, and to remain competitive it is imperative that it invests in operational excellence. Due to the inherent nature of the industry, there is a large amount of variability in its supply chain, both internally and externally. The production and productivity of a steel plant are greatly affected by the bottlenecks present in material flow logistics. The internal logistics, consisting of the transport of liquid metal within a steel melting shop (SMS), present an opportunity to increase throughput with marginal capital investment. The study was carried out at an integrated steel plant located in the eastern part of India; the plant has three SMSs, and the study was carried out at one of them. The objective of this study was to identify means to optimize SMS hot metal logistics through the application of industrial engineering techniques. The study also covered the identification of non-value-added activities and proposed methods to eliminate delays and improve the throughput of the SMS.
Keywords: optimization, steel making, supply chain, throughput enhancement, workforce productivity
Procedia PDF Downloads 118
26589 Ambulatory Care Utilization of Individuals with Cerebral Palsy in Taiwan - A Country with Universal Coverage and No Gatekeeper Regulation
Authors: Ming-Juei Chang, Hui-Ing Ma, Tsung-Hsueh Lu
Abstract:
Introduction: Because of advances in medical care (e.g., ventilation techniques and gastrostomy feeding), more and more children with CP live to adulthood. However, little is known about the use of health care services by individuals with CP from childhood to adulthood. The patterns of utilization of ambulatory care are heavily influenced by insurance coverage and primary care gatekeeper regulation. The purpose of this study was to examine patterns of ambulatory care utilization among individuals with CP in Taiwan, a country with universal coverage and no gatekeeper regulation. Methods: A representative sample of one million patients (about 1/23 of the total population) covered by Taiwan’s National Health Insurance was used to analyze ambulatory care utilization by individuals with CP. Data from 2000 to 2003 were analyzed for three age groups (children, youth, and adults). Participants were identified by the presence of a CP diagnosis made by pediatricians or physicians of physical and rehabilitation medicine and stated at least three times in claims data. Results: Annual rates of outpatient physician visits per 1000 persons were 31680 for children, 16492 for youth, and 28617 for adults with CP. Individuals with CP received over 50% of their outpatient care from hospital outpatient departments. Higher use of specialist physician services was found in children (54.7%) than in the other two age groups (28.4% in youth and 18.8% in adults). Diseases of the respiratory system were the most frequent diagnoses for visits in both children and youth with CP. Diseases of the circulatory system were the main reasons (24.3%) that adults with CP visited hospital outpatient departments or clinics. Conclusion: This study showed different patterns of ambulatory care utilization among different age groups. It appears that youth and adults with CP continue to have complex health issues and rely heavily on the health care system. Additional studies are needed to determine the factors that influence ambulatory care utilization among individuals with CP.
Keywords: cerebral palsy, health services, lifespan, universal coverage
Procedia PDF Downloads 374
26588 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns
Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim
Abstract:
Whether data have been well parallelized is an important factor in solid-state drive (SSD) performance. SSD parallelization is affected by the allocation scheme, which is directly connected to SSD performance. Representative allocation schemes include dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write operation parallelism, while static allocation is better for read operation parallelism. Therefore, it is hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated several mixed data patterns and analyzed the results to help guide the right choice for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and the environment is continuously read-intensive. Dynamic allocation performs best on write performance and random data patterns.
Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation
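A toy illustration of the two schemes follows, assuming a simplified channel-level model: static allocation maps a logical block address to a fixed channel, while dynamic allocation assigns whatever channel is next in rotation at write time. Real SSD firmware is far more elaborate.

```python
import itertools

N_CH = 4  # hypothetical number of parallel channels

def static_channel(lba):
    # Static allocation: channel fixed by logical address,
    # so later reads of an address always hit the same channel.
    return lba % N_CH

def make_dynamic_allocator():
    # Dynamic allocation: writes go to the next channel in turn,
    # maximizing write parallelism regardless of the address written.
    rr = itertools.cycle(range(N_CH))
    mapping = {}
    def write(lba):
        mapping[lba] = next(rr)
        return mapping[lba]
    return write, mapping

write, mapping = make_dynamic_allocator()
for lba in [0, 0, 1, 7, 8]:
    write(lba)
print(mapping)            # dynamic placement depends on write order
print(static_channel(8))  # static placement depends only on the address
```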
Procedia PDF Downloads 339
26587 The Evolution of Man through Cranial and Dental Remains: A Literature Review
Authors: Rishana Bilimoria
Abstract:
Darwin’s insightful theory of evolution drove mankind’s understanding of our existence in the natural world. Scientists consider the analysis of dental and craniofacial remains to be pivotal in uncovering facts about our evolutionary journey. The resilient mineral content of enamel and dentine allows cranial and dental remains to be preserved for millions of years, making them an excellent resource not only in anthropology but in other fields of research, including forensic dentistry. This literature review takes a chronological approach to each ancestral species, reviewing Australopithecus, Paranthropus, Homo habilis, Homo rudolfensis, Homo erectus, Homo neanderthalensis, and finally Homo sapiens. Studies included in the review assess the features of cranio-dental remains that are of evolutionary importance, such as microstructure, microwear, morphology, and jaw biomechanics. The article discusses the plethora of analysis techniques employed to study dental remains, including carbon dating, dental topography, confocal imaging, DPI scanning, and light microscopy, in addition to microwear study and the analysis of features such as coronal and root morphology, mandibular corpus shape, craniofacial anatomy, and microstructure. Furthermore, results from these studies provide insight into the diet, lifestyle, and consequently the ecological surroundings of each species. We can correlate dental fossil evidence with wider theories on pivotal global events to help contextualize each species in space and time. Examples include dietary adaptation during the period of global cooling that converted the landscape of Africa from forest to grassland; global migration ‘out of Africa’ demonstrated by enamel thickness variation; cranial vault variation over time demonstrating accommodation of larger brain sizes; and dental wear patterns placing the commencement of lithic technology in history. This literature review concludes that dental evidence plays a major role in painting a phenotypic and well-rounded picture of species of the Homo genus, in particular through the analysis of coronal morphology, carbon dating, and dental wear analysis. With regard to analysis techniques, while studies require larger sample sizes, this may be unrealistic given the limitations on retrieving fossil data. We cannot deny the reliability of carbon dating; however, there is certainly scope for the use of more recent techniques, and further evidence of their success is required.
Keywords: cranio-facial, dental remains, evolution, hominids
Procedia PDF Downloads 165
26586 Reverse Engineering of a Secondary Structure of a Helicopter: A Study Case
Authors: Jose Daniel Giraldo Arias, Camilo Rojas Gomez, David Villegas Delgado, Gullermo Idarraga Alarcon, Juan Meza Meza
Abstract:
Reverse engineering processes are widely used in industry with the main goal of determining the materials and manufacturing processes used to produce a component. Many characterization techniques and computational tools are used to obtain this information. A case study of reverse engineering applied to a secondary sandwich-hybrid structure used in a helicopter is presented. The methodology consists of five main steps, which can be applied to any similar component: collecting information about the service conditions of the part; disassembly and dimensional characterization; functional characterization; material property characterization; and manufacturing process characterization. This allows all the traceability records of the materials and processes of aeronautical products that ensure their airworthiness to be obtained. A detailed explanation of each step is covered. The criticality and functionality of each part were analyzed, drawing on information on the state of the art and information obtained from interviews with the technical groups of the helicopter’s operators. 3D optical scanning, standard and advanced materials characterization techniques, and finite element simulation allowed all the characteristics of the materials used in the manufacture of the component to be obtained. It was found that most of the materials are quite common in the aeronautical industry, including Kevlar, carbon, and glass fibers, aluminum honeycomb core, epoxy resin, and epoxy adhesive. The stacking sequence and volumetric fiber fraction are critical issues for the mechanical behavior; an acid digestion method was used for their determination. This also helped in determining the manufacturing technique, which in this case was vacuum bagging. Samples of the material were manufactured and submitted to the required mechanical and environmental tests. These results were compared with those obtained during reverse engineering, which allows the conclusion that the materials and manufacturing process were correctly determined. Tooling for the manufacture was designed and manufactured according to the geometry and manufacturing process requirements. The part was manufactured, and the required mechanical and environmental tests were performed. Finally, geometric characterization and non-destructive techniques allowed the quality of the part to be verified.
Keywords: reverse engineering, sandwich-structured composite parts, helicopter, mechanical properties, prototype
Procedia PDF Downloads 418
26585 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels
Authors: Ahmed Mahmoud Ahmed Abouelmagd
Abstract:
Diversity is the usual remedy for transmitted signal level variations (fading phenomena) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine those signals to combat fading. The basic concept of diversity is to transmit the signal via several independent diversity branches to get independent signal replicas via the time, frequency, space, and polarization diversity domains. Coding and decoding processes can be an alternative remedy for fading phenomena; they cannot increase the channel capacity, but they can improve the error performance. In this paper we propose the use of replication decoding with the BCH code class, and the Viterbi decoding algorithm with convolutional coding, as examples of coding and decoding processes. The results are compared to those obtained from two optimized selection space diversity techniques. The performance of the Rayleigh fading channel, as the model considered for radio frequency channels, is evaluated for each case. The evaluation results show that the coding and decoding approaches, especially the BCH coding approach with the replication decoding scheme, give better performance than the selection space diversity optimization approaches. An approach combining the coding and decoding diversity with space diversity is also considered; its main disadvantage is complexity, but it yields good performance results.
Keywords: Rayleigh fading, diversity, BCH codes, replication decoding, convolutional coding, Viterbi decoding, space diversity
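A minimal simulation of the space-diversity baseline follows: BPSK over independent Rayleigh branches with selection combining, compared against a single branch. Parameters are illustrative; the paper's optimized selection schemes and coded systems are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bits, n_branches, snr_db = 200_000, 2, 10
snr = 10 ** (snr_db / 10)

bits = rng.integers(0, 2, n_bits)
s = 2 * bits - 1  # BPSK symbols

# Independent Rayleigh fading per diversity branch plus complex AWGN.
h = (rng.normal(size=(n_branches, n_bits)) +
     1j * rng.normal(size=(n_branches, n_bits))) / np.sqrt(2)
noise = (rng.normal(size=(n_branches, n_bits)) +
         1j * rng.normal(size=(n_branches, n_bits))) / np.sqrt(2 * snr)
r = h * s + noise

# Selection diversity: pick the branch with the strongest channel gain,
# then detect coherently using the channel estimate.
best = np.argmax(np.abs(h), axis=0)
idx = np.arange(n_bits)
detected = (np.real(np.conj(h[best, idx]) * r[best, idx]) > 0).astype(int)

print("BER, 2-branch selection diversity:", np.mean(detected != bits))
print("BER, single branch:",
      np.mean(((np.real(np.conj(h[0]) * r[0]) > 0).astype(int)) != bits))
```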
Procedia PDF Downloads 443
26584 Social Data Aggregator and Locator of Knowledge (STALK)
Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat
Abstract:
Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for unnecessary manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various social media profiles, eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of various profiles; instead, it provides consolidated information derived from the subject’s posts and other activities. It also allows analysis over multiple profiles and analytics based on several profiles. We strive to provide a query system that gives a natural language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the different use cases it serves.
Keywords: social network, analysis, Facebook, LinkedIn, git, big data
Procedia PDF Downloads 444
26583 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates
Authors: Rima Shishakly, Mervyn Misajon
Abstract:
Education is similar to other businesses and industries. Achieving data integrity is essential in order to provide significant support for all the stakeholders in the educational sector. Efficient data collection, flow, processing, storage, and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted ‘Education 2020’, a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor the students’ data and information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools, clarifies the data integrity issues, and indicates the challenges that face private schools in the UAE.
Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), Ministry of Education (MOE), Knowledge and Human Development Authority (KHDA), Abu Dhabi Education Council (ADEC)
Procedia PDF Downloads 222
26582 Greatly Improved Dielectric Properties of Poly(vinylidene fluoride) Nanocomposites Using Ag-BaTiO₃ Hybrid Nanoparticles as Filler
Authors: K. Silakaew, P. Thongbai
Abstract:
There is an increasing need for high-permittivity polymer-matrix composites (PMC) owing to the rapid development of the electronics industry. Unfortunately, the dielectric permittivity of PMCs is still too low (ε′ < 80). Moreover, the dielectric loss tangent is usually high (tanδ > 0.1) when the dielectric permittivity of a PMC is increased. In this research work, the dielectric properties of poly(vinylidene fluoride) (PVDF)-based nanocomposites were significantly improved by incorporating silver–BaTiO3 (Ag–BT) ceramic hybrid nanoparticles. The Ag–BT/PVDF nanocomposites were fabricated using various volume fractions of Ag–BT hybrid nanoparticles (fAg–BT = 0–0.5) and characterized using several techniques. The main phases of Ag and BT can be detected by XRD. The microstructure of the Ag–BT/PVDF nanocomposites was investigated to reveal the dispersion of the Ag–BT hybrid nanoparticles, because the dispersion state of a filler can affect the dielectric properties of the nanocomposites; it was found that the filler hybrid nanoparticles were well dispersed in the PVDF matrix. The formation of the PVDF phases was identified using XRD and FTIR techniques. We found that the fillers can increase the polar phase of the PVDF polymer. The fabricated Ag–BT/PVDF nanocomposites were systematically characterized to explain their dielectric behavior. Interestingly, a largely enhanced dielectric permittivity (ε′ > 240) and a suppressed loss tangent (tanδ < 0.08) over a wide frequency range (10² – 10⁵ Hz) were obtained. Notably, the dielectric permittivity is only slightly dependent on temperature. The greatly enhanced dielectric permittivity was explained by interfacial polarization at the Ag–PVDF interface and by the high permittivity of the BT particles.
Keywords: BaTiO3, PVDF, polymer composite, dielectric properties
Procedia PDF Downloads 193
26581 Social Semantic Web-Based Analytics Approach to Support Lifelong Learning
Authors: Khaled Halimi, Hassina Seridi-Bouchelaghem
Abstract:
The purpose of this paper is to describe how learning analytics approaches based on social semantic web techniques can be applied to enhance lifelong learning experiences from a connectivist perspective. For this reason, a prototype of a system called SoLearn (Social Learning Environment) that supports this approach was developed. We observed and studied the literature related to lifelong learning systems, the social semantic web and ontologies, connectivism theory, and learning analytics approaches, and reviewed implemented systems based on these fields to extract and draw conclusions about the features necessary for enhancing the lifelong learning process. The semantic analytics of learning can be used for viewing, studying, and analysing the massive data generated by learners, which helps them to understand their learning and behaviour through recommendations, charts, and figures, and to detect where they have weaknesses or limitations. This paper emphasises that implementing a learning analytics approach based on social semantic web representations can enhance the learning process. On the one hand, the analysis process leverages the meaning expressed by the semantics presented in the ontology (relationships between concepts); on the other hand, it exploits the discovery of new knowledge by means of the inference mechanisms of the semantic web.
Keywords: connectivism, learning analytics, lifelong learning, social semantic web
Procedia PDF Downloads 215
26580 Detecting Memory-Related Gene Modules in sc/snRNA-seq Data by Deep-Learning
Authors: Yong Chen
Abstract:
Understanding the detailed molecular mechanisms of memory formation in engram cells is one of the most fundamental questions in neuroscience. Recent single-cell RNA-seq (scRNA-seq) and single-nucleus RNA-seq (snRNA-seq) techniques have allowed us to explore the sparsely activated engram ensembles, enabling access to the molecular mechanisms that underlie experience-dependent memory formation and consolidation. However, the absence of specific and powerful computational methods to detect memory-related genes (modules) and their regulatory relationships in sc/snRNA-seq datasets has strictly limited the analysis of the underlying mechanisms and memory coding principles in mammalian brains. Here, we present a deep-learning method named SCENTBOX to detect memory-related gene modules and the causal regulatory relationships among them from sc/snRNA-seq datasets. SCENTBOX first constructs a co-differential expression gene network (CEGN) from case versus control sc/snRNA-seq datasets. It then detects highly correlated modules of differential expression genes (DEGs) in the CEGN. Deep network embedding and attention-based convolutional neural network strategies are employed to precisely detect regulatory relationships among the DEGs in a module. We applied the method to scRNA-seq datasets of TRAP;Ai14 mouse neurons with fear memory and detected not only known memory-related genes but also modules and potential causal regulations. Our results provide novel regulations within an interesting module, including Arc, Bdnf, Creb, Dusp1, Rgs4, and Btg2. Overall, our method provides a general computational tool for processing sc/snRNA-seq data from case versus control studies and a systematic investigation of fear-memory-related gene modules.
Keywords: sc/snRNA-seq, memory formation, deep learning, gene module, causal inference
Procedia PDF Downloads 120
26579 Towards a Balancing Medical Database by Using the Least Mean Square Algorithm
Authors: Kamel Belammi, Houria Fatrim
Abstract:
Imbalanced data sets, a problem often found in real-world applications, can have a seriously negative effect on the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between the classes, with not enough samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, together with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbor (KNN), and multilayer neural networks (MNN)) to the balanced data set. We also compare the results obtained before and after the balancing method.
Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes
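A minimal sketch of the cost-sensitive LMS idea follows, assuming the common rule of thumb of weighting each sample's error by the inverse frequency of its class; the paper's exact weighting rules may differ.

```python
import numpy as np

def cost_sensitive_lms(X, y, lr=0.01, epochs=20):
    """Cost-sensitive LMS sketch: errors on the rare class receive a
    larger weight (inverse class frequency), a common rule of thumb."""
    classes, counts = np.unique(y, return_counts=True)
    class_w = {c: len(y) / (len(classes) * n) for c, n in zip(classes, counts)}
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - xi @ w                 # LMS prediction error
            w += lr * class_w[yi] * err * xi  # class-weighted LMS update
    return w

# Hypothetical imbalanced data: roughly 95% class 0, 5% class 1.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(1000, 3)), np.ones((1000, 1))])  # bias column
y = (X[:, 0] + rng.normal(0, 1, 1000) > 2.3).astype(float)
w = cost_sensitive_lms(X, y)
```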
Procedia PDF Downloads 532