Search results for: action based method
35939 Multiscale Modeling of Damage in Textile Composites
Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese
Abstract:
Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in the aerospace, automotive, and maritime industries. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the meso-scale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows; thus, their interaction and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites. For the meso-scale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method, and the other on an embedded semi-analytical approach. The goal will be the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility.
Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites
Procedia PDF Downloads 354
35938 Comparative Analysis of Glycated Hemoglobin (HbA1c) Between HPLC and Immunoturbidimetry Methods in Type II Diabetes Mellitus Patients
Authors: Intanri Kurniati, Raja Iqbal Mulya Harahap, Agustyas Tjiptaningrum, Reni Zuraida
Abstract:
Background: Diabetes mellitus is still increasing and has become a health and social burden in the world. It is known that glycation among various proteins is increased in diabetic patients compared with non-diabetic subjects. Some of these glycated proteins are suggested to be involved in the development and progression of chronic diabetic complications. Among them, glycated hemoglobin (HbA1c) is commonly used as the gold-standard index of glycemic control in the clinical setting. HbA1c testing has several methods, and the most commonly used is immunoturbidimetry. This research aimed to compare HbA1c levels between the immunoturbidimetry and HPLC methods in T2DM patients. Methods: This research involved 77 patients from Abd Muluk Hospital, Bandar Lampung; each patient was asked for consent, then underwent phlebotomy, and the samples were tested for HbA1c with the Turbidimetric Inhibition Immunoassay (TINIA) and High-Performance Liquid Chromatography (HPLC) methods. Result: The mean ± SD with the TINIA method was 9.2 ± 1.2, while the HbA1c level with the HPLC method was 9.6 ± 1.2. The t-test showed no significant difference between the groups (p > 0.05). It is proposed that the two methods have high agreement and that both are suitable for patient testing. Discussion: The absence of a significant difference among research subjects indicates that the high conformity of the two methods makes either suitable for monitoring patients clinically. Conclusion: HbA1c levels are elevated in patients with T2DM whether measured with the HPLC or the Turbidimetric Inhibition Immunoassay (TINIA) method, and there were no significant differences between those methods.
Keywords: diabetes mellitus, glycated albumin, HbA1c, HPLC, immunoturbidimetry
Procedia PDF Downloads 99
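A minimal sketch of the comparison described above, assuming a paired design (both assays run on the same 77 patients) and simulated values in place of the clinical data, which are not public:

```python
# Paired t-test between two HbA1c assays; the data are simulated stand-ins
# with the reported mean/SD, not the study's actual measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 77                                  # sample size reported in the abstract
tinia = rng.normal(9.2, 1.2, n)         # TINIA results (mean 9.2, SD 1.2)
hplc = tinia + rng.normal(0.0, 0.4, n)  # HPLC: agreement plus assay noise

t_stat, p_value = stats.ttest_rel(tinia, hplc)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# p > 0.05 would indicate no significant difference, i.e. the methods agree.
```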
35937 The Relevance of (Re)Designing Professional Paths with Unemployed Working-Age Adults
Authors: Ana Rodrigues, Maria Cadilhe, Filipa Ferreira, Claudia Pereira, Marta Santos
Abstract:
Professional paths must be understood in the multiplicity of their possible configurations. While some actors tend to represent their path as a harmonious succession of positions in the life cycle, most recognize the existence of unforeseen and uncontrollable bifurcations, caused, for example, by a work accident or by going through a period of unemployment. Considering the intensified challenges posed by ongoing societal changes (e.g., technological and demographic), and looking at the Portuguese context, where the unemployment rate continues to be most evident in certain age groups, such as individuals aged 45 years or over, it is essential to support those adults by providing strategies capable of sustaining them through professional transitions, this being a joint responsibility of governments, employers, workers, and educational institutions, among others. Concerned with these issues, Porto City Council launched the challenge of designing and implementing a Lifelong Career Guidance program, which was answered with the presentation of a customized conceptual and operational model: groWing|Lifelong Career Guidance. A pilot project targeting working-age adults (35 or older) who were unemployed was carried out, aiming to support them in reconstructing their professional paths through the recovery of their past experiences and through reflection on dimensions such as skills, interests, constraints, and the labor market. A research action approach was used to assess the proposed model, namely the perceived relevance of the theme and of the project, by the adults themselves (N=44), employment professionals (N=15), and local companies (N=15), in an integrated manner. A set of activities was carried out: a train-the-trainer course and a monitoring session with employment professionals; collective and individual sessions with adults, including a monitoring session as well; and a workshop with local companies. Support materials for individual and collective reflection about professional paths were created and adjusted for each agent involved. An evaluation model was co-built by the different stakeholders. Assessment was carried out through a form created for the purpose, completed at the end of the different activities, which allowed us to collect quantitative and qualitative data. Statistical analysis was carried out using SPSS software. Results showed that the participants, as well as the employment professionals and the companies involved, considered both the topic and the project extremely relevant. Also, the adults saw the project as an opportunity to reflect on their paths and become aware of the opportunities and the conditions necessary to achieve their goals; the professionals highlighted the support given by an integrated methodology and the existence of tools to assist the process; the companies valued the opportunity to think about the topic and the possible initiatives they could implement within the company to diversify their recruitment pool. The results allow us to conclude that, in the local context under study, there is an alignment between the different agents regarding the pertinence of supporting adults with work experience through professional transitions, with the project seen as a relevant strategy to address this issue, which justifies extending it in time and to other working-age adults in the future.
Keywords: professional paths, research action, turning points, lifelong career guidance, relevance
Procedia PDF Downloads 87
35936 Methodological Aspect of Emergy Accounting in Co-Production Branching Systems
Authors: Keshab Shrestha, Hung-Suck Park
Abstract:
Emergy accounting of systems networks is guided by a definite rule called 'emergy algebra'. Systems networks consist of two types of branching: co-product branching and split branching. The emergy accounting procedure differs between the two branching types. According to emergy algebra, each branch in co-product branching has a different transformity value, whereas the branches in split branching share the same transformity value. After the transformity value of each branch is determined, the emergy is calculated by multiplying it by the energy. The aim of this research is to solve the problems in determining transformity values in co-product branching through the introduction of a new methodology, the modified physical quantity method. Initially, the existing methodologies for emergy accounting in co-product branching are discussed; later, the modified physical quantity method is introduced with a case study of Eucalyptus pulp production. The existing emergy accounting methodologies for co-product branching carry incorrect interpretations and incorrect emergy calculations. The modified physical quantity method solves these problems of emergy accounting in co-product branching systems. The transformity value calculated for each branch is different and directly applicable in emergy calculations. The methodology also strictly follows the rules of emergy algebra. This new modified physical quantity methodology is a valid approach to emergy accounting, particularly in multi-production systems networks.
Keywords: co-product branching, emergy accounting, emergy algebra, modified physical quantity method, transformity value
Procedia PDF Downloads 292
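A minimal sketch of the emergy bookkeeping the abstract refers to, with hypothetical numbers (not values from the study): emergy is transformity times energy, and emergy algebra assigns different transformities to co-product branches but one shared transformity to split branches.

```python
# Emergy (sej) = transformity (sej/J) x energy (J); all figures illustrative.
def emergy(transformity_sej_per_J, energy_J):
    """Emergy carried by one branch of the network."""
    return transformity_sej_per_J * energy_J

# Co-product branching: branches get DIFFERENT transformities.
pulp = emergy(transformity_sej_per_J=6.6e4, energy_J=2.0e9)
bark_fuel = emergy(transformity_sej_per_J=1.2e4, energy_J=5.0e8)

# Split branching: branches share the SAME transformity.
split_a = emergy(4.0e4, 1.5e9)
split_b = emergy(4.0e4, 0.5e9)

print(f"pulp: {pulp:.3e} sej, bark: {bark_fuel:.3e} sej")
print(f"split A: {split_a:.3e} sej, split B: {split_b:.3e} sej")
```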
35935 A Real-Time Bayesian Decision-Support System for Predicting Suspect Vehicle’s Intended Target Using a Sparse Camera Network
Authors: Payam Mousavi, Andrew L. Stewart, Huiwen You, Aryeh F. G. Fayerman
Abstract:
We present a decision-support tool to assist an operator in the detection and tracking of a suspect vehicle traveling to an unknown target destination. Multiple data sources, such as traffic cameras, traffic information, weather, etc., are integrated and processed in real time to infer a suspect’s intended destination, chosen from a list of pre-determined high-value targets. Previously, we presented our work on the detection and tracking of vehicles using traffic and airborne cameras. Here, we focus on the fusion and processing of that information to predict a suspect’s behavior. The network of cameras is represented by a directional graph, where the edges correspond to direct road connections between the nodes and the edge weights are proportional to the average time it takes to travel from one node to another. For our experiments, we construct our graph based on the greater Los Angeles subset of the Caltrans “Performance Measurement System” (PeMS) dataset. We propose a Bayesian approach in which a posterior probability for each target is continuously updated based on detections of the suspect in the live video feeds. Additionally, we introduce the concept of ‘soft interventions’, inspired by the field of causal inference. Soft interventions are herein defined as interventions that do not immediately interfere with the suspect’s movements; rather, a soft intervention may induce the suspect into making a new decision, ultimately making their intent more transparent. For example, a soft intervention could be temporarily closing a road a few blocks from the suspect’s current location, which may require the suspect to change course. The objective of these interventions is to gain the maximum amount of information about the suspect’s intent in the shortest possible time. Our system currently operates in a human-on-the-loop mode, where at each step a set of recommendations is presented to the operator to aid decision-making. In principle, the system could operate autonomously, prompting the operator only for critical decisions, allowing it to scale up significantly to larger areas and multiple suspects. Once the intended target is identified with sufficient confidence, the vehicle is reported to the authorities for further action. Other recommendations include a selection of road closures, i.e., soft interventions, or continued monitoring. We evaluate the performance of the proposed system using simulated scenarios in which the suspect, starting at a random location, takes a noisy shortest path to their intended target. In all scenarios, the suspect’s intended target is unknown to our system. The decision thresholds are selected to maximize the chances of determining the suspect’s intended target in the minimum amount of time and with the smallest number of interventions. We conclude by discussing the limitations of our current approach to motivate a machine learning approach based on reinforcement learning, in order to relax some of the current limiting assumptions.
Keywords: autonomous surveillance, Bayesian reasoning, decision support, interventions, patterns of life, predictive analytics, predictive insights
Procedia PDF Downloads 115
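A minimal sketch of the Bayesian update described above, under simplifying assumptions: the camera network is a weighted directed graph (edge weights are travel times), and a suspect bound for a target prefers moves that shrink the remaining shortest-path time to it. The graph, node names, and noise model are illustrative, not taken from the PeMS data.

```python
import networkx as nx
import numpy as np

G = nx.DiGraph()
G.add_weighted_edges_from([("A", "B", 4), ("B", "C", 3), ("B", "D", 6),
                           ("C", "D", 4), ("C", "T1", 5), ("D", "T2", 2)])
targets = ["T1", "T2"]
posterior = {t: 1.0 / len(targets) for t in targets}   # uniform prior

def step_likelihood(u, v, target, beta=1.0):
    """Moves that reduce the remaining travel time to `target` are
    exponentially preferred (a simple noisy-shortest-path model)."""
    d_u = nx.shortest_path_length(G, u, target, weight="weight")
    d_v = nx.shortest_path_length(G, v, target, weight="weight")
    return np.exp(-beta * (d_v - d_u))

for u, v in [("A", "B"), ("B", "C")]:          # detections from live feeds
    for t in targets:
        posterior[t] *= step_likelihood(u, v, t)
    z = sum(posterior.values())                 # renormalize after each detection
    posterior = {t: p / z for t, p in posterior.items()}
    print(f"{u}->{v}: " + ", ".join(f"P({t})={p:.3f}" for t, p in posterior.items()))
```

After the suspect turns toward C, the posterior shifts toward T1, which is exactly the kind of evidence a soft intervention is meant to elicit sooner.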
35934 Sources of Occupational Stress among Teachers in Command Secondary Schools of the Nigerian Army
Authors: Mary Esere, Mogbekeloluwa Fakokunde, Adetoun Idowu
Abstract:
Background: Working in a military setting can introduce a considerable dose of stress into one's system because of the peculiar characteristics of the military environment. Thus, this study was carried out to find the sources of occupational stress among teachers in the various Command Secondary Schools within 2 Division of the Nigerian Army. Method: The study employed a survey method. A simple random sampling technique was used to select the schools in the Division. A total of 200 respondents participated in the study. The Sources of Teachers’ Occupational Stress Questionnaire (STOSQ) was administered to the respondents to collect relevant data. The t-test and Analysis of Variance (ANOVA) statistics were used to test the hypotheses. Findings: The study found that teachers in this setting do experience occupational stress. Their major sources of stress border on issues relating to salaries and allowances and staff welfare concerns. The findings also revealed no significant differences in the sources of occupational stress among the teachers with respect to gender and marital status. Discussion: Based on these findings, it was recommended that the Appropriate Superior Authority (ASA) should reconstitute the proscribed Armed Forces Schools Management Board (AFSMB), where issues such as staff salaries and welfare concerns for teachers working in the schools under the three services (Army, Navy, Air Force) will always be addressed. This will go a long way in enhancing the psychological well-being of the teachers.
Keywords: Nigerian army, occupational stress, sources, teachers
Procedia PDF Downloads 490
35933 Research on Pilot Sequence Design Method of Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing System Based on High Power Joint Criterion
Authors: Linyu Wang, Jiahui Ma, Jianhong Xiang, Hanyu Jiang
Abstract:
For pilot design under the sparse channel estimation model in Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing (MIMO-OFDM) systems, the observation matrices constructed according to the matrix cross-correlation criterion, the total correlation criterion, and other optimization criteria are not optimal, resulting in inaccurate channel estimation and a high bit error rate at the receiver. This paper proposes a pilot design method combining the high-power sum and high-power variance criteria, which can estimate the channel more accurately. First, the pilot insertion positions are designed according to the high-power variance criterion under the condition of equal power. Then, according to the high-power sum criterion, the pilot power allocation is converted into a cone programming problem, and the power allocation is carried out. Finally, the optimal pilot is determined by calculating the weighted sum of the high-power sum and the high-power variance. Compared with traditional pilots under the same conditions, the constructed MIMO-OFDM system using the optimal pilots for channel estimation obtains a gain of 6-7 dB in communication bit error rate performance.
Keywords: MIMO-OFDM, pilot optimization, compressed sensing, channel estimation
Procedia PDF Downloads 149
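A minimal, illustrative sketch in the spirit of the selection step above: candidate pilot patterns are scored by a weighted combination of a power-sum term and a spacing-variance term. This collapses the paper's two-stage procedure (position design, then cone-programming power allocation) into one toy search; the scoring functions, weights, and channel profile are stand-ins, not the authors' formulation.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n_subcarriers, n_pilots = 32, 4
channel_power = rng.gamma(2.0, 1.0, n_subcarriers)   # hypothetical per-subcarrier power

def score(positions, w=0.5):
    p = np.sort(np.asarray(positions))
    power_sum = channel_power[p].sum()           # stand-in "high-power sum" term
    spacing_var = np.var(np.diff(p))             # penalize uneven pilot spacing
    return w * power_sum - (1 - w) * spacing_var # weighted combination of criteria

best = max(combinations(range(n_subcarriers), n_pilots), key=score)
print("selected pilot positions:", best)
```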
35932 A Context Aware Mobile Learning System with a Cognitive Recommendation Engine
Authors: Jalal Maqbool, Gyu Myoung Lee
Abstract:
Using smart devices for context-aware mobile learning is becoming increasingly popular. This has led to mobile learning technology becoming an indispensable part of today’s learning environments and platforms. However, some fundamental issues remain: mobile learning still lacks the ability to truly understand human reactions and user behaviour. This is due to the fact that current mobile learning systems are passive and not aware of learners’ changing contextual situations; they rely on static information about mobile learners. In addition, current mobile learning platforms lack the capability to incorporate dynamic contextual situations into learners’ preferences. Thus, this thesis aims to address these issues by designing a context-aware framework which is able to sense a learner’s contextual situation, handle data dynamically, and use contextual information to suggest bespoke learning content according to the learner’s preferences. This is to be underpinned by a robust recommendation system with the capability to perform these functions, thus providing learners with a truly context-aware mobile learning experience, delivering learning content using smart devices and adapting to learning preferences as and when required. In addition, the design of the recommendation engine’s algorithm has to be based on learner and application needs, personal characteristics, and circumstances, as well as on comprehending human cognitive processes, which would enable the technology to interact effectively and deliver mobile learning content that is relevant to the learner’s contextual situation. The concept of this proposed project is to provide a new method of smart learning, based on a capable recommendation engine, for an intuitive mobile learning model driven by learner actions.
Keywords: aware, context, learning, mobile
Procedia PDF Downloads 245
35931 The Political Economy of Fiscal and Monetary Interactions in Brazil
Authors: Marcos Centurion-Vicencio
Abstract:
This study discusses the idea of ‘dominance’ in economic policy and its practical influence over monetary decisions. The discretionary use of repurchase agreements in Brazil over the period 2006-2016 and its effects on the overall price level are the specific issues we focus on. The set of in-depth interviews carried out with public servants at the Brazilian central bank and national treasury, alongside data collected from the Brazilian Institute of Geography and Statistics (IBGE), suggests that monetary and fiscal dominance do not differ in nature once the assumption of depoliticized central bankers is relaxed. In both regimes, the pursuit of private gains via public institutions affects price stability. While short-sighted politicians in the latter are at the origin of poor monetary decisions, the actions of short-sighted financial interest groups are likely to generate a similar outcome in the former. This study thus contributes to rethinking monetary policy theory as well as the nature of public borrowing.
Keywords: fiscal and monetary interactions, interest groups, monetary capture, public borrowing
Procedia PDF Downloads 135
35930 Critical Buckling Load of Carbon Nanotube with Non-Local Timoshenko Beam Using the Differential Transform Method
Authors: Tayeb Bensattalah, Mohamed Zidour, Mohamed Ait Amar Meziane, Tahar Hassaine Daouadji, Abdelouahed Tounsi
Abstract:
In this paper, the Differential Transform Method (DTM) is employed to predict and analyze the non-local critical buckling loads of carbon nanotubes with various end conditions, with the non-local Timoshenko beam described by a single differential equation. The differential equation governing buckling of the nanobeams is derived via a non-local theory, and the solution for the non-local critical buckling loads is found by the DTM. The DTM is introduced briefly; it can easily be applied to linear or nonlinear problems, and it reduces the size of the computational work. The influences of the boundary conditions, the chirality of the carbon nanotube, and the aspect ratio on the non-local critical buckling loads are studied and discussed, and the effects of the nonlocal parameter, the ratio L/d, the chirality of the single-walled carbon nanotube, and the boundary conditions on the buckling of the CNT are investigated.
Keywords: boundary conditions, buckling, non-local, differential transform method
Procedia PDF Downloads 301
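A minimal sketch of how the DTM turns a buckling equation into an algebraic recurrence, shown here for the classical local pinned-pinned column y'' + λy = 0 (λ = P/EI) rather than the paper's non-local Timoshenko equation; the critical load is the first λ for which the series solution satisfies y(L) = 0. Values are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

L, N = 1.0, 60  # column length and number of DTM series terms

def y_at_L(lam):
    # DTM recurrence from y'' + lam*y = 0:
    #   (k+1)(k+2) Y[k+2] + lam*Y[k] = 0
    Y = np.zeros(N)
    Y[0], Y[1] = 0.0, 1.0              # y(0)=0; slope normalized to 1
    for k in range(N - 2):
        Y[k + 2] = -lam * Y[k] / ((k + 1) * (k + 2))
    return sum(Y[k] * L**k for k in range(N))  # y(L); vanishes at buckling

lam_cr = brentq(y_at_L, 1.0, 15.0)     # first root, bracketed near pi^2
print(f"DTM:   lam_cr = {lam_cr:.6f}")
print(f"Exact: pi^2/L^2 = {np.pi**2 / L**2:.6f}")  # Euler load over EI
```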
35929 Patent on Brain: Brain Waves Stimulation
Authors: Jalil Qoulizadeh, Hasan Sadeghi
Abstract:
Brain waves are electrical wave patterns produced in the human brain. Knowing these waves and activating them can have a positive effect on brain function and ultimately help create an ideal life. The brain is able to produce waves from 0.1 Hz to above 65 Hz, and the Beta One device is said to produce exactly the waves the brain produces. The function and method of this device are based on magnetic stimulation of the brain. The technology used in the design and production of this device strengthens and improves the frequencies of brain waves with a pre-defined algorithm according to the type of requested function, so that the person can perform better in everyday activities. To evaluate the effect of the field created by the device on neurons and their stimulation, electroencephalography was conducted before and after stimulation, and the two baselines were compared by quantitative electroencephalography (qEEG) using a paired t-test in 39 subjects. This confirms a significant effect of the field on the change in electrical activity recorded after 30 minutes of stimulation in all subjects. The Beta One device is able to induce the appropriate pattern of the expected functions softly and effectively, exactly in accordance with the harmony of brain waves, guiding brain activity first to a normal state and then to a powerful one. This enables the production of inexpensive neuroscience equipment (compared to existing rTMS equipment) for magnetic brain stimulation in clinics, homes, factories and companies, and professional sports clubs.
Keywords: stimulation, brain, waves, betaOne
Procedia PDF Downloads 81
35928 Numerical Study of Wettability on Triangular Micro-Pillared Surfaces Using the Lattice Boltzmann Method
Authors: Ganesh Meshram, Gloria Biswal
Abstract:
In this study, we present a numerical investigation of surface wettability on triangular micro-pillared surfaces using a two-dimensional (2D) pseudo-potential multiphase lattice Boltzmann method with a D2Q9 model, for interaction parameters ranging from -1.40 to -2.50. Initially, a simulation of the equilibrium state of a water droplet on a flat surface is considered for various interaction parameters to examine the accuracy of the present numerical model. We then impose micro-scale pillars on the bottom wall of the surface, with different pillar heights, to form hydrophobic and superhydrophobic surfaces that enable higher contact angles. The wettability of the surfaces is simulated with water droplets of radius 100 lattice units in a domain of 800x800 lattice units. The present study shows that increasing the interaction parameter of the pillared hydrophobic surfaces dramatically reduces the contact area between water droplets and solid walls due to the momentum redirection phenomenon. Contact angles for different values of interaction strength have been validated qualitatively against analytical results.
Keywords: contact angle, lattice Boltzmann method, D2Q9 model, pseudo-potential multiphase method, hydrophobic surfaces, Wenzel state, Cassie-Baxter state, wettability
Procedia PDF Downloads 69
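A minimal sketch of the pseudo-potential (Shan-Chen) ingredients behind the D2Q9 model used above: the nine lattice velocities and weights, and the interparticle force computed from a density-dependent potential ψ. The grid, density field, ψ choice, and G value are illustrative; a full simulation adds the streaming and collision steps.

```python
import numpy as np

# D2Q9 lattice velocities and weights
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def shan_chen_force(rho, G=-1.8):
    """F(x) = -G * psi(x) * sum_i w_i * psi(x + e_i) * e_i on a periodic grid."""
    psi = 1.0 - np.exp(-rho)                 # a common pseudo-potential choice
    F = np.zeros(rho.shape + (2,))
    for wi, ei in zip(w, e):
        # psi at the neighbour reached by lattice vector e_i (periodic wrap)
        psi_nb = np.roll(psi, shift=(-ei[0], -ei[1]), axis=(0, 1))
        F += wi * psi_nb[..., None] * ei
    return -G * psi[..., None] * F

rho = np.ones((64, 64)); rho[28:36, 28:36] = 2.0   # dense blob in light vapour
F = shan_chen_force(rho)
print("max |F| =", np.abs(F).max())                # nonzero only near the interface
```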
35927 Measuring the Height of a Person in Closed Circuit Television Video Footage Using 3D Human Body Model
Authors: Dojoon Jung, Kiwoong Moon, Joong Lee
Abstract:
The height of criminals is one of the important clues that can determine the scope of a suspect search or exclude a suspect from the search. Although measuring a criminal's height from video alone is limited for various reasons, if the 3D data of the scene are matched with the Closed Circuit Television (CCTV) footage, the criminal's height can be measured. However, it is still difficult to measure height from CCTV footage with such non-contact measurement methods because of variables such as the position, posture, and head shape of the criminal. In this paper, we propose a method of matching the CCTV footage with the 3D data of the crime scene and measuring the height of the person using a 3D human body model in the matched data. In the proposed method, the height is measured using the 3D human model in various scenes of the person in the CCTV footage, and the measurement value of the target person is corrected by the measurement error obtained from replayed CCTV footage of a reference person. We tested the method on walking CCTV footage of 20 people captured indoors and outdoors and corrected the measurement values using 5 reference persons. Experimental results show that the average measurement error (true value minus measured value) is 0.45 cm, and that this method is effective for measuring a person's height in CCTV footage.
Keywords: human height, CCTV footage, 2D/3D matching, 3D human body model
Procedia PDF Downloads 248
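A minimal sketch of the reference-person correction step described above, with hypothetical heights: the systematic error measured on reference persons of known height is used to correct the target's measured value.

```python
import numpy as np

# Reference persons: known true heights vs. heights measured from footage (cm)
ref_true = np.array([172.0, 168.5, 181.0, 165.0, 176.5])
ref_measured = np.array([170.8, 167.6, 179.7, 164.1, 175.2])

bias = np.mean(ref_true - ref_measured)   # systematic error of this camera setup
target_measured = 174.3                   # target person, from 3D-matched footage
target_corrected = target_measured + bias

print(f"bias = {bias:+.2f} cm -> corrected height = {target_corrected:.1f} cm")
```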
35926 Using Sandplay Therapy to Assess Psychological Resilience
Authors: Dan Wang
Abstract:
Sandplay therapy is a Jungian psychological therapy developed by Dora Kalff in 1956. In sandplay therapy, the client first makes a sandtray with various miniatures and then communicates with the therapist based on the sandtray. This special method gives sandplay therapy great assessment potential. Given that the core treatment hypothesis of sandplay therapy, the self-healing power, is very similar to resilience, this study tries to use sandplay to evaluate psychological resilience. Participants were 107 undergraduates recruited from three public universities in China, who were asked to make an initial sandtray and to complete the Ego-Resiliency Scale (ER89). First, a 28-category General Sandtray Coding Manual (GSCM) was developed based on the literature on sandplay therapy. Next, the GSCM was used to code the 107 initial sandtrays, and correlation and regression analyses were conducted between all GSCM categories and ER89. Results show that three categories of sandplay (i.e., vitality, water types, and relationships) account for 36.6% of the variance of ego-resilience and form the four-point Likert-type Sandtray Projective Test of Resilience (SPTR). Finally, it was found that the SPTR dimensions and total score all have good inter-rater reliability, ranging from 0.89 to 0.93. This study provides an alternative approach to measuring psychological resilience and can help to guide clinical social work.
Keywords: sandplay therapy, psychological resilience, measurement, college students
Procedia PDF Downloads 256
35925 An Evaluative Study of Services Provided in Community Based Rehabilitation Centres in Jordan
Authors: Wesam Darawsheh
Abstract:
Purpose: There is an absence of studies evaluating the effectiveness of Community Based Rehabilitation (CBR) programs in Jordan. This research study is aimed at investigating the effectiveness of the services of CBR programmes in Jordan. Method: An anonymized questionnaire survey was carried out with forty-seven participants (stakeholders and volunteers) from four CBR centres in Jordan. It comprised eighteen closed- and open-ended questions that collected both qualitative and quantitative data. The survey assessed participants’ knowledge of CBR and perception of the effectiveness of the services provided. The quantitative data were analyzed using SPSS Version 22.0 (2016, IBM Corporation, New York). Qualitative data were analyzed through thematic content analysis and open coding to identify emergent themes. Results: The ROC curve revealed the AUC for the questions of the survey to be 0.846, which indicated good specificity and sensitivity of the questions. The MANOVA revealed insignificant effects of the CBR site (p = 0.157) and the participants' level of education (p = 0.549) on the perception of the effectiveness of CBR services. There were insignificant differences between the scores of PWDs and volunteers (p = 0.781). 40.4% evaluated the effectiveness of CBR services as low. This mainly stemmed from the lack of effort by the CBR programmes to raise the local community's knowledge about CBR, disability, and the role toward PWDs. Conclusions: A speculation on the priorities of CBR programmes in Jordan is offered: efforts need to be directed at promoting the livelihood and empowerment components in order to actualize the three main principles of CBR, mainly by promoting multisectoral collaboration as a way of operation.
Keywords: community based rehabilitation (CBR), people with disabilities (PWDs), CBR centres, rehabilitation services, Jordan, mixed-methods, evaluative study
Procedia PDF Downloads 253
35924 A Hybrid System for Borehole Soil Samples
Authors: Ali Ulvi Uzer
Abstract:
Data reduction is an important topic in the field of pattern recognition applications. The basic concept is the reduction of multitudinous amounts of data down to their meaningful parts. The Principal Component Analysis (PCA) method is frequently used for data reduction. The Support Vector Machine (SVM) method is a discriminative classifier formally defined by a separating hyperplane: given labeled training data, the algorithm outputs an optimal hyperplane which categorizes new examples. This study offers a hybrid approach that uses PCA for data reduction and Support Vector Machines (SVM) for classification. To assess the accuracy of the suggested system, soil samples taken from two boreholes were used. The classification accuracies for this dataset were obtained using the ten-fold cross-validation method. As the results suggest, this system, which operates through dimensionality reduction, is feasible for faster recognition of the dataset, so our study results appear to be very promising.
Keywords: feature selection, sequential forward selection, support vector machines, soil sample
Procedia PDF Downloads 455
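A minimal sketch of the hybrid PCA + SVM pipeline with ten-fold cross-validation, as described above; synthetic features stand in for the borehole soil measurements, and the component count and SVM settings are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=30, n_informative=8,
                           random_state=0)          # stand-in borehole data

clf = make_pipeline(StandardScaler(),
                    PCA(n_components=8),            # data reduction step
                    SVC(kernel="rbf", C=1.0))       # separating-hyperplane classifier

scores = cross_val_score(clf, X, y, cv=10)          # ten-fold cross-validation
print(f"accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```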
35923 The Impact of Digital Inclusive Finance on the High-Quality Development of China's Export Trade
Authors: Yao Wu
Abstract:
In the context of financial globalization, China has put forward the policy goal of high-quality development, and the digital economy, with its advantage in information resources, is driving China's export trade toward high-quality development. Given the long-standing financing constraints of small and medium-sized export enterprises, expanding the export scale of small and medium-sized enterprises has become a major threshold for the development of China's export trade. This paper first adopts the hierarchical analysis method to establish an evaluation system for the high-quality development of China's export trade. Second, panel data for 30 provinces in China from 2011 to 2018 are selected for empirical analysis to establish a model of the impact of digital inclusive finance on the high-quality development of China's export trade. Based on an analysis of the heterogeneous-firm trade model, a mediating effect model is established to verify the mediating role of credit constraints in the high-quality development of China's export trade. Based on the above analysis, this paper concludes that digital inclusive finance, with its unique digital and inclusive nature, alleviates the credit constraint problem among SMEs, enhances the binary (extensive and intensive) margins of SMEs' exports, optimizes their export scale and structure, and promotes the high-quality development of regional and even national export trade. Finally, based on these findings, we propose insights and suggestions for digital inclusive finance to promote the high-quality development of export trade.
Keywords: digital inclusive finance, high-quality development of export trade, fixed effects, binary marginal effects
Procedia PDF Downloads 93
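A minimal sketch of the mediation logic described above (digital inclusive finance, eased credit constraints, then export quality), in the spirit of a three-step mediation test on synthetic data; the variable names, coefficients, and simple OLS setup are hypothetical stand-ins for the provincial panel model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 240                                       # ~30 provinces x 8 years
difi = rng.normal(size=n)                     # digital inclusive finance index
credit = 0.6 * difi + rng.normal(scale=0.8, size=n)   # eased credit constraint
export_q = 0.3 * difi + 0.5 * credit + rng.normal(scale=0.8, size=n)

X1 = sm.add_constant(difi)
step1 = sm.OLS(export_q, X1).fit()            # total effect of DIFI
step2 = sm.OLS(credit, X1).fit()              # DIFI -> mediator
X3 = sm.add_constant(np.column_stack([difi, credit]))
step3 = sm.OLS(export_q, X3).fit()            # direct effect, mediator controlled

print("total effect:    ", round(step1.params[1], 3))
print("a path (DIFI->M):", round(step2.params[1], 3))
print("direct effect:   ", round(step3.params[1], 3))  # smaller => mediation
```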
35922 Data Quality Enhancement with String Length Distribution
Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda
Abstract:
Recently, the amount of collectable manufacturing data has been increasing rapidly. At the same time, large-scale recalls are becoming a serious social problem. Under such circumstances, there is a growing need to prevent such recalls through defect analysis, such as root cause analysis and anomaly detection, utilizing manufacturing data. However, the time needed to classify strings in manufacturing data by traditional methods is too long to meet the requirements of quick defect analysis. Therefore, we present the String Length Distribution Classification method (SLDC) to classify strings correctly in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in BOMs and asset lists. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, the requirement of quick defect analysis can be fulfilled.
Keywords: string classification, data quality, feature selection, probability distribution, string length
Procedia PDF Downloads 318
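A minimal sketch of classification by string length distribution, in the spirit of SLDC: each known field yields an empirical length histogram, and an unknown string is assigned to the field under which its length is most likely. The training strings are invented examples, not real manufacturing data.

```python
from collections import Counter

train = {
    "product_id": ["P-10482", "P-99315", "P-20831", "P-55902"],
    "machine_id": ["M07", "M12", "M03", "M44", "M09"],
}

def length_dist(strings, smoothing=1e-6):
    """Return a function giving the empirical probability of a string's length."""
    counts = Counter(len(s) for s in strings)
    total = sum(counts.values())
    return lambda s: counts.get(len(s), 0) / total + smoothing

models = {field: length_dist(strs) for field, strs in train.items()}

def classify(s):
    # Assign to the field whose length distribution makes `s` most likely
    return max(models, key=lambda field: models[field](s))

print(classify("P-31177"))   # -> product_id (length 7)
print(classify("M21"))       # -> machine_id (length 3)
```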
35921 Environmental Quality On-Line Monitoring Based on Enterprise Resource Planning in the Implementation of ISO 14001:2004
Authors: Ahmad Badawi Saluy
Abstract:
This study aims to develop strategies for the prevention or elimination of environmental pollution, as well as for changes in external environmental variables, in order to implement the ISO 14001:2004 environmental management system by integrating the analysis of environmental issues data, RKL-RPL transactional data, and regulations as part of an ERP on the management dashboard. This research uses a quantitative descriptive approach, comparing measurements against the air quality standards (PP 42/1999, LH 21/2008), water quality standards (Permenkes RI 416/1990, KepmenLH 51/2004, KepmenLH 55/2013), and biodiversity indicators. Based on the research, the parameters of RPL monitoring have been identified; among others, the quality of air emissions (SO₂, NO₂, dust, particulate) is influenced by fuel quality, combustion performance in the combustor, and development changes around the generating area, while water quality (TSS, TDS) increased due to the cooling-intake water flow carrying sedimentation from the Banjir Kanal Timur. Compliance with the ISO 14001:2004 clause on application design contributes significantly to improving the quality of power plant management.
Keywords: environmental management systems, power plant management, regulatory compliance, enterprise resource planning
Procedia PDF Downloads 179
35920 Study on Disaster Prevention Plan for an Electronic Industry in Thailand
Authors: S. Pullteap, M. Pathomsuriyaporn
Abstract:
In this article, employees' opinions on the factors that affect the flood prevention and corrective action plan in an electronics industry, Sharp Manufacturing (Thailand) Co., Ltd., have been investigated. Survey data from 175 workers and supervisors were selected for analysis. The results show that employees emphasize the need for a subsidy at the time of a disaster at a high level of 77.8%, while the plan focusing on flood prevention for rehabilitation equipment is valued at an intermediate level of 79.8%. Hypothesis testing found that education level affects the needs factor at the time of a flood disaster. Moreover, most respondents give priority to the flood disaster risk management factor. Consequently, we found that the flood prevention plan is valued at a high level, especially for information monitoring, which reached 93.4% among supervisors. Respondents largely assume, at up to 80%, that flooding will have impacts on the industry; thus the focus on flood management plans is substantial.
Keywords: flood prevention plan, flood event, electronic industrial plant, disaster, risk management
Procedia PDF Downloads 327
35919 Energy Efficiency in Hot Arid Climates: Code Compliance and Enforcement for Residential Buildings
Authors: Mohamed Edesy, Carlo Cecere
Abstract:
This paper is part of ongoing research that proposes energy strategies for residential buildings in hot arid climates. In Egypt, the residential sector is dominated by annually increasing consumption rates. A building energy efficiency code was introduced by the government in 2005; it specifies minimum design and application requirements for residential buildings. Submission is mandatory and should lead to about 20% energy savings with an increase in comfort levels. However, compliance is almost nonexistent, electricity is subsidized, and incentives to adopt energy-efficient patterns are very low. This work presents an overview of the code and analyzes the impact of its introduction on different sectors. It analyzes compliance barriers and indicates challenges that stand in the way of realistic enforcement. It proposes an action plan for immediate code enforcement, updating the current code to include retrofit, and developing rating systems for buildings. This work presents a broad national plan for energy efficiency empowerment in the residential sector.
Keywords: energy efficiency, housing, energy policies, code enforcement
Procedia PDF Downloads 347
35918 Sensitivity Analysis of Movable Bed Roughness Formula in Sandy Rivers
Authors: Mehdi Fuladipanah
Abstract:
Sensitivity analysis is a technique applied to determine the influence of input factors on model output. Variance-based sensitivity analysis methods are more widely applied than other methods because they cover both linear and non-linear models. In this paper, van Rijn’s movable bed roughness formula was selected for evaluation because of its reasonable results in sandy rivers. This equation contains four variables: flow depth, sediment size, bed form height, and bed form length. The importance of these variables was determined using the first-order Fourier Amplitude Sensitivity Test (FAST). A sensitivity index was applied to evaluate the importance of the factors; a higher index indicates a more important variable. The first-order FAST sensitivity indices explain 90% of the total variance, which meets the acceptance criterion for applying FAST. Results show that bed form height, bed form length, sediment size, and flow depth are the most influential factors, with sensitivity indices of 32%, 24%, 19%, and 15%, respectively.
Keywords: sensitivity analysis, variance, movable bed roughness formula, sandy rivers
Procedia PDF Downloads 261
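A minimal sketch of a first-order FAST analysis using the SALib library (assuming its `fast_sampler`/`fast` API). The model below is a simplified bed-form roughness expression standing in for van Rijn's formula, and the variable bounds are illustrative, not the study's values.

```python
import numpy as np
from SALib.sample import fast_sampler
from SALib.analyze import fast

problem = {
    "num_vars": 4,
    "names": ["flow_depth", "d50", "bedform_height", "bedform_length"],
    "bounds": [[0.5, 5.0], [0.0002, 0.002], [0.05, 0.5], [1.0, 10.0]],
}

X = fast_sampler.sample(problem, 1000)
h, d50, delta, lam = X.T
# Stand-in roughness model: k_s grows with grain size and bed-form steepness
Y = 3 * d50 + 1.1 * delta * (1 - np.exp(-25 * delta / lam)) + 0.01 * h

Si = fast.analyze(problem, Y)
for name, s1 in zip(problem["names"], Si["S1"]):
    print(f"{name:15s} S1 = {s1:.3f}")   # first-order sensitivity indices
```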
35917 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and the study and utilization of the inner thermal core structure characteristics of tropical cyclones still pose challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse- and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. First, the thermal core information in the pressure direction is comprehensively expressed through the maximum intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. These images provide a coarse-grained wind speed range estimate in the first stage. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and fused with the coarse-grained features from the first stage to obtain the final two-stage wind speed estimate. Furthermore, to better capture the long-tail distribution characteristics of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and ordinal regression loss is adopted in the second stage to replace traditional single-value regression. The selected tropical cyclones span 2012 to 2021 and are distributed in the North Atlantic (NA) region. The training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021. Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclones into three major classes: pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA. The results indicate that thermal core data can effectively represent the category and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes with these data.
Keywords: artificial intelligence, deep learning, data mining, remote sensing
Procedia PDF Downloads 63
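A minimal sketch of the maximum intensity projection (MIP) step described above: a brightness-temperature volume is collapsed along the pressure axis into a 2D image that summarizes the warm core. The array shape and values are synthetic stand-ins for ATMS retrievals.

```python
import numpy as np

rng = np.random.default_rng(0)
# (pressure_levels, lat, lon) brightness-temperature anomaly volume
tb_anomaly = rng.normal(0.0, 0.5, size=(22, 64, 64))
yy, xx = np.mgrid[0:64, 0:64]
warm_core = 6.0 * np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 50.0)
tb_anomaly[8:14] += warm_core              # warm anomaly in mid/upper levels

mip = tb_anomaly.max(axis=0)               # project along the pressure axis
print("peak of MIP image:", round(float(mip.max()), 2), "K at",
      np.unravel_index(mip.argmax(), mip.shape))
```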
35916 Primary and Secondary Big Bangs Theory of Creation of Universe
Authors: Shyam Sunder Gupta
Abstract:
The current theory for the creation of the universe, the Big Bang theory, is widely accepted but leaves some questions unanswered. It does not explain the origin of the singularity or what causes the Big Bang. It also does not explain why there is such a huge amount of dark energy and dark matter in our universe. There is also the question of whether there is one universe or multiple universes, which needs to be answered. This research addresses these questions using the Bhagvat Puran and other Vedic scriptures as its basis. There is a Unique Pure Energy Field that is eternal, infinite, and the finest of all, and it never transforms while in its original form. The carrier particles of this Unique Pure Energy are Param-anus, the Fundamental Energy Particles. Param-anus and combinations of these particles create bigger particles from which the universe gets created. For creation to initiate, the Unique Pure Energy is represented in three phases: positive phase energy, neutral phase eternal time energy, and negative phase energy. The positive phase energy further expands into three forms of creative energies (CE1, CE2, and CE3). From CE1 energy, three energy modes were created: the mode of activation, the mode of action, and the mode of darkness. From these three modes, 16 Principles, the subtlest forms of energies, get created: Pradhan, Mahat-tattva, Time, Ego, Intellect, Mind, Sound, Space, Touch, Air, Form, Fire, Taste, Water, Smell, and Earth. In the Mahat-tattva, dominated by the Mode of Darkness, CE1 energy creates innumerable primary singularities from seven principles: Pradhan, Mahat-tattva, Ego, Sky, Air, Fire, and Water. CE1 energy gets divided as CE2 and enters, along with the three modes and time, into each singularity; the primary Big Bang takes place, and innumerable Invisible Universes get created. Each Universe has seven coverings of 7 principles, and each layer is 10 times thicker than the previous layer. Through energy CE2, the space in each Invisible Universe under the coverings is divided into two halves. In the lower half, the process of evolution gets initiated, and the seeds of 24 elements get created, out of which 5 fundamental elements (Sky, Air, Fire, Water, and Earth), the building blocks of matter, create the seeds of stars, planets, galaxies, and all other matter. Since the 5 fundamental elements get created out of the mode of darkness, this explains why there is so much dark energy and dark matter in our universe. This process of creation in the lower half of the Invisible Universe continues for 2.16 billion years. Further, in the lower part of the energy field, exactly at the centre of the Invisible Universe, a secondary singularity is created, through which, by the force of the Mode of Action, the secondary Big Bang takes place, and the Visible Universe gets created in the shape of a lotus flower, expanding into the upper part. Visible matter starts appearing after a gap of 360,000 years. Within the Visible Universe, a small part gets created, known as the Phenomenal Material World, which is our solar system, with the sun at the centre. The diameter of the solar planetary system is 6.4 billion km.
Keywords: invisible universe, phenomenal material world, primary Big Bang, secondary Big Bang, singularities, visible universe
Procedia PDF Downloads 90
35915 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism
Authors: Lubos Rojka
Abstract:
The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are not determined by natural events but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person’s morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories of moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. The compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna will be inserted into the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane, and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities and a real possibility to choose otherwise, at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The ability of theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among them an effective free will together with first- and second-order desires. Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory, and emergentism. It is incompatible with physical causal determinism, because such determinism only allows non-systematic processes that may be hard to predict, but not complex (strongly) emergent systems. An agent’s effective will and conscious reflectivity is the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction for the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense because it would exclude the emergence of, and supervenience by, the essential complex properties of human consciousness.
Keywords: consciousness, free will, determinism, emergence, moral responsibility
Procedia PDF Downloads 164
35914 Teaching Non-Euclidean Geometries to Learn Euclidean One: An Experimental Study
Authors: Silvia Benvenuti, Alessandra Cardinali
Abstract:
In recent years, for instance in relation to the COVID-19 pandemic and the evidence of climate change, it has become quite clear that the development of a young person into an adult citizen requires a solid scientific background. Citizens are required to exert logical thinking and to know the methods of science in order to adapt, understand, and develop as persons. Mathematics sits at the core of these required skills: learning the axiomatic method is fundamental to understanding how the hard sciences work and helps consolidate logical thinking, which will be useful throughout a student's life. At the same time, research shows that the axiomatic study of geometry is a problematic topic for students, even for those with an interest in mathematics. With this in mind, the main goals of the research work we will describe are: (1) to show whether non-Euclidean geometries can be a tool that allows students to consolidate their knowledge of Euclidean geometry by developing it in a critical way; (2) to promote the understanding of the modern axiomatic method in geometry; (3) to give students a new perspective on mathematics, so that they can see it as a creative activity and a widely discussed topic with a historical background. One of the main issues related to the state of the art in this topic is the shortage of experimental studies with students. For this reason, our aim is to provide further experimental evidence of the potential benefits of teaching non-Euclidean geometries in high school, based on data collected from a study started in 2005 in the frame of the Italian national Piano Lauree Scientifiche, continued in a teacher training course organized in September 2018, refined in a pilot study that involved 77 high school students during the school years 2018-2019 and 2019-2020, and finally implemented through an experimental study conducted in 2020-21 with 87 high school students. Our study shows that there is potential for further research to challenge current conceptions of the school mathematics curriculum and of the capabilities of high school mathematics students.
Keywords: non-Euclidean geometries, beliefs about mathematics, questionnaires, modern axiomatic method
Procedia PDF Downloads 75
35913 From Mathematics Project-Based Learning to Commercial Product Using Geometer’s Sketchpad (GSP)
Authors: Krongthong Khairiree
Abstract:
The purpose of this research study is to explore the mathematics project-based learning approach and the use of technology in the context of school mathematics in Thailand. Data for the study were collected from 6 sample secondary schools, with students 6-14 years old. Research findings show that through the mathematics project-based learning approach and the use of GSP, students were able to make mathematics learning fun and challenging. In interviews, the students revealed that, with GSP, they were able to visualize and create graphical representations, which enabled them to develop their mathematical thinking skills, concepts, and understanding. The students had fun creating a variety of graphs of functions that they could not produce by drawing on graph paper. In addition, there is evidence of the students' ability to connect mathematics to real life outside the classroom and to commercial products, such as weaving, broomstick patterning, and ceramics design.
Keywords: mathematics, project-based learning, Geometer’s Sketchpad (GSP), commercial products
Procedia PDF Downloads 336
35912 Effect of Heat Treatment on Nutrients, Bioactive Contents and Biological Activities of Red Beet (Beta vulgaris L.)
Authors: Amessis-Ouchemoukh Nadia, Salhi Rim, Ouchemoukh Salim, Ayad Rabha, Sadou Dyhia, Guenaoui Nawel, Hamouche Sara, Madani Khodir
Abstract:
The cooking method is a key factor influencing the quality of vegetables. In this study, the effect of the most common cooking methods on the nutritional composition, phenolic content, pigment content, and antioxidant activities (evaluated by the DPPH, ABTS, CUPRAC, FRAP, reducing power, and phosphomolybdenum methods) of fresh, steamed, and boiled red beet was investigated. The fresh samples showed the highest nutritional and bioactive composition compared to the cooked ones. The boiling method led to a significant reduction (p < 0.05) in the content of phenolics, flavonoids, and flavanols and in the DPPH, ABTS, FRAP, CUPRAC, phosphomolybdenum, and reducing power capacities. This effect was less pronounced when steam cooking was used, and the losses of bioactive compounds were lower. As a result, steam cooking resulted in greater retention of bioactive compounds and antioxidant activity compared to boiling. Overall, this study suggests that steam cooking is the better method in terms of retention of pigments, bioactive compounds, and antioxidant activity in beetroot.
Keywords: Beta vulgaris, cooking methods, bioactive compounds, antioxidant activities
Procedia PDF Downloads 61
35911 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi-Agent Approach
Authors: M. Taheri Tehrani, H. Ajorloo
Abstract:
In this paper, an intelligent multi-agent framework is developed for each router, in which agents positioned at the ports of the router have two vital functionalities: traffic shaping and buffer allocation. With the traffic-shaping functionality, agents shape forwarded traffic by dynamically allocating, in real time, the token generation rate of a Token Bucket algorithm; with the buffer-allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework allows some ports to work better under burst and busier conditions. The agents act based on a Reinforcement Learning (RL) algorithm and consider the relevant parameters in their decision process. As RL is limited in how many parameters it can consider in its decision process due to the volume of calculations, we utilize our novel method, which applies Principal Component Analysis (PCA) to the RL inputs and gives the algorithm the high-dimensional ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing or dynamic allocation of buffer size for each port, lower packet drop across the whole network, specifically in the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to better compare the performance metrics. The results obtained from this simulation environment show an efficient and dynamic utilization of resources in terms of bandwidth and the buffer capacities pre-allocated to each port.
Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems
Procedia PDF Downloads 518
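A minimal sketch of the Token Bucket shaper the agents tune: tokens accrue at `rate` up to `capacity`, and a packet is forwarded only if enough tokens are available; an agent's action amounts to adjusting `rate` at runtime. Parameters and the packet trace are illustrative.

```python
import time

class TokenBucket:
    def __init__(self, rate_tokens_per_s, capacity):
        self.rate = rate_tokens_per_s      # the knob the RL agent adjusts
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_size):
        now = time.monotonic()
        # Refill tokens for the elapsed interval, capped at bucket capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size     # conforming packet: forward it
            return True
        return False                       # non-conforming: queue or drop

bucket = TokenBucket(rate_tokens_per_s=1000, capacity=1500)
for size in [500, 400, 800, 300]:          # bytes, arriving back-to-back
    print(size, "->", "forward" if bucket.allow(size) else "drop/queue")
```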
35910 Torque Loss Prediction Test Method of Bolted Joints in Heavy Commercial Vehicles
Authors: Volkan Ayik
Abstract:
Loosening as a result of torque loss in bolted joints is one of the most frequently encountered problems resulting in loss of connection between parts. The main reason for this is the dynamic loads to which the joints are subjected while the vehicle is moving. In particular, vibration-induced loads can loosen joints of any size and geometry. The aim of this study is to develop an improved method for estimating the road-induced vibration performance of the bolted joints of components connected to the chassis of heavy commercial vehicles, before conducting prototype-level vehicle structural strength tests on a proving ground. The frequencies and displacements caused by road-induced vibration loads were determined for the parts connected to the chassis, and various experimental design scenarios were formed by matching specific components and vibration behaviors. In the studies, the influence of the torque, washer, test displacement, and test frequency parameters was observed while maintaining the connection characteristics on the vehicle, and the sensitivity ratios for these variables were calculated. Based on these experimental design findings, tests were performed on a device developed from Junker's vibration test rig, and the correlation levels between proving-ground conditions and the device tests were determined.
Keywords: bolted joints, Junker’s test, loosening failure, torque loss
Procedia PDF Downloads 124