Search results for: panel data method
35603 Optical Fiber Data Throughput in a Quantum Communication System
Authors: Arash Kosari, Ali Araghi
Abstract:
A mathematical model of an optical-fiber communication channel is developed, which results in an expression for the throughput and loss of the corresponding link. The data are assumed to be transmitted using separate photons with different polarizations. The derived model also shows the dependency of data throughput on the length of the channel and the depolarization factor. It is observed that absorption of photons affects the throughput more strongly than depolarization does. In addition, the probabilities of depolarization and of absorption of the radiated photons are obtained.
Keywords: absorption, data throughput, depolarization, optical fiber
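To make the dependence on channel length concrete, the link budget described above can be sketched numerically. The sketch below assumes simple exponential (Beer-Lambert) absorption plus an independent per-kilometre depolarization probability; the function name and all parameter values are hypothetical, not taken from the paper:

```python
def link_throughput(rate_in, length_km, alpha_db_per_km=0.2, p_depol_per_km=0.01):
    """Estimate the received single-photon rate over a fibre link.

    Photons are lost to absorption (exponential in length) and discarded
    when depolarization corrupts their encoded polarization state.
    """
    survive_absorption = 10 ** (-alpha_db_per_km * length_km / 10)
    survive_depolarization = (1 - p_depol_per_km) ** length_km
    return rate_in * survive_absorption * survive_depolarization

# Throughput decays monotonically with channel length
r10 = link_throughput(1e6, 10)
r50 = link_throughput(1e6, 50)
```

Varying `alpha_db_per_km` against `p_depol_per_km` in such a sketch shows why absorption, entering through the same exponential, can dominate the depolarization term.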
Procedia PDF Downloads 286
35602 Development of a Telemedical Network Supporting an Automated Flow Cytometric Analysis for the Clinical Follow-up of Leukaemia
Authors: Claude Takenga, Rolf-Dietrich Berndt, Erling Si, Markus Diem, Guohui Qiao, Melanie Gau, Michael Brandstoetter, Martin Kampel, Michael Dworzak
Abstract:
In patients with acute lymphoblastic leukaemia (ALL), treatment response is increasingly evaluated with minimal residual disease (MRD) analyses. Flow cytometry (FCM) is a fast and sensitive method to detect MRD. However, the interpretation of these multi-parametric data requires intensive operator training and experience. This paper presents a pipeline software as a ready-to-use FCM-based MRD-assessment tool for daily clinical practice for patients with ALL. The new tool increases accuracy in the assessment of FCM-MRD in samples that are difficult to analyse by conventional operator-based gating, since computer-aided analysis potentially has a superior resolution: it utilizes the whole multi-parametric FCM data space at once instead of step-wise, two-dimensional plot-based visualization. The system, developed as a telemedical network, reduces the workload, lab costs, staff time needed for training, continuous quality control, and operator-based data interpretation. It allows dissemination of automated FCM-MRD analysis to medical centres that have no established expertise, for the benefit of an even larger community of diseased children worldwide. We established a telemedical network system for analysis, clinical follow-up, and treatment monitoring of leukaemia. The system is scalable and adapted to link several centres and laboratories worldwide.
Keywords: data security, flow cytometry, leukaemia, telematics platform, telemedicine
Procedia PDF Downloads 984
35601 Enhancing Wheat Productivity for Small-Scale Farmers in the Northern State of Sudan through Developing a Locally Made Seed Cleaner and Different Seeding Methods
Authors: Yasir Hassan Satti Mohammed
Abstract:
The wheat cleaner was designed, manufactured, and tested in the workshop of the Department of Agricultural Engineering, Faculty of Agricultural Sciences, University of Dongola, Northern State of Sudan, for the purpose of enhancing productivity for small-scale farmers who plant their saved wheat seeds every season with all the risks of weed infestation and low viability. A one-season field experiment was then conducted in a Randomized Complete Block Design (RCBD) at the demonstration farm of Dongola research station using clean seeds and unclean seeds of a local wheat variety (Imam); two different planting methods were also adopted in the experiment. One is traditional seed drilling at the recommended seed rate (50 kg.feddan⁻¹), whereas the other was the precision seeding method using half of the recommended seed rate (25 kg.feddan⁻¹). The effects of seed type and planting method on field parameters were investigated, and the data were analyzed using the computer application SAS, version 9.3. The results revealed significant (P ≤ 0.05) and highly significant (P ≤ 0.01) differences between treatments. The precision seeding method with clean seeds increased the number of kernels per spike (KS), tillers per plant (TPP), one-thousand-kernel mass (TKM), the biomass of wheat (BWT), and total yield (TOY), whereas weeds per area (WSM), the biomass of weeds (BWD), and the weight of weed seeds decreased compared to seed drilling with unclean seed. The wheat seed cleaner could be of great benefit for small-scale wheat farmers in Sudan who cannot afford the cleaned seeds commercially provided by the local government.
Keywords: wheat cleaner, precision seeding, seed drilling method, small-scale farmers
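The treatment comparison above was run in SAS 9.3; as an illustration of the underlying one-way F test, here is a minimal standard-library sketch with purely hypothetical yield replicates for two of the treatment combinations:

```python
import statistics

def one_way_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand = statistics.mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical total-yield replicates (kg/feddan): precision seeding with
# clean seed vs. traditional drilling with unclean seed
precision_clean = [1420, 1510, 1480, 1465]
drill_unclean = [1120, 1180, 1150, 1135]
f_stat = one_way_f(precision_clean, drill_unclean)
```

A large F relative to the F distribution's critical value at the chosen significance level is what "significant (P ≤ 0.05)" reports in the abstract.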
Procedia PDF Downloads 95
35600 Offshore Outsourcing: Global Data Privacy Controls and International Compliance Issues
Authors: Michelle J. Miller
Abstract:
In recent years, there has been a rise of two emerging issues that impact the global employment and business market and that the legal community must review more closely: offshore outsourcing and data privacy. These two issues intersect because employment opportunities are shifting due to offshore outsourcing, and in some states, such as the United States, anti-outsourcing legislation has been passed or introduced to retain jobs within the country. In addition, the legal requirement to protect the privacy of data as a global employer extends to employees and third-party service providers, including services outsourced to offshore locations. For this reason, this paper reviews the intersection of these two issues with a specific focus on data privacy.
Keywords: outsourcing, data privacy, international compliance, multinational corporations
Procedia PDF Downloads 411
35599 Small Target Recognition Based on Trajectory Information
Authors: Saad Alkentar, Abdulkareem Assalem
Abstract:
Recognizing small targets has always posed a significant challenge in image analysis. Over long distances, the image signal-to-noise ratio tends to be low, limiting the amount of useful information available to detection systems. Consequently, visual target recognition becomes an intricate task to tackle. In this study, we introduce a Track Before Detect (TBD) approach that leverages target trajectory information (coordinates) to effectively distinguish between noise and potential targets. By reframing the problem as a multivariate time series classification, we have achieved remarkable results. Specifically, our TBD method achieves an impressive 97% accuracy in separating target signals from noise within a mere half-second time span (consisting of 10 data points). Furthermore, when classifying the identified targets into our predefined categories—airplane, drone, and bird—we achieve an outstanding classification accuracy of 96% over a more extended period of 1.5 seconds (comprising 30 data points).
Keywords: small targets, drones, trajectory information, TBD, multivariate time series
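The paper treats each track as a multivariate time series; as a toy illustration of why ten coordinate points can separate a coherently moving target from noise, one can compute a simple straightness feature. Everything below (data, feature, parameters) is illustrative, not the paper's classifier:

```python
import random

def straightness(track):
    """Ratio of net displacement to total path length: close to 1 for a
    target moving coherently, much lower for scattered noise detections."""
    path = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(track, track[1:]))
    net = ((track[-1][0] - track[0][0]) ** 2 +
           (track[-1][1] - track[0][1]) ** 2) ** 0.5
    return net / path if path else 0.0

random.seed(0)
# 10-point tracks: a near-linear "target" vs. uniformly random "noise"
target = [(t, 2 * t + random.uniform(-0.1, 0.1)) for t in range(10)]
noise = [(random.uniform(0, 9), random.uniform(0, 18)) for _ in range(10)]
```

A full TBD pipeline would feed many such trajectory features (or the raw coordinate series) into a time-series classifier rather than thresholding one feature.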
Procedia PDF Downloads 48
35598 Evaluation of Satellite and Radar Rainfall Product over Seyhan Plain
Authors: Kazım Kaba, Erdem Erdi, M. Akif Erdoğan, H. Mustafa Kandırmaz
Abstract:
Rainfall is a crucial data source for many different disciplines, such as agriculture, hydrology and climate. Therefore, the rain rate should be known well, both spatially and temporally, for any area. Rainfall has traditionally been measured using rain gauges at meteorological ground stations for many years. At present, rainfall products are also acquired from radar and satellite images with temporal and spatial continuity. In this study, we investigated the accuracy of these rainfall data against rain-gauge data. For this purpose, we used the Adana-Hatay radar hourly total precipitation product (RN1) and the Meteosat convective rainfall rate (CRR) product over the Seyhan plain. We calculated daily rainfall values from the RN1 and CRR hourly precipitation products. We used the data of rainy days from four stations located within range of the radar, from October 2013 to November 2015. In the study, we examined the two rainfall data sets over the Seyhan plain, and the correlation between the rain-gauge data and the two raster rainfall data sets was found to be low.
Keywords: meteosat, radar, rainfall, rain-gauge, Turkey
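The comparison described above reduces to aggregating the hourly products into daily totals and correlating them with gauge readings. A minimal sketch, with hypothetical rainfall values standing in for the RN1/CRR and gauge data:

```python
import math

def daily_totals(hourly):
    """Aggregate an hourly precipitation product (e.g. RN1 or CRR)
    into daily sums (24 values per day)."""
    return [sum(hourly[d:d + 24]) for d in range(0, len(hourly), 24)]

def pearson(xs, ys):
    """Pearson correlation between gauge and raster daily rainfall."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical gauge vs. radar daily values for five rainy days (mm)
gauge = [12.0, 3.5, 20.1, 7.2, 15.0]
radar = [8.0, 6.0, 11.0, 9.5, 7.5]
r = pearson(gauge, radar)
```

A low `r` over the station records is what the abstract reports for both raster products.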
Procedia PDF Downloads 328
35597 The Effect of Emotional Support towards Quality of Work Life on Balinese Working Women
Authors: I. Ketut Yoga Adityawira, Putu Ayu Novia Viorica, Komang Rahayu Indrawati
Abstract:
In addition to working and taking care of the family, Balinese women also have a role to participate in social activities in Bali, which has an impact on their quality of work life. One way to reduce the impact of fulfilling these roles is through emotional support. The aim of this research is to find out the effect of emotional support on the quality of work life of Balinese working women. Data were retrieved by a quasi-experimental method with a pretest-posttest design and analyzed by Analysis of Variance (ANOVA) using SPSS 17.0 for Windows. The number of subjects in this research is 30, with the criteria: Balinese women, aged 27 to 55 years old, with a minimum of two years of work experience, and married. The analysis showed that there is no significant effect of emotional support on the quality of work life of Balinese working women, with a probability value of p = 0.304 (p > 0.05).
Keywords: Balinese women, emotional support, quality of work life, working women
Procedia PDF Downloads 208
35596 Influence of Water Reservoir Parameters on the Climate and Coastal Areas
Authors: Lia Matchavariani
Abstract:
Water reservoir construction on rivers flowing into the sea complicates coast protection; the seashore starts to degrade, causing coastal erosion and disaster against the backdrop of current climate change. The instruments of the impact of a water reservoir on the climate and coastal areas are its contact surface with the atmosphere and the area irrigated with its water or humidified with infiltrated waters. The Black Sea coastline is characterized by the highest ecological vulnerability. The type and intensity of the water reservoir's impact are determined by its morphometry, type of regulation, level regime, and the geomorphological and geological characteristics of the adjoining area. Studies showed that the impact of a water reservoir on the climate and its comfort parameters is positive if it is located in a zone of insufficient humidity and, vice versa, negative if the water reservoir is found in a zone with abundant humidity. There are many natural and anthropogenic factors determining the peculiarities of the impact of a water reservoir on the climate, which can be assessed with maximum accuracy by the so-called "long series" method, which operates on the meteorological elements (temperature, wind, precipitation, etc.) with long series formed from stationary observation data. This is a time series consisting of two periods of statistically sufficient duration: the first covers the observations made before the formation of the water reservoir, and the other covers the observations accomplished during its operation. If no such data are available, or their series is statistically short, an "analog" method is used. Such an analog water reservoir is selected based on the similarity of environmental conditions: it must be located within the zone of the designed water reservoir, under similar environmental conditions, and a sufficient number of observations must have been accomplished in its coastal zone.
Keywords: coast-constituent sediment, eustasy, meteorological parameters, seashore degradation, water reservoirs impact
Procedia PDF Downloads 45
35595 Spatial Data Mining by Decision Trees
Authors: Sihem Oujdi, Hafida Belbachir
Abstract:
Existing data mining methods cannot be applied directly to spatial data because spatial data require consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar works have been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, saves memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships among the objects in the target table and those in the neighbor table. The proposed algorithms are applied to a spatial data set in the accidentology domain. A comparative study of our approach with other works on classification by spatial decision trees is detailed.
Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining
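As an illustration of the join-materialization approach described above, the sketch below flattens a hypothetical target table, neighbor table, and spatial join index into a single table that a standard (non-spatial) C4.5-style learner could then consume; all table contents are invented for illustration:

```python
# Hypothetical tables: accidents (target), road segments (neighbors), and
# a spatial join index listing which pairs satisfy a spatial relationship.
accidents = {1: {"severity": "high"}, 2: {"severity": "low"}}
roads = {10: {"surface": "wet"}, 11: {"surface": "dry"}}
spatial_join_index = [(1, 10, "touches"), (2, 11, "touches")]

def materialize(target, neighbor, join_index):
    """Join materialization: precompute one flat row per related pair,
    trading memory for faster attribute access during tree induction."""
    rows = []
    for tid, nid, relation in join_index:
        row = dict(target[tid])       # target attributes
        row.update(neighbor[nid])     # neighbor attributes
        row["relation"] = relation    # the spatial predicate itself
        rows.append(row)
    return rows

flat = materialize(accidents, roads, spatial_join_index)
```

The querying-on-the-fly alternative would instead look up `roads` and the join index inside each split evaluation, keeping memory low but repeating the lookups.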
Procedia PDF Downloads 612
35594 Na Promoted Ni/γ-Al2O3 Catalysts Prepared by Solution Combustion Method for Syngas Methanation
Authors: Yan Zeng, Hongfang Ma, Haitao Zhang, Weiyong Ying
Abstract:
Ni-based catalysts with different amounts of Na as a promoter, from 2 to 6 wt%, were prepared by the solution combustion method. The catalytic activity was investigated in the syngas methanation reaction. Carbon oxides conversion and methane selectivity are greatly influenced by the sodium loading. Adding 2 wt% Na remarkably improves catalytic activity and long-term stability, attributed to the smaller mean NiO particle size, better distribution, and milder metal-support interaction. However, excess addition of Na results in distinct deactivation due to the blockage of active sites.
Keywords: nickel catalysts, syngas methanation, sodium, solution combustion method
Procedia PDF Downloads 407
35593 The Effect of Sorafenib on SOAT1 Protein by Using Molecular Docking Method
Authors: Mahdiyeh Gholaminezhad
Abstract:
Context: The study focuses on the potential impact of Sorafenib on the SOAT1 protein in liver cancer treatment, addressing the need for more effective therapeutic options. Research aim: To explore the effects of Sorafenib on the activity of the SOAT1 protein in liver cancer cells. Methodology: Molecular docking was employed to analyze the interaction between Sorafenib and the SOAT1 protein. Findings: The study revealed a significant effect of Sorafenib on the stability and activity of the SOAT1 protein, suggesting its potential as a treatment for liver cancer. Theoretical importance: This research highlights the molecular mechanism underlying Sorafenib's anti-cancer properties, contributing to the understanding of its therapeutic effects. Data collection: Data on the molecular structures of Sorafenib and the SOAT1 protein were obtained from computational simulations and databases. Analysis procedures: Molecular docking simulations were performed to predict the binding interactions between Sorafenib and the SOAT1 protein. Question addressed: How does Sorafenib influence the activity of the SOAT1 protein, and what are the implications for liver cancer treatment? Conclusion: The study demonstrates the potential of Sorafenib as a targeted therapy for liver cancer by affecting the activity of the SOAT1 protein. Reviewers' comments: The study provides valuable insights into the molecular basis of Sorafenib's action on the SOAT1 protein, suggesting its therapeutic potential. To enhance the methodology, the authors could consider validating the docking results with experimental data.
Keywords: liver cancer, sorafenib, SOAT1, molecular docking
Procedia PDF Downloads 27
35592 Comparison of DPC and FOC Vector Control Strategies on Reducing Harmonics Caused by Nonlinear Load in the DFIG Wind Turbine
Authors: Hamid Havasi, Mohamad Reza Gholami Dehbalaei, Hamed Khorami, Shahram Karimi, Hamdi Abdi
Abstract:
A doubly-fed induction generator (DFIG) equipped with a power converter is an efficient tool for converting the mechanical energy of a variable-speed system to a fixed-frequency electrical grid. Since electrical energy sources face power-quality problems such as harmonics caused by nonlinear loads, in this paper the compensation performance of the DPC and FOC methods on harmonic reduction of a DFIG wind turbine connected to a nonlinear load has been simulated in a MATLAB Simulink model, and the effect of each method on nonlinear load harmonic elimination has been compared. The results of the two control methods show the advantage of the FOC method over the DPC method for harmonic compensation. Also, the fifth and seventh harmonic components of the network, and the THD, are greatly reduced.
Keywords: DFIG machine, energy conversion, nonlinear load, THD, DPC, FOC
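The THD figure used to compare the two methods is the RMS of the harmonic components relative to the fundamental. A minimal sketch with hypothetical spectra, the 5th and 7th harmonics dominating as in a typical nonlinear load (the values are not from the paper):

```python
import math

def thd(fundamental_rms, harmonic_rms):
    """Total Harmonic Distortion: RMS of the harmonic components
    divided by the RMS of the fundamental."""
    return math.sqrt(sum(h ** 2 for h in harmonic_rms)) / fundamental_rms

# Hypothetical grid-current spectrum before/after compensation (A, RMS),
# listing the 5th and 7th harmonic magnitudes
before = thd(100.0, [12.0, 9.0])
after = thd(100.0, [3.0, 2.0])
```

A compensation scheme that suppresses the 5th and 7th components more effectively (FOC over DPC, per the abstract) shows up directly as a lower `after` value.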
Procedia PDF Downloads 589
35591 A Prediction Method for Large-Size Event Occurrences in the Sandpile Model
Authors: S. Channgam, A. Sae-Tang, T. Termsaithong
Abstract:
In this research, the occurrences of large-size events in various system sizes of the Bak-Tang-Wiesenfeld sandpile model are considered. The system sizes (square lattices) of the model considered here are 25×25, 50×50, 75×75 and 100×100. The cross-correlation between the time series of the ratio of sites containing 3 grains and the large-size event time series for these 4 system sizes is also analyzed. Moreover, a prediction method for large-size events in the 50×50 system is introduced. Lastly, it is shown that this prediction method provides a slightly higher efficiency than random predictions.
Keywords: Bak-Tang-Wiesenfeld sandpile model, cross-correlation, avalanches, prediction method
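The Bak-Tang-Wiesenfeld dynamics behind these experiments can be sketched compactly: each lattice site holds grains, and any site reaching four grains topples, giving one grain to each neighbour (grains fall off the open boundary). The small driver below is illustrative only; the lattice size and seeding are arbitrary:

```python
def topple(grid, n):
    """Relax an n x n Bak-Tang-Wiesenfeld lattice: sites with >= 4 grains
    topple, sending one grain to each neighbour (grains fall off edges).
    Returns the avalanche size (total number of topplings)."""
    size = 0
    unstable = [(i, j) for i in range(n) for j in range(n) if grid[i][j] >= 4]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue
        grid[i][j] -= 4
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
        if grid[i][j] >= 4:
            unstable.append((i, j))
    return size

n = 5
grid = [[3] * n for _ in range(n)]   # critical background: every site at 3
grid[2][2] += 1                      # drop one grain at the centre
avalanche = topple(grid, n)
```

Driving such a lattice repeatedly and recording `avalanche` per drop yields the event time series whose large-size occurrences the paper tries to predict.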
Procedia PDF Downloads 38235590 An Application of Extreme Value Theory as a Risk Measurement Approach in Frontier Markets
Authors: Dany Ng Cheong Vee, Preethee Nunkoo Gonpot, Noor Sookia
Abstract:
In this paper, we consider the application of Extreme Value Theory (EVT) as a risk measurement tool. The Value at Risk (VaR) for a set of indices from six stock exchanges of frontier markets is calculated using the Peaks-over-Threshold method, and the performance of the model is evaluated index-wise using coverage tests and loss functions. Our results show that 'fat-tailedness' of the data alone is not enough to justify the use of EVT as a VaR approach; the structure of the returns dynamics is also a determining factor. The approach works fine in markets which have had extremes occurring in the past, making the model capable of coping with extremes to come (Colombo, Tunisia and Zagreb stock exchanges). On the other hand, we find that indices with lower past than present volatility fail to adequately deal with future extremes (Mauritius and Kazakhstan). We also conclude that using EVT alone produces quite static VaR figures that do not reflect the actual dynamics of the data.
Keywords: extreme value theory, financial crisis 2008, value at risk, frontier markets
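Given a Generalized Pareto fit to the exceedances, the Peaks-over-Threshold VaR has a closed form: VaR_p = u + (β/ξ)[((n/N_u)(1−p))^(−ξ) − 1], where u is the threshold, N_u the number of exceedances, and (ξ, β) the fitted GPD shape and scale. A sketch with hypothetical loss data and fit parameters (the fitting step itself is omitted):

```python
def pot_var(n, n_exceed, threshold, xi, beta, p):
    """Value at Risk at level p from the Peaks-over-Threshold estimator,
    given a fitted Generalized Pareto tail (shape xi != 0, scale beta)."""
    return threshold + (beta / xi) * (((n / n_exceed) * (1 - p)) ** (-xi) - 1)

# Hypothetical setup: 2000 daily losses, 100 exceedances above u = 2%,
# with invented GPD parameters xi = 0.2, beta = 0.01
var_99 = pot_var(n=2000, n_exceed=100, threshold=0.02, xi=0.2, beta=0.01, p=0.99)
```

The "static VaR" criticism in the abstract follows from this formula: unless the fit is re-estimated, the estimate ignores current volatility.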
Procedia PDF Downloads 27635589 Computer Simulations of Stress Corrosion Studies of Quartz Particulate Reinforced ZA-27 Metal Matrix Composites
Authors: K. Vinutha
Abstract:
The stress corrosion resistance of ZA-27/TiO2 metal matrix composites (MMCs) in high-temperature acidic media has been evaluated using an autoclave. The liquid melt metallurgy technique using the vortex method was used to fabricate the MMCs. TiO2 particulates of 50-80 µm in size were added to the matrix, and ZA-27 composites containing 2, 4 and 6 weight percent of TiO2 were prepared. Stress corrosion tests were conducted by the weight-loss method for different exposure times, normalities and temperatures of the acidic medium. The corrosion rates of the composites were lower than that of the matrix ZA-27 alloy under all conditions.
Keywords: autoclave, MMCs, stress corrosion, vortex method
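The weight-loss method converts the measured mass loss into a corrosion rate; one standard textbook form is mpy = 534·W/(D·A·T), with W in mg, D in g/cm³, A in in², and T in hours. A sketch using hypothetical specimen values (the constant and units are the generic ones, not necessarily those used in this paper):

```python
def corrosion_rate_mpy(weight_loss_g, density_g_cm3, area_cm2, hours):
    """Corrosion rate in mils per year (mpy) from the weight-loss method:
    rate = 534 * W / (D * A * T), W in mg, A in in^2, T in hours."""
    w_mg = weight_loss_g * 1000.0
    area_in2 = area_cm2 / 6.4516          # convert cm^2 -> in^2
    return 534.0 * w_mg / (density_g_cm3 * area_in2 * hours)
```

Computing this rate for each exposure time, normality and temperature, for the matrix alloy and each TiO2 loading, gives the comparison the abstract summarizes.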
Procedia PDF Downloads 477
35588 Data-Driven Dynamic Overbooking Model for Tour Operators
Authors: Kannapha Amaruchkul
Abstract:
We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic-based and service-based. In the economic-based policy, we minimize the expected oversold and underused cost, whereas in the service-based policy, we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we test the proposed overbooking policy on 2019 data. We also compare the data-driven approach to the conventional approach of fitting the data to a probability distribution.
Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator
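The service-based policy described above can be sketched for the simplified case of independent single-person reservations (the paper's model handles group bookings and cancellation timing; this simplification and all parameter values are ours):

```python
from math import comb

def oversell_prob(bookings, capacity, p_show):
    """P(more than `capacity` customers show up) when each of `bookings`
    independent reservations shows with probability p_show (binomial tail)."""
    return sum(comb(bookings, k) * p_show ** k * (1 - p_show) ** (bookings - k)
               for k in range(capacity + 1, bookings + 1))

def service_based_limit(capacity, p_show, threshold):
    """Largest number of accepted bookings that keeps the oversold
    probability at or below the pre-specified service threshold."""
    b = capacity
    while oversell_prob(b + 1, capacity, p_show) <= threshold:
        b += 1
    return b

limit = service_based_limit(capacity=40, p_show=0.9, threshold=0.05)
```

The economic-based policy would instead pick the booking limit minimizing expected oversold plus underused cost rather than capping a probability.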
Procedia PDF Downloads 134
35587 Hierarchically Modeling Cognition and Behavioral Problems of an Under-Represented Group
Authors: Zhidong Zhang, Zhi-Chao Zhang
Abstract:
This study examines adolescent psychological and behavioral problems. The Achenbach System of Empirically Based Assessment (ASEBA) was used as the instrument. The problem framework consists of internalizing, externalizing and social behavioral problems, theoretically developed based on about 113 items plus relevant background variables. The sample consists of 1,975 sixth- and seventh-grade students in Northeast China. A stratified random sampling method was used to collect the data, meaning that samples were drawn from different school districts, schools, and classes. The researchers looked at both macro and micro effects; therefore, multilevel analysis techniques were used in the data analysis. Part of the results indicated that background variables such as extracurricular activities were directly related to students' internalizing problems.
Keywords: behavioral problems, anxious/depressed problems, internalizing problems, mental health, under-represented groups, empirically-based assessment, hierarchical modeling, ASEBA, multilevel analysis
Procedia PDF Downloads 603
35586 Land Use Change Detection Using Satellite Images for Najran City, Kingdom of Saudi Arabia (KSA)
Authors: Ismail Elkhrachy
Abstract:
Determination of land use change is an important component of regional planning, for applications ranging from urban fringe change detection to monitoring land use change; these data are very useful for natural resources management. On the other hand, the technologies and methods of change detection have evolved dramatically during the past 20 years, and it is well recognized that change detection based on multi-temporal remotely sensed data has become the best method for researching dynamic change of land use. The objective of this paper is to assess, evaluate and monitor land use change surrounding the area of Najran city, Kingdom of Saudi Arabia (KSA), using Landsat images (June 23, 2009) and an ETM+ image (June 21, 2014). The post-classification change detection technique was applied. The two subset images of Najran city are compared on a pixel-by-pixel basis using the post-classification comparison method, the from-to change matrix is produced, and the land use change information is obtained. Three classes were obtained, urban, bare land and agricultural land, using the unsupervised classification method in Erdas Imagine and ArcGIS software. An accuracy assessment of the classification was performed before calculating change detection for the study area; the obtained accuracy is between 61% and 87% over all the classes. Change detection analysis shows that between 2009 and 2014 the urban area increased by 73.2%, the agricultural area decreased by 10.5%, and the barren area was reduced by 7%. The quantitative study indicated that the urban class had 58.2 km² unchanged, gained 70.3 km² and lost 16 km²; the bare land class had 586.4 km² unchanged, gained 53.2 km² and lost 101.5 km²; while the agricultural class had 20.2 km² unchanged, gained 31.2 km² and lost 37.2 km².
Keywords: land use, remote sensing, change detection, satellite images, image classification
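The from-to change matrix mentioned above is a pixel-by-pixel cross-tabulation of the two classified images: rows give the class at the earlier date, columns the class at the later date, and the diagonal counts unchanged pixels. A minimal sketch over hypothetical class labels:

```python
def change_matrix(before, after, classes):
    """Pixel-by-pixel from-to matrix for post-classification comparison:
    m[i][j] counts pixels of class i at date 1 and class j at date 2."""
    idx = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for b, a in zip(before, after):
        m[idx[b]][idx[a]] += 1
    return m

classes = ["urban", "bare", "agri"]
# Six hypothetical pixels from the 2009 and 2014 classifications
before = ["urban", "bare", "bare", "agri", "bare", "urban"]
after = ["urban", "urban", "bare", "agri", "agri", "urban"]
m = change_matrix(before, after, classes)
```

Multiplying each cell count by the pixel area gives the gained/lost/unchanged km² figures reported in the abstract.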
Procedia PDF Downloads 524
35585 Modeling and Statistical Analysis of a Soap Production Mix in Bejoy Manufacturing Industry, Anambra State, Nigeria
Authors: Okolie Chukwulozie Paul, Iwenofu Chinwe Onyedika, Sinebe Jude Ebieladoh, M. C. Nwosu
Abstract:
This research work is based on the statistical analysis of processing data. The aim is to analyze the data statistically and to generate a design model for the production mix of soap manufacturing products in the Bejoy manufacturing company, Nkpologwu, Aguata Local Government Area, Anambra State, Nigeria. T-tests, partial correlation and bivariate correlation were used to understand what the data portray. The design model developed was used to model the production yield, and the correlation of the variables shows that R² is 98.7%. The results confirm that the data are fit for further analysis and modeling, as evidenced by the correlation and the R-squared value.
Keywords: general linear model, correlation, variables, Pearson, significance, T-test, soap, production mix, statistics
Procedia PDF Downloads 445
35584 Development of a Regression Based Model to Predict Subjective Perception of Squeak and Rattle Noise
Authors: Ramkumar R., Gaurav Shinde, Pratik Shroff, Sachin Kumar Jain, Nagesh Walke
Abstract:
Advancements in electric vehicles have significantly reduced powertrain noise and the noise of moving components of vehicles. As a result, in-cab noises have become more noticeable to passengers inside the car. To ensure a comfortable ride for drivers and other passengers, it has become crucial to eliminate undesirable component noises during the development phase. Standard practices are followed to identify the severity of noises based on subjective ratings, but it can be a tedious process to rate each development sample and make changes to reduce the noise. Additionally, the severity rating can vary from jury to jury, making it challenging to arrive at a definitive conclusion. To address this, an automotive component was identified to evaluate the squeak and rattle noise issue. Physical tests were carried out for random and sine excitation profiles. The aim was to subjectively assess the noise using a jury rating method and objectively evaluate the same by measuring the noise. A suitable jury evaluation method was selected for this activity, and recorded sounds were replayed for jury rating. Objective sound quality metrics, viz. loudness, sharpness, roughness, fluctuation strength and overall Sound Pressure Level (SPL), were measured. Based on this, correlation coefficients were established to identify the sound quality metrics that contribute most to the identified noise issue. Regression analysis was then performed to establish the correlation between subjective and objective data, and a mathematical model was prepared using artificial intelligence and machine learning algorithms. The developed model was able to predict the subjective rating with good accuracy.
Keywords: BSR, noise, correlation, regression
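The regression step described above fits subjective jury ratings against the measured sound quality metrics. A minimal single-predictor ordinary-least-squares sketch, with hypothetical loudness/rating pairs (the paper uses several metrics and an AI/ML model rather than one predictor):

```python
def fit_line(xs, ys):
    """Ordinary least squares fit y = a + b*x, e.g. jury rating vs. loudness."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical pairs of measured loudness (sone) and jury rating
# (1-10 scale, lower = more annoying squeak/rattle)
loudness = [2.0, 3.5, 5.0, 6.5, 8.0]
rating = [9.0, 8.0, 6.5, 5.0, 3.5]
a, b = fit_line(loudness, rating)
predicted = a + b * 4.0   # predicted jury rating at 4.0 sone
```

Extending this to multiple metrics (sharpness, roughness, fluctuation strength, SPL) turns it into the multiple-regression model the abstract describes.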
Procedia PDF Downloads 79
35583 Pricing European Continuous-Installment Options under Regime-Switching Models
Authors: Saghar Heidari
Abstract:
In this paper, we study the valuation problem of European continuous-installment options under Markov-modulated models with a partial differential equation approach. Due to the opportunity to continue or stop paying installments, the valuation problem under regime-switching models can be formulated as coupled partial differential equations (CPDE) with free boundary features. To value the installment options, we express the truncated CPDE as a linear complementarity problem (LCP); then a finite element method is proposed to solve the resulting variational inequality. Under some appropriate assumptions, we establish the stability of the method and illustrate some numerical results to examine the rate of convergence and accuracy of the proposed method for the pricing problem under the regime-switching model.
Keywords: continuous-installment option, European option, regime-switching model, finite element method
Procedia PDF Downloads 137
35582 Processes and Applications of Casting Simulation and Its Software
Authors: Surinder Pal, Ajay Gupta, Johny Khajuria
Abstract:
Casting simulation helps visualize mold filling and casting solidification; predict related defects like cold shuts, shrinkage porosity and hard spots; and optimize the casting design to achieve the desired quality with high yield. Flow and solidification of molten metals are, however, very complex phenomena that are difficult to simulate correctly by conventional computational techniques, especially when the part geometry is intricate and the required inputs (like thermo-physical properties and heat transfer coefficients) are not available. Simulation software is based on the process of modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product will be as close to design specs as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, as for airplane pilots, nuclear power plant operators, or chemical plant operators, a mockup of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome. Each casting simulation package has its own strengths; MAGMA casting software, for example, is best suited for crack simulation. The latest-generation software AutoCAST, developed at IIT Bombay, provides a host of functions to support method engineers, including part thickness visualization, core design, multi-cavity mold design with common gating and feeding, application of various feed aids (feeder sleeves, chills, padding, etc.), simulation of mold filling and casting solidification, automatic optimization of feeders and gating driven by the desired quality level, and what-if cost analysis. IIT Bombay has developed this set of applications for the foundry industry to improve casting yield and quality; it is the result of more than 20 years of collaboration with major industrial partners and academic institutions around the world. In this paper, the process of casting simulation is studied.
Keywords: casting simulation software, simulation techniques, casting simulation, processes
Procedia PDF Downloads 475
35581 Divergence Regularization Method for Solving Ill-Posed Cauchy Problem for the Helmholtz Equation
Authors: Benedict Barnes, Anthony Y. Aidoo
Abstract:
A Divergence Regularization Method (DRM) is used to regularize the ill-posed Helmholtz equation where the boundary deflection is inhomogeneous in a Hilbert space H. The DRM incorporates a positive integer scalar which homogenizes the inhomogeneous boundary deflection in the Cauchy problem for the Helmholtz equation. This ensures the existence, as well as the uniqueness, of the solution for the equation. The DRM restores all three conditions of well-posedness in the sense of Hadamard.
Keywords: divergence regularization method, Helmholtz equation, ill-posed inhomogeneous Cauchy boundary conditions
Procedia PDF Downloads 189
35580 Analysis of Shallow Foundation Using Conventional and Finite Element Approach
Authors: Sultan Al Shafian, Mozaher Ul Kabir, Khondoker Istiak Ahmad, Masnun Abrar, Mahfuza Khanum, Hossain M. Shahin
Abstract:
For the structural evaluation of a shallow foundation, the modulus of subgrade reaction is one of the most widely used and accepted parameters because of its ease of calculation. To determine this parameter, one of the most common field methods is the plate load test. In this field test, the subgrade modulus is measured at a specific location, and it is assumed that the displacement occurring at one place does not affect adjacent locations. Because of such assumptions, the modulus of subgrade reaction sometimes forces engineers to overdesign the underground structure, which eventually increases the cost of construction and can even lead to failure of the structure. In the present study, the settlement of a shallow foundation has been analyzed using both conventional and numerical approaches. Around 25 plate load tests were conducted on a sand-fill site in Bangladesh to determine the modulus of subgrade reaction of the ground, which was later used to design a shallow foundation considering different depths. After the collection of the field data, the field conditions were appropriately simulated in finite element software. Finally, the results obtained from the conventional and numerical approaches were compared. A significant difference was observed in the settlements. A correlation between the two methods is also proposed at the end of this research work in order to provide the most efficient way to calculate the subgrade modulus of the ground for designing shallow foundations. Keywords: modulus of subgrade reaction, shallow foundation, finite element analysis, settlement, plate load test
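The basic calculation behind the plate load test can be sketched as follows. The readings and the footing width below are hypothetical illustrations, not the study's field data, and the size correction shown is the commonly cited Terzaghi extrapolation for sandy soil with a 0.3 m plate:

```python
# Estimate the modulus of subgrade reaction k_s from a plate load test
# reading, then extrapolate it to a full-size footing on sand.

def subgrade_modulus(pressure_kpa, settlement_mm):
    """k_s = applied pressure / settlement, in kN/m^3."""
    settlement_m = settlement_mm / 1000.0
    return pressure_kpa / settlement_m

def extrapolate_to_footing_sand(k_plate, b_footing_m, b_plate_m=0.3):
    """Terzaghi's size correction for sandy soil (0.3 m square plate)."""
    return k_plate * ((b_footing_m + b_plate_m) / (2.0 * b_footing_m)) ** 2

# Hypothetical reading: 200 kPa producing 10 mm plate settlement.
k_p = subgrade_modulus(200.0, 10.0)          # about 20,000 kN/m^3
k_f = extrapolate_to_footing_sand(k_p, 2.0)  # value for a 2 m wide footing
```

Because the correction factor shrinks with footing width, the footing value is well below the raw plate value, which is one reason a single uncorrected plate reading can lead to over- or under-design.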
Procedia PDF Downloads 182
35579 Corporate Social Responsibility: A Comparative Study of Two Largest Banks in India
Authors: Navdeep Kaur
Abstract:
Corporate Social Responsibility (CSR) is the process through which organizations execute their philanthropic visions for social welfare. This paper considers the data of one public sector bank, State Bank of India (SBI), and one private sector bank, Industrial Credit and Investment Corporation of India (ICICI), from 2008 to 2016. The study is based on a descriptive research design; secondary data were collected from the annual reports on the respective banks' websites, and different literature was reviewed. The least-squares method is used for estimating CSR spending for the financial year 2017-18. The analysis shows that these banks are making efforts to implement CSR but are not spending their 2% share of profits on CSR. There is a need for better CSR activities by the banks, which is possible by concentrating more on prevailing social issues. The findings reveal that the percentage of profit after tax spent on CSR by SBI is higher compared to ICICI. The estimated spending on CSR for 2017-18 is also higher for SBI than for ICICI. Keywords: banking sector, corporate social responsibility in India, financial institution, public sector banks, SBI, ICICI
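The least-squares extrapolation described above can be sketched as a simple linear trend fit. The yearly figures below are hypothetical placeholders, not the banks' actual CSR data:

```python
# Ordinary least squares line y = a + b*x fitted to yearly CSR spending,
# then extrapolated to the next financial year.

def fit_line(xs, ys):
    """Return intercept a and slope b minimizing squared residuals."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

years = [2012, 2013, 2014, 2015, 2016]       # financial years
csr_crore = [20.0, 24.0, 27.0, 31.0, 36.0]   # hypothetical spend (Rs. crore)

a, b = fit_line(years, csr_crore)
forecast_2018 = a + b * 2018                 # projected FY 2017-18 spend
```

The slope `b` gives the average yearly growth in CSR spending, and the projection simply extends that trend one period forward.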
Procedia PDF Downloads 193
35578 Helping the Development of Public Policies with Knowledge of Criminal Data
Authors: Diego De Castro Rodrigues, Marcelo B. Nery, Sergio Adorno
Abstract:
The project aims to develop a framework for social data analysis, particularly by mobilizing criminal records and applying descriptive computational techniques, such as associative algorithms and the extraction of decision-tree rules, among others. The methods and instruments discussed in this work will enable the discovery of patterns, providing a guided means to identify similarities between recurring situations in the social sphere using descriptive techniques and data visualization. The study area is the city of São Paulo, with the structuring of social data as the central idea and a particular focus on the quality of the information. Given this, a set of tools will be validated, including a database and tools for visualizing the results. Among the main deliverables, alongside products and articles, are the discoveries made during the research phase. The effectiveness and utility of the results will depend on studies involving real data, validated both by domain experts and by identifying and comparing the patterns found in this study with other phenomena described in the literature. The intention is to contribute to evidence-based understanding and decision-making in the social field. Keywords: social data analysis, criminal records, computational techniques, data mining, big data
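The associative-algorithm idea mentioned above can be sketched with a minimal support/confidence rule miner. The incident records below are toy hypothetical categories, not real São Paulo criminal data:

```python
# Minimal association-rule mining: find rules "A -> B" whose pair support
# and confidence exceed given thresholds, over toy incident records.

from itertools import combinations
from collections import Counter

records = [
    {"theft", "night", "downtown"},
    {"theft", "night", "suburb"},
    {"assault", "night", "downtown"},
    {"theft", "day", "downtown"},
    {"theft", "night", "downtown"},
]

def association_rules(transactions, min_support=0.4, min_confidence=0.7):
    n = len(transactions)
    item_count = Counter(i for t in transactions for i in t)
    pair_count = Counter(p for t in transactions
                         for p in combinations(sorted(t), 2))
    rules = []
    for (a, b), c in pair_count.items():
        if c / n < min_support:
            continue  # the pair is too rare to be a pattern
        for lhs, rhs in ((a, b), (b, a)):
            conf = c / item_count[lhs]  # P(rhs | lhs)
            if conf >= min_confidence:
                rules.append((lhs, rhs, c / n, conf))
    return rules

rules = association_rules(records)
# e.g. ("theft", "night", 0.6, 0.75): thefts co-occur with night-time
```

Each rule carries its support (how often the pair occurs) and confidence (how often the consequent follows the antecedent), which is the kind of descriptive pattern the framework aims to surface for policy analysis.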
Procedia PDF Downloads 84
35577 Dividend Payout and Capital Structure: A Family Firm Perspective
Authors: Abhinav Kumar Rajverma, Arun Kumar Misra, Abhijeet Chandra
Abstract:
Family involvement in business is universal across countries, with varying characteristics. Firms in developed economies have diffused ownership structures; those in emerging markets, however, have concentrated ownership structures resembling those of family firms. The optimization of dividend payout and leverage is crucial for a firm's valuation. This paper studies the dividend-paying behavior of National Stock Exchange-listed Indian firms from financial year 2007 to 2016. The final sample consists of 422 firms, of which more than 49% (207) are family firms. Results reveal that family firms pay lower dividends and are more leveraged than non-family firms. This unique data set helps in understanding the dividend behavior and capital structure of the sample firms over a long time period and across varying family ownership concentrations. Using panel regression models, this paper examines the factors affecting dividend payout and capital structure and establishes a link between the two using a two-stage least squares regression model. Profitability shows a positive impact on dividends and a negative impact on leverage, confirming signaling and pecking order theory. Further, the findings support bankruptcy theory, as firm size has a positive relation with both dividends and leverage, and volatility shows a negative relation with both. The findings are also consistent with agency theory: family ownership concentration has a negative relation with both dividend payments and leverage. The impact of family ownership control confirms a similar finding. The study further reveals that firms with high family ownership concentration (family control) do have an impact on determining the level of private benefits. Institutional ownership is not significant for dividend payments; however, it shows a significant negative relation with leverage for both family and non-family firms. Dividend payout and leverage show a mixed association with each other.
This paper provides evidence of how varying levels of family ownership concentration and ownership control influence the dividend policy and capital structure of firms in an emerging market like India. It can contribute significantly to understanding and formulating corporate dividend policy and capital structure decisions for emerging economies, where the majority of firms exhibit family-firm behavior. Keywords: dividend, family firms, leverage, ownership structure
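The two-stage least squares link between leverage and payout can be sketched in miniature. The firm-level numbers and the choice of instrument below are synthetic illustrations, not the study's data or specification:

```python
# Two-stage least squares (2SLS) sketch: leverage is endogenous to payout,
# so stage 1 projects it onto an instrument z (e.g., asset tangibility),
# and stage 2 regresses dividend payout on the fitted leverage.

def ols(xs, ys):
    """Univariate OLS: return intercept a and slope b for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic firm-level data.
z        = [0.2, 0.4, 0.5, 0.7, 0.9]          # instrument
leverage = [0.30, 0.42, 0.47, 0.61, 0.72]     # endogenous regressor
dividend = [0.35, 0.28, 0.25, 0.18, 0.12]     # payout ratio

# Stage 1: purge leverage of its endogenous component.
a1, b1 = ols(z, leverage)
lev_hat = [a1 + b1 * zi for zi in z]

# Stage 2: regress payout on the fitted (exogenous) leverage.
a2, b2 = ols(lev_hat, dividend)   # b2: effect of leverage on payout
```

With these synthetic numbers, stage 1 yields a positive loading on the instrument and stage 2 a negative leverage coefficient, mirroring the substitution between debt and dividends discussed above.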
Procedia PDF Downloads 280
35576 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can result in more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to a huge cost in money and time. To overcome these limitations, the idea of predicting OCRs from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method integrates a signal processing approach with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The percentage of overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates a meaningful prediction. OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and human gene TSS regions obtained from refTSS, which showed accordance of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, which has some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and the consideration of multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing data set for efficient liquid biopsy-related analysis. Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
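The overlap-based validation described above amounts to computing the fraction of predicted bases covered by validated regions. A minimal sketch, using toy coordinates rather than real genomic intervals:

```python
# Fraction of predicted open-chromatin bases that fall inside validated
# regions (e.g., ATAC-seq peaks). Intervals are half-open (start, end) pairs.

def merge(intervals):
    """Merge overlapping intervals after sorting by start."""
    merged = []
    for s, e in sorted(intervals):
        if merged and s <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged

def overlap_fraction(predicted, validated):
    pred, val = merge(predicted), merge(validated)
    total = sum(e - s for s, e in pred)          # predicted bases
    shared = 0
    for ps, pe in pred:
        for vs, ve in val:
            shared += max(0, min(pe, ve) - max(ps, vs))
    return shared / total

predicted = [(100, 200), (500, 650)]   # toy predicted OCRs (bp)
validated = [(150, 250), (480, 600)]   # toy validated peaks (bp)
frac = overlap_fraction(predicted, validated)
```

A reported overlap above 67% would correspond to `frac > 0.67` on genome-wide interval sets; the same computation against TSS windows gives the refTSS concordance figures.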
Procedia PDF Downloads 150
35575 Extreme Value Modelling of Ghana Stock Exchange Indices
Authors: Kwabena Asare, Ezekiel N. N. Nortey, Felix O. Mettle
Abstract:
The modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for the rare events leading to these crises have become essential in finance and risk management. This paper models the extreme values of the Ghana Stock Exchange All-Shares indices (2000-2010) by applying Extreme Value Theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred; hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedasticity present in the returns series before the EVT method was applied. The Peak Over Threshold (POT) approach of the EVT, which fits a Generalized Pareto Distribution (GPD) to the excesses above a selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P, and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The size of extreme daily Ghanaian stock market movements was then computed using the Value at Risk (VaR) and Expected Shortfall (ES) risk measures at some high quantiles, based on the fitted GPD model. Keywords: extreme value theory, expected shortfall, generalized Pareto distribution, peak over threshold, value at risk
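Once the GPD has been fitted to the threshold excesses, VaR and ES follow from closed-form POT formulas. A sketch with illustrative parameter values (not the paper's actual estimates), where `xi` and `beta` are the fitted GPD shape and scale, `u` the threshold, `n` the sample size, and `n_u` the number of exceedances:

```python
# Point estimates of VaR and ES at quantile q from a fitted GPD tail,
# using the standard POT formulas.

def var_pot(q, u, xi, beta, n, n_u):
    """VaR_q = u + (beta/xi) * [ ((n/n_u) * (1-q))^(-xi) - 1 ]."""
    return u + (beta / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

def es_pot(q, u, xi, beta, n, n_u):
    """ES_q = VaR_q/(1-xi) + (beta - xi*u)/(1-xi); requires xi < 1."""
    v = var_pot(q, u, xi, beta, n, n_u)
    return v / (1.0 - xi) + (beta - xi * u) / (1.0 - xi)

# Illustrative inputs: 2500 daily returns, 100 exceedances above u = 2%.
var99 = var_pot(0.99, u=2.0, xi=0.15, beta=0.8, n=2500, n_u=100)
es99  = es_pot(0.99, u=2.0, xi=0.15, beta=0.8, n=2500, n_u=100)
```

As expected for a heavy right tail (positive `xi`), the expected shortfall sits noticeably above the VaR at the same quantile.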
Procedia PDF Downloads 557
35574 Twitter Sentiment Analysis during the Lockdown on New-Zealand
Authors: Smah Almotiri
Abstract:
One of the most common applications of natural language processing (NLP) is sentiment analysis. The feeling conveyed in a text can be successfully mined for various events using sentiment analysis. Twitter is viewed as a reliable data source for sentiment analytics studies, since people used social media to receive and exchange different types of data on a broad scale during the COVID-19 epidemic. The processing of such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to look at how sentiment differed in a single geographic region during the lockdown at two different times. 1,162 tweets related to the COVID-19 pandemic lockdown were analyzed using the keyword hashtags (lockdown, COVID-19); the first sample of tweets was from March 23, 2020, until April 23, 2020, and the second sample, for the following year, was from March 1, 2021, until April 4, 2021. Natural language processing, a form of artificial intelligence, was used in this research to calculate the sentiment value of all the tweets with the AFINN lexicon sentiment analysis method. The findings revealed that the sentiment at both times during the region's lockdown was positive in the samples of this study, which are specific to the geographical area of New Zealand. This research suggests applying machine-learning sentiment methods such as CrystalFeel and extending the sample size by using tweets collected over a longer period of time. Keywords: sentiment analysis, Twitter analysis, lockdown, COVID-19, AFINN, NodeJS
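AFINN scoring is a lexicon lookup: each word carries an integer valence and a tweet's score is the sum. The mini-lexicon below is a toy subset for illustration (the real AFINN list scores roughly 3,400 terms from -5 to +5):

```python
# AFINN-style lexicon scoring sketch: sum the valences of known words.

AFINN_SUBSET = {
    "good": 3, "great": 3, "happy": 3, "safe": 1,
    "bad": -3, "sad": -2, "fear": -2, "death": -2,
}

def sentiment_score(tweet):
    """Sum of word valences; > 0 reads as positive, < 0 as negative."""
    words = tweet.lower().split()
    return sum(AFINN_SUBSET.get(w.strip(".,!?#"), 0) for w in words)

tweets = [
    "Feeling safe and happy at home during lockdown",
    "Sad news today, fear everywhere",
]
scores = [sentiment_score(t) for t in tweets]   # [4, -4]
```

Averaging such scores over each sample window is what yields an overall positive or negative sentiment for a period, as reported in the study.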
Procedia PDF Downloads 190