Search results for: velocity analysis
26783 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market. Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 68
26782 Citation Analysis on the Articles published in Bayero Journal of Pure and Applied Sciences (BAJOPAS), from 2008-2020: An International Journal in Bayero University, Kano, Nigeria
Authors: G. A. Babalola, Yusuf Muhammad
Abstract:
An analysis was carried out on 19,759 citations appended to the References Section of 881 research articles published in the Bayero Journal of Pure and Applied Sciences. It was found that journal publications were the most cited source of information among pure and applied science researchers, with 12,090 citations (61.2%). The study also revealed that researchers in the field of pure and applied sciences used very current and up-to-date information sources in writing their articles, with 10,091 (51.1%) such citations and an average of 11.1 citations per article in the journal. Keywords: citation analysis, BAJOPAS, journal article, Bayero University Kano, Nigeria
Procedia PDF Downloads 165
26781 Comparative Sustainability Performance Analysis of Australian Companies Using Composite Measures
Authors: Ramona Zharfpeykan, Paul Rouse
Abstract:
Organizational sustainability is important to both organizations themselves and their stakeholders. Despite its increasing popularity and the growing number of organizations reporting on sustainability, research on evaluating and comparing the sustainability performance of companies is limited. The aim of this study was to develop models to measure sustainability performance for both cross-sectional and longitudinal comparisons across companies in the same or different industries. A secondary aim was to see whether sustainability reports can be used to evaluate sustainability performance. The study used both a content analysis of Australian sustainability reports in mining and metals and financial services for 2011-2014 and a survey of Australian and New Zealand organizations. Two methods, ranging from a composite index using uniform weights to data envelopment analysis (DEA), were employed to analyze the data and develop the models. The results show strong, statistically significant relationships between the developed models, which suggests that each model provides a consistent, systematic and reasonably robust analysis. The results of the models show that, for both industries, companies with sustainability scores above or below the industry average stayed almost the same during the study period. These indices and models can be used by companies to evaluate their sustainability performance and compare it with previous years, or with other companies in the same or different industries. These methods can also be used by various stakeholders and sustainability ranking bodies such as the Global Reporting Initiative (GRI). Keywords: data envelopment analysis, sustainability, sustainability performance measurement system, sustainability performance index, global reporting initiative
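As a rough illustration of the uniform-weight composite index described in this abstract, the following Python sketch normalises a set of indicator scores and averages them; the company labels and score values are invented for the example, and the study's actual indicator set and DEA step are not reproduced.

```python
import numpy as np

# Hypothetical sustainability indicator scores (rows = companies, columns = indicators),
# e.g. coverage of GRI-style disclosures on a 0-100 scale; values are made up.
scores = np.array([
    [80.0, 65.0, 70.0, 90.0],   # company A
    [55.0, 72.0, 60.0, 40.0],   # company B
    [95.0, 88.0, 77.0, 85.0],   # company C
])

# Min-max normalise each indicator so they are comparable before aggregation.
mins, maxs = scores.min(axis=0), scores.max(axis=0)
normalised = (scores - mins) / (maxs - mins)

# Uniform-weight composite index: simple mean across indicators.
composite = normalised.mean(axis=1)

for name, value in zip(["A", "B", "C"], composite):
    print(f"Company {name}: composite sustainability index = {value:.2f}")
```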
Procedia PDF Downloads 181
26780 Microbubbles Enhanced Synthetic Phorbol Ester Degradation by Ozonolysis
Authors: D. Kuvshinov, A. Siswanto, W. Zimmerman
Abstract:
A phorbol-12-myristate-13-acetate (TPA) is a synthetic analogue of phorbol ester (PE), a natural toxic compound of Euphorbiaceae plants. The oil extracted from plants of this family is a useful source primarily for biofuel. However, this oil can also be used as a food source due to its significant nutritional content. The limitations on utilizing the oil as a food source are mainly due to the toxicity of PE. Nowadays, the majority of PE detoxification processes are expensive, as they involve a multi-step alcohol extraction sequence. Ozone is considered a strong oxidative agent. In its reaction with PE, it attacks the carbon double bonds of PE. This modification of the PE molecular structure results in a nontoxic ester with high lipid content. This report presents data on the development of a simple and inexpensive PE detoxification process using water as a buffer and ozone as the reactive component. The core of this new technique is the simultaneous application of a new microscale plasma unit for ozone production and a patented gas oscillation technology. In combination with the reactor design, the technology permits ozone injection into the water-TPA mixture in the form of microbubbles. The efficacy of a heterogeneous process depends on the diffusion coefficient, which can be controlled by contact time and interface area. The low velocity of rising microbubbles and their high surface-to-volume ratio allow fast mass transfer to be achieved during the process. Direct injection of ozone is the most efficient process for a highly reactive and short-lived chemical. Data on the plasma unit behavior are presented, and the influence of the gas oscillation technology on the microbubble production mechanism is discussed. Data on the overall process efficacy for TPA degradation are shown. Keywords: microbubble, ozonolysis, synthetic phorbol ester, chemical engineering
Procedia PDF Downloads 217
26779 Detecting of Crime Hot Spots for Crime Mapping
Authors: Somayeh Nezami
Abstract:
The management of the financial and human resources of the police in metropolitan areas requires extensive information and precise planning in order to reduce the crime rate and increase the safety of society. Geographical Information Systems play an important role in providing crime maps and their analysis. By using them to identify crime hot spots, along with spatial presentation of the results, it is possible to allocate optimum resources while presenting effective methods for decision making and preventive solutions. In this paper, we explain and compare some of the methods of hot spot analysis, such as Mode, Fuzzy Mode and Nearest Neighbour Hierarchical spatial clustering (NNH). Then the spots with the highest crime rates of drug smuggling are obtained for one province in Iran bordering Afghanistan. We show that, among these three methods, NNH leads to the best result. Keywords: GIS, Hot spots, nearest neighbor hierarchical spatial clustering, NNH, spatial analysis of crime
Procedia PDF Downloads 329
26778 Effects of Various Wavelet Transforms in Dynamic Analysis of Structures
Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar
Abstract:
Time history dynamic analysis of structures is considered an exact method, but it is computationally intensive. Filtration of earthquake strong ground motions by applying a wavelet transform is an approach towards reducing the computational effort, particularly in the optimization of structures against seismic effects. Wavelet transforms are categorized into continuous and discrete transforms. Since earthquake strong ground motion is a discrete function, the discrete wavelet transform is applied in the present paper. The wavelet transform reduces analysis time by filtering out non-effective frequencies of the strong ground motion. The filtration process may be repeated several times, although each repetition induces additional approximation error. In this paper, the strong ground motion of the earthquake has been filtered once with each wavelet. The strong ground motion of the Northridge earthquake is filtered applying various wavelets, and dynamic analysis of sampled shear and moment frames is implemented. The error for each wavelet is computed by comparing the dynamic response of the sampled structures with the exact responses. Exact responses are computed by dynamic analysis of the structures applying the non-filtered strong ground motion. Keywords: wavelet transform, computational error, computational duration, strong ground motion data
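The discrete wavelet filtration step described above can be sketched with PyWavelets; the record below is a synthetic placeholder for an accelerogram such as the Northridge motion, and the wavelet choice (db4) and single decomposition level are assumptions for the example.

```python
import numpy as np
import pywt  # PyWavelets

# Synthetic "strong ground motion" record (placeholder for a real accelerogram
# sampled at 50 Hz, e.g. the Northridge record).
dt = 0.02
t = np.arange(0.0, 40.0, dt)
accel = 0.3 * np.sin(2 * np.pi * 1.5 * t) + 0.05 * np.random.randn(t.size)

# One level of discrete wavelet decomposition with an assumed mother wavelet.
wavelet = "db4"
cA, cD = pywt.dwt(accel, wavelet)

# Filtration: discard the detail (high-frequency) coefficients and reconstruct.
filtered = pywt.idwt(cA, np.zeros_like(cD), wavelet)[: accel.size]

# A simple error measure between original and filtered records; the paper instead
# compares dynamic responses, which requires a structural model.
rms_error = np.sqrt(np.mean((accel - filtered) ** 2)) / np.sqrt(np.mean(accel ** 2))
print(f"Relative RMS difference after one filtration pass: {rms_error:.3f}")
```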
Procedia PDF Downloads 378
26777 Bidirectional Encoder Representations from Transformers Sentiment Analysis Applied to Three Presidential Pre-Candidates in Costa Rica
Authors: Félix David Suárez Bonilla
Abstract:
A sentiment analysis service to detect polarity (positive, neutral, and negative), based on transfer learning, was built using a Spanish version of BERT and applied to tweets written in Spanish. The dataset consisted of 11975 reviews, which were extracted from Google Play using the google-play-scraper package. The trained BETO model used the AdamW optimizer, a batch size of 16, a learning rate of 2x10⁻⁵ and 10 epochs. The system was tested using tweets of three presidential pre-candidates from Costa Rica. The system was finally validated using human-labeled examples, achieving an accuracy of 83.3%. Keywords: NLP, transfer learning, BERT, sentiment analysis, social media, opinion mining
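A minimal fine-tuning sketch following the hyper-parameters quoted in the abstract (AdamW, batch size 16, learning rate 2x10⁻⁵, 10 epochs) is given below; the checkpoint name, the three-class label encoding and the toy training texts are assumptions, not the paper's actual pipeline.

```python
import torch
from torch.optim import AdamW
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed BETO checkpoint; the abstract only says "a Spanish version of BERT (BETO)".
MODEL_NAME = "dccuchile/bert-base-spanish-wwm-cased"

texts = ["Me encanta esta aplicación", "No funciona nada bien", "Está bien"]
labels = [2, 0, 1]  # 0 = negative, 1 = neutral, 2 = positive (assumed encoding)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

# Hyper-parameters taken from the abstract: AdamW, batch size 16, lr 2e-5, 10 epochs.
optimizer = AdamW(model.parameters(), lr=2e-5)
enc = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"], torch.tensor(labels))
loader = DataLoader(dataset, batch_size=16, shuffle=True)

model.train()
for epoch in range(10):
    for input_ids, attention_mask, y in loader:
        optimizer.zero_grad()
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
        out.loss.backward()
        optimizer.step()

# Inference on a new tweet (polarity index under the assumed encoding).
model.eval()
with torch.no_grad():
    sample = tokenizer("El candidato dio un gran discurso", return_tensors="pt")
    print(model(**sample).logits.argmax(dim=-1).item())
```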
Procedia PDF Downloads 174
26776 Doing Cause-and-Effect Analysis Using an Innovative Chat-Based Focus Group Method
Authors: Timothy Whitehill
Abstract:
This paper presents an innovative chat-based focus group method for collecting qualitative data to construct a cause-and-effect analysis in business research. The method was developed in response to the research and data collection challenges posed by the Covid-19 outbreak in the United Kingdom during 2020-21. The paper discusses the methodological approaches and builds a contemporary argument for its effectiveness in exploring cause-and-effect relationships in the context of focus group research, systems thinking and problem structuring methods. The pilot for this method was conducted between October 2020 and March 2021 and collected more than 7,000 words of chat-based data, which were used to construct a consensus-drawn cause-and-effect analysis. This method was developed in support of an ongoing Doctorate in Business Administration (DBA) thesis, which is using Design Science Research methodology to operationalize organisational resilience in UK construction sector firms. Keywords: cause-and-effect analysis, focus group research, problem structuring methods, qualitative research, systems thinking
Procedia PDF Downloads 221
26775 A Multivariate Statistical Approach for Water Quality Assessment of River Hindon, India
Authors: Nida Rizvi, Deeksha Katyal, Varun Joshi
Abstract:
River Hindon is an important river catering to the demands of the highly populated rural and industrial clusters of western Uttar Pradesh, India. The water quality of river Hindon is deteriorating at an alarming rate due to various industrial, municipal and agricultural activities. The present study aimed at identifying the pollution sources and quantifying the degree to which these sources are responsible for the deteriorating water quality of the river. Various water quality parameters, like pH, temperature, electrical conductivity, total dissolved solids, total hardness, calcium, chloride, nitrate, sulphate, biological oxygen demand, chemical oxygen demand and total alkalinity, were assessed. Water quality data obtained from eight study sites over one year were subjected to two multivariate techniques, namely principal component analysis and cluster analysis. Principal component analysis was applied with the aim of finding out spatial variability and identifying the sources responsible for the water quality of the river. Three varifactors were obtained after varimax rotation of the initial principal components. Cluster analysis was carried out to classify sampling stations of certain similarity, which grouped the eight sites into two clusters. The study reveals that anthropogenic influence (municipal, industrial, waste water and agricultural runoff) was the major source of river water pollution. Thus, this study illustrates the utility of multivariate statistical techniques for the analysis and elucidation of multifaceted data sets, recognition of pollution sources/factors and understanding temporal/spatial variations in water quality for effective river water quality management. Keywords: cluster analysis, multivariate statistical techniques, river Hindon, water quality
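A small Python sketch of the PCA-plus-cluster-analysis workflow described above follows; the data matrix is randomly generated as a stand-in for the eight sites' measurements, and the varimax rotation used to obtain the study's three varifactors is not reproduced.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

# Hypothetical water-quality matrix: rows = 8 sampling sites, columns = parameters
# (pH, EC, TDS, BOD, COD, ...) averaged over the monitoring year.
rng = np.random.default_rng(0)
data = rng.normal(size=(8, 12))

# Standardise so parameters with large ranges (e.g. EC, TDS) do not dominate.
scaled = StandardScaler().fit_transform(data)

# Principal component analysis: keep the components explaining most variance
# (the study applies varimax rotation afterwards to obtain the three varifactors).
pca = PCA(n_components=3)
scores = pca.fit_transform(scaled)
print("Explained variance ratio:", pca.explained_variance_ratio_)

# Hierarchical cluster analysis groups sites of similar quality;
# the study obtains two clusters from the eight sites.
clusters = AgglomerativeClustering(n_clusters=2).fit_predict(scaled)
print("Site cluster labels:", clusters)
```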
Procedia PDF Downloads 467
26774 Dynamic and Thermal Characteristics of Three-Dimensional Turbulent Offset Jet
Authors: Ali Assoudi, Sabra Habli, Nejla Mahjoub Saïd, Philippe Bournot, Georges Le Palec
Abstract:
Studying the flow characteristics of a turbulent offset jet is an important topic among researchers across the world because of its various engineering applications. Some common examples include injection and carburetor systems, entrainment and mixing processes in gas turbine and boiler combustion chambers, thrust-augmenting ejectors for V/STOL aircraft and HVAC systems, environmental discharges, film cooling and many others. An offset jet is formed when a jet discharges into a medium above a horizontal solid wall parallel to the axis of the jet exit but offset from it by a certain distance. The structure of a turbulent offset jet can be described by three main regions. Close to the nozzle exit, an offset jet possesses characteristic features similar to those of free jets. Then, the entrainment of fluid between the jet, the offset wall and the bottom wall creates a low pressure zone, forcing the jet to deflect towards the wall and eventually attach to it at the impingement point. This is referred to as the Coanda effect. Further downstream, after the reattachment point, the offset jet has the characteristics of a wall jet flow. Therefore, the offset jet combines characteristics of free, impingement and wall jets, and it is relatively more complex than these types of flows. The present study examines the dynamic and thermal evolution of a 3D turbulent offset jet with different offset height ratios (the ratio of the distance from the jet exit to the impingement bottom wall to the jet nozzle diameter). To achieve this purpose, a numerical study was conducted to investigate a three-dimensional offset jet flow through the resolution of the governing Navier–Stokes equations by means of the finite volume method and the RSM second-order turbulence closure model. A detailed discussion is provided on the flow and thermal characteristics in the form of streamlines, mean velocity vectors, pressure fields and Reynolds stresses. Keywords: offset jet, offset ratio, numerical simulation, RSM
Procedia PDF Downloads 304
26773 Quantitative Analysis of the Quality of Housing and Land Use in the Built-up area of Croatian Coastal City of Zadar
Authors: Silvija Šiljeg, Ante Šiljeg, Branko Cavrić
Abstract:
Housing is considered a basic human need and an important component of the quality of life (QoL) in urban areas worldwide. In contemporary housing studies, the concept of the quality of housing (QoH) is considered a multi-dimensional and multi-disciplinary field. It emphasizes the connection between various aspects of the QoL, which can be measured by quantitative and qualitative indicators at different spatial levels (e.g. local, city, metropolitan, regional). The main goal of this paper is to examine the QoH and compare the results of quantitative analysis with the clutter land use categories derived for selected local communities in the Croatian coastal city of Zadar. The housing analysis, based on four housing indicators (out of a total of 24 QoL indicators), provided identification of the three Zadar local communities with the highest estimated QoH ranking. Furthermore, by using GIS overlay techniques, the QoH was merged with the urban environment analysis and the introduction of spatial metrics based on three categories: the element, the class and the environment as a whole. In terms of semantic-content analysis, the research also generated a set of indexes suitable for evaluation of the “housing state of affairs” and for future decision making aiming at improvement of the QoH in the selected local communities. Keywords: housing, quality, indicators, indexes, urban environment, GIS, element, class
Procedia PDF Downloads 410
26772 Analysis of the Suspension Rocker of Formula SAE Prototype by Finite Element Method
Authors: Jessyca A. Bessa, Darlan A. Barroso, Jonas P. Reges, Auzuir R. Alexandria
Abstract:
This work aims to study the rocker, a device in the suspension of a Formula SAE vehicle that receives forces from the rolling motion of the vehicle and transmits them to the chassis frame, reduced by a momentum ratio and smoothed by the spring-damper set. A review of the parameters used in vehicle dynamics and a geometric analysis of the forces and stresses they cause were carried out. The main function of the rocker is to reduce the force transmitted to the frame due to rolling movement and the subsequent actuation of the suspension. This function is taken as satisfactory, since the force applied to the wheel, which would be transmitted to the chassis, is reduced from 3833.9 N to 3496.48 N. From these values, more detailed simulations using the finite element method can be carried out, aimed at mass reduction or even at the feasibility of manufacturing the rocker in aluminum. Analysis by the finite element method was then applied. This analysis uses the theory of discretization of systems and examines the strength of the component based on the distortion energy, determining the maximum strain experienced by the component and the region of highest demand. Keywords: rocker, suspension, the finite element method, mechatronics engineering
Procedia PDF Downloads 541
26771 Spin Rate Decaying Law of Projectile with Hemispherical Head in Exterior Trajectory
Authors: Quan Wen, Tianxiao Chang, Shaolu Shi, Yushi Wang, Guangyu Wang
Abstract:
As part of the working environment of the fuze, the spin rate decaying law of a projectile in exterior trajectory is of great value in the design of rotation-count fixed-distance fuzes. In addition, it is significant in the field of devices for simulation tests of the fuze exterior ballistic environment, the flight stability and dispersion accuracy of gun projectiles, and the opening and scattering design of submunitions and illuminating cartridges. Besides, the self-destroying mechanism of the fuze in small-caliber projectiles often works by utilizing the attenuation of centrifugal force. In the theory of projectile aerodynamics and fuze design, there are many formulas describing the decay law of projectile angular velocity in exterior ballistics, such as the Roggla formula, exponential function formulas, and power function formulas. However, these formulas are mostly semi-empirical due to the poor test conditions and insufficient test data available at the time. They are difficult to meet the design requirements of modern fuzes because they are not accurate enough and have a narrow range of application. In order to provide more accurate ballistic environment parameters for the design of a hemispherical-head projectile fuze, the projectile's spin rate decaying law in exterior trajectory under the effect of air resistance was studied. In the analysis, the projectile shape was simplified as a hemispherical head, a cylindrical part, a rotating band part, and an anti-truncated conical tail. The main assumptions are as follows: a) the shape and mass are symmetrical about the longitudinal axis, b) there is a smooth transition between the ball head and the cylindrical part, c) the air flow on the outer surface is treated as a flat plate flow with the same area as the expanded outer surface of the projectile, and the boundary layer is turbulent, d) the polar damping moment attributed to the wrench hole and rifling marks on the projectile is not considered, e) the groove of the rifling on the rotating band is uniform, smooth and regular. The impacts of the four parts on the aerodynamic moment of the projectile rotation were obtained by aerodynamic theory. The surface friction stress of the projectile, the polar damping moment formed by the head of the projectile, and the surface friction moments formed by the cylindrical part, the rotating band, and the anti-truncated conical tail were obtained by mathematical derivation. After that, the mathematical model of spin rate attenuation was established. Over the whole trajectory with the maximum range angle (38°), the error between the polar damping torque coefficient obtained by simulation and the coefficient calculated by the mathematical model established in this paper is not more than 7%. Therefore, the credibility of the mathematical model was verified. The mathematical model can be described as a first-order nonlinear differential equation, which has no analytical solution. The solution can only be obtained numerically by coupling the model with the projectile mass motion equations of exterior ballistics. Keywords: ammunition engineering, fuze technology, spin rate, numerical simulation
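Since the abstract notes that the spin-decay model is a first-order nonlinear differential equation with no analytical solution, the following sketch shows how such an equation can be integrated numerically; the moment law, exponent and coefficients are illustrative assumptions and not the paper's derived model, which is coupled with the exterior-ballistic equations of motion.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative spin-decay model: Ip * domega/dt = -M_x(omega), where M_x is the
# polar damping moment. The coefficient and exponent below are assumptions.
Ip = 1.4e-3      # polar moment of inertia, kg*m^2 (assumed)
c = 2.0e-7       # lumped damping coefficient: air density, geometry, friction (assumed)
n = 1.8          # assumed exponent from a turbulent skin-friction scaling

def spin_rate(t, omega):
    return -c * np.sign(omega) * np.abs(omega) ** n / Ip

omega0 = 2.0 * np.pi * 300.0          # initial spin, rad/s (about 300 rev/s, assumed)
sol = solve_ivp(spin_rate, (0.0, 60.0), [omega0], dense_output=True, rtol=1e-8)

for t in (0.0, 15.0, 30.0, 60.0):
    print(f"t = {t:5.1f} s : spin rate = {sol.sol(t)[0] / (2 * np.pi):7.1f} rev/s")
```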
Procedia PDF Downloads 144
26770 Aerodynamic Analysis of Multiple Winglets for Aircrafts
Authors: S. Pooja Pragati, B. Sudarsan, S. Raj Kumar
Abstract:
This paper provides a practical design for a new concept of large induced drag reduction using streamwise staggered multiple winglets. It is designed to provide optimum performance compared with conventional winglet designs. In preparing the mechanical design, aspects such as shape and dimensions are analyzed to yield a substantial reduction in fuel consumption and increased performance. Owing to its simplicity of application and effectiveness, we believe that it will enable us to consider an enhanced version accounting for the grid effect of the staggered multiple winglets on the deflected mass flow of the wing system. The objective of the analysis was to compare the aerodynamic characteristics of two winglet configurations and to investigate the performance of the two winglet shapes simulated at selected cant angles of 0, 45 and 60 degrees. Keywords: multiple winglets, induced drag, aerodynamics analysis, low speed aircrafts
Procedia PDF Downloads 480
26769 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method
Authors: Angel G. De Leon Hernandez
Abstract:
A structure is defined as a physical system or, in certain cases, an arrangement of connected elements capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic and maintainable system, considering the constraints imposed in each case. With the advances in technology during the last decades, the capability of solving engineering problems has increased enormously. Nowadays computers play a critical role in structural analysis; unfortunately, for university students the vast majority of this software is inaccessible due to the high complexity and cost it represents, even when the software manufacturers offer student versions. This is exactly the reason behind the idea of developing a more accessible and easy-to-use computing tool. The program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument to achieve a better understanding of this area and to avoid tedious calculations. The program can also be useful for graduate engineers in the field of structural design and analysis. A graphical user interface is included in the program to make it even simpler to operate and to understand the information requested and the results obtained. The present document includes the theoretical basics on which the program relies to solve the structural analysis, the logical path followed in order to develop the program, the theoretical results, a discussion of the results and the validation of those results. Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming
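The direct stiffness method that underlies such a program can be sketched compactly; the following Python example assembles and solves a two-bar plane truss with assumed geometry, section and load (the program described in the abstract is a MATLAB tool with a graphical interface, not this sketch).

```python
import numpy as np

# Direct stiffness method for a minimal two-bar plane truss (illustrative values).
# Nodes: 0 at (0, 0), 1 at (4, 0), 2 at (4, 3); members 0-2 and 1-2; load at node 2.
E, A = 200e9, 5e-4                     # Young's modulus (Pa) and cross-section (m^2)
nodes = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0]])
members = [(0, 2), (1, 2)]

ndof = 2 * len(nodes)
K = np.zeros((ndof, ndof))

for i, j in members:
    dx, dy = nodes[j] - nodes[i]
    L = np.hypot(dx, dy)
    c, s = dx / L, dy / L
    # Global stiffness matrix of a bar element expressed with direction cosines.
    k = (E * A / L) * np.array([[ c*c,  c*s, -c*c, -c*s],
                                [ c*s,  s*s, -c*s, -s*s],
                                [-c*c, -c*s,  c*c,  c*s],
                                [-c*s, -s*s,  c*s,  s*s]])
    dofs = [2*i, 2*i + 1, 2*j, 2*j + 1]
    K[np.ix_(dofs, dofs)] += k

# Loads: 10 kN downward at node 2; supports: nodes 0 and 1 fully pinned.
F = np.zeros(ndof)
F[2*2 + 1] = -10e3
free = [2*2, 2*2 + 1]                  # free DOFs (node 2: ux, uy)

# Solve the reduced system for the unknown displacements.
u = np.zeros(ndof)
u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
print("Node 2 displacement (m):", u[free])
```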
Procedia PDF Downloads 122
26768 Prediction of Welding Induced Distortion in Thin Metal Plates Using Temperature Dependent Material Properties and FEA
Authors: Rehan Waheed, Abdul Shakoor
Abstract:
Distortion produced during the welding of thin metal plates is a problem in many industries. The purpose of this research was to study the distortion produced during welding of a 2 mm mild steel plate by simulating the welding process using finite element analysis. Simulation of the welding process requires a coupled-field transient analysis. First, a transient thermal analysis is performed, and the temperature field obtained from the thermal analysis is used as input to a structural analysis to find the distortion. An actual weld sample was prepared, and the weld distortion produced was measured. The simulated and actual results were in good agreement with each other, and it was found that there is pronounced deflection at the center of the plate. Temperature-dependent material properties play a significant role in the prediction of weld distortion. The results of this research can be used for the prediction and control of weld distortion in large steel structures by changing different weld parameters. Keywords: welding simulation, FEA, welding distortion, temperature dependent mechanical properties
Procedia PDF Downloads 390
26767 Fetal Ilium as a Tool for Sex Determination: Discriminant Functional Analysis
Authors: Luv Sharma
Abstract:
Sex determination has long been one of the most intriguing puzzles for forensic pathologists and anthropologists. Sexual dimorphism is well established in the adult pelvis, which is known to provide the highest level of information about sex. This study was conducted to determine whether this dimorphism also exists in fetal bones. A total of 34 pairs of fetal pelvic bones (22 males and 12 females), with ages ranging from 4 months to full term, were collected from unidentified dead fetuses brought to the Department of Forensic Medicine for routine medicolegal autopsies and studied for sexual dimorphism in the Department of Anatomy, Pt. BD Sharma PGIMS, Rohtak. The samples were divided into 2 age groups, and various metric parameters were recorded with the help of a digital vernier caliper. The data obtained were subjected to descriptive and discriminant functional analysis. The results of the descriptive and discriminant functional analysis showed that sex determination can be done with 100% accuracy by using different combinations of parameters of the fetal ilium. This study illustrates that sexual dimorphism exists from early fetal life after mid-pregnancy and can be clearly established by discriminant functional analysis. Keywords: Ilium, fetus, sex determination, morphometric
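A hedged sketch of a discriminant analysis of this kind is shown below using scikit-learn; the iliac measurements and the 1/0 sex coding are invented for the example and do not come from the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical iliac measurements (mm): columns could be ilium length, breadth,
# and sciatic notch depth; values are made up for the sketch.
X = np.array([
    [21.5, 18.2, 4.1], [22.0, 18.9, 4.3], [20.8, 17.5, 3.9], [23.1, 19.4, 4.5],
    [19.9, 17.0, 3.4], [20.2, 17.3, 3.5], [19.5, 16.8, 3.2], [20.6, 17.6, 3.6],
])
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])   # 1 = male, 0 = female (assumed coding)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# Discriminant function coefficients and cross-validated classification accuracy.
print("Coefficients:", lda.coef_, "intercept:", lda.intercept_)
print("Cross-validated accuracy:", cross_val_score(lda, X, y, cv=4).mean())
```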
Procedia PDF Downloads 59
26766 Frequent Itemset Mining Using Rough-Sets
Authors: Usman Qamar, Younus Javed
Abstract:
Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining. Frequent pattern mining is used to find inherent regularities in data, for example, which products are often purchased together. Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, Web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that, as the data increase, the amount of time and resources required to mine the data increases at an exponential rate. In this investigation, a new algorithm is proposed which can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processor algorithm which utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. FASTER for frequent itemset mining can produce a speed-up of 3.1 times when compared to the original algorithm while maintaining an accuracy of 71%. Keywords: rough-sets, classification, feature selection, entropy, outliers, frequent itemset mining
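The rough-set side of such a pre-processor can be illustrated with the dependency-degree (positive region) calculation below; the decision table is a toy example and this code is not the FASTER algorithm itself.

```python
import pandas as pd

# Toy decision table: condition attributes a, b, c and a decision attribute d
# (all values made up for the illustration).
table = pd.DataFrame({
    "a": [1, 1, 0, 0, 1, 0],
    "b": [0, 1, 1, 0, 0, 1],
    "c": [1, 1, 0, 0, 0, 1],
    "d": [1, 1, 0, 0, 1, 0],   # decision
})

def dependency(df, conditions, decision="d"):
    """Rough-set dependency degree: fraction of records whose condition-attribute
    equivalence class maps to a single decision value (the positive region)."""
    groups = df.groupby(list(conditions))[decision].nunique()
    consistent = groups[groups == 1].index
    pos = df.set_index(list(conditions)).loc[consistent]
    return len(pos) / len(df)

full = dependency(table, ["a", "b", "c"])
for attr in ["a", "b", "c"]:
    reduced = dependency(table, [x for x in ["a", "b", "c"] if x != attr])
    # Attributes whose removal lowers the dependency degree are kept in the reduct.
    print(f"significance of {attr}: {full - reduced:.2f}")
```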
Procedia PDF Downloads 437
26765 Urban Analysis of the Old City of Oran and Its Building after an Earthquake
Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir
Abstract:
The city of Oran, like any other region of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the earthquake of 1790, and then tries to deduce the differences between the old city before and after the earthquake. The analysis has as a specific objective to trace the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that the constructions present on the site of the old citadel may contain elements of resistance to seismic effects. Observations of these structures in the city showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces. Keywords: earthquake, citadel, performance, traditional techniques, constructions
Procedia PDF Downloads 305
26764 Swot Analysis for Employment of Graduates of Physical Education and Sport Sciences in Iran
Authors: Mohammad Reza Boroumand Devlagh
Abstract:
The employment problem, especially for university graduates, is one of the most important challenges of the decade ahead. The purpose of this study is a SWOT analysis for the employment of graduates of physical education and sport sciences in Iran. The sample of this research consists of 115 participants (35.5 ± 8.0 years) drawn from physical education and sport sciences faculty members of higher education institutions, major sport managers, and graduates of physical education and sport sciences. The library method, interviews and questionnaires were used to collect data. The questionnaire was made up of four parts, Strengths, Weaknesses, Opportunities and Threats, with a Cronbach's alpha coefficient of 0.94. After data collection, means, standard deviations (SD) and percentages were calculated using SPSS software. Friedman's test was used for the statistical analysis at P < 0.05. The results showed that the employment of graduates of physical education and sport sciences in Iran is located in the worst possible position (the T-W area) of the Strategic Position and Action Evaluation Matrix (SPACEM), and that there are more weaknesses than strengths (2.02 < 2.5) in the internal evaluation and more threats than opportunities (2.36 < 2.5) in the external evaluation. Keywords: employment, graduate, physical education and sport sciences, SWOT analysis
Procedia PDF Downloads 539
26763 The Comparison of Joint Simulation and Estimation Methods for the Geometallurgical Modeling
Authors: Farzaneh Khorram
Abstract:
This paper endeavors to construct a block model to assess grinding energy consumption (CCE) and pinpoint blocks with the highest potential for energy usage during the grinding process within a specified region. Leveraging geostatistical techniques, particularly joint estimation or simulation based on geometallurgical data from various mineral processing stages, our objective is to forecast CCE across the study area. The dataset encompasses variables obtained from 2754 drill samples and a block model comprising 4680 blocks. The initial analysis encompassed exploratory data examination, variography, multivariate analysis, and the delineation of geological and structural units. Subsequent analysis involved the assessment of contacts between these units and the estimation of CCE via cokriging, considering its correlation with SPI. The selection of blocks exhibiting maximum CCE holds paramount importance for cost estimation, production planning, and risk mitigation. The study conducted exploratory data analysis on lithology, rock type, and failure variables, revealing seamless boundaries between geometallurgical units. Simulation methods, such as plurigaussian and turning bands, demonstrated more realistic outcomes compared to cokriging, owing to the inherent characteristics of geometallurgical data and the limitations of kriging methods. Keywords: geometallurgy, multivariate analysis, plurigaussian, turning band method, cokriging
Procedia PDF Downloads 70
26762 Failure Cases Analysis in Petrochemical Industry
Authors: S. W. Liu, J. H. Lv, W. Z. Wang
Abstract:
In recent years, failure accidents in the petrochemical industry have been frequent and have posed serious safety risks to personnel and property. The improvement of petrochemical safety is strongly required in order to prevent the re-occurrence of severe accidents. This study focuses on surveying the failure cases that occurred in the petrochemical field, extracted from journals on engineering failure, including Engineering Failure Analysis and Case Studies in Engineering Failure Analysis. The relation between failure mode, failure mechanism, type of component, and type of material was analyzed in this study. The analytical results showed that failures occurred most frequently in vessels and piping among the petrochemical equipment. Moreover, equipment made of carbon steel and stainless steel accounts for the majority of failures compared to other materials. This may be related to the application of the equipment and the performance of the materials. In addition, corrosion failures were the most frequent among the failures of petrochemical equipment, and stress corrosion cracking accounts for a large proportion of them. This may have a lot to do with the service environment of the petrochemical equipment. Therefore, it can be concluded that the corrosion prevention of petrochemical equipment is particularly important. Keywords: cases analysis, corrosion, failure, petrochemical industry
Procedia PDF Downloads 307
26761 A Thorough Analysis on The Dialog Application Replika
Authors: Weeam Abdulrahman, Gawaher Al-Madwary, Fatima Al-Ammari, Razan Mohammad
Abstract:
This research discusses the AI features in Replika, a dialog application with customizable characters that provides the user with different ways of interacting and communicating with an AI. Distributing a survey with questions on how the AI worked was one approach to exposing the app to others to use; we also carried out an analysis that provides the conclusions of our research, as a result of which individuals will be able to try out the app. In the methodology, we explain each page that appears on the screen while using Replika and specify each part and icon. Keywords: Replika, AI, artificial intelligence, dialog app
Procedia PDF Downloads 177
26760 Behavioral Analysis of Stock Using Selective Indicators from Fundamental and Technical Analysis
Authors: Vish Putcha, Chandrasekhar Putcha, Siva Hari
Abstract:
In the current digital era of free trading and a pandemic-driven remote work culture, markets worldwide have gained momentum for retail investors to trade easily from anywhere. The number of retail traders rose to 24% of the market from 15% at the pre-pandemic level. Most of them are young retail traders with a high risk tolerance compared to the previous generation of retail traders. This trend boosted the growth of subscription-based market predictors and market data vendors. Young traders are betting on these predictors, assuming one of them is correct. However, 90% of retail traders are on the losing end. This paper presents multiple indicators and attempts to derive behavioral patterns from the underlying stocks. The two major types of indicators that traders and investors follow are technical and fundamental. The famous investor Warren Buffett adheres to the “Value Investing” method, which is based on a stock’s fundamental analysis. In this paper, we present multiple indicators from various methods to understand the behavior patterns of stocks. For this research, we picked five stocks with a market capitalization of more than $200M, listed on the exchange for more than 20 years, and from different industry sectors. To study the behavioral pattern over time for these five stocks, a total of 8 indicators were chosen from fundamental, technical, and financial indicators, such as Price to Earnings (P/E), Price to Book Value (P/B), Debt to Equity (D/E), Beta, Volatility, Relative Strength Index (RSI), Moving Averages and Dividend Yield, followed by detailed mathematical analysis. This is an interdisciplinary paper spanning the disciplines of engineering, accounting, and finance. The research takes a new approach to identify clear indicators affecting stocks. Statistical analysis of the data will be performed in terms of the probability distribution, which is then used to determine the probability of the stock price going over a specific target value. The Chi-square test will be used to determine the validity of the assumed distribution. Preliminary results indicate that this approach is working well. When the complete results are presented in the final paper, they will be beneficial to the community. Keywords: stock pattern, stock market analysis, stock predictions, trading, investing, fundamental analysis, technical analysis, quantitative trading, financial analysis, behavioral analysis
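Two of the technical indicators named above, moving averages and the RSI, can be computed with pandas as in the following sketch; the price series is synthetic, and the 50/200-day and 14-day windows are common conventions assumed for the example rather than the paper's exact settings.

```python
import numpy as np
import pandas as pd

# Synthetic daily closing prices standing in for one of the five stocks studied.
rng = np.random.default_rng(1)
close = pd.Series(100 + np.cumsum(rng.normal(0.05, 1.0, 500)))

# Simple moving averages over two common windows.
sma_50 = close.rolling(50).mean()
sma_200 = close.rolling(200).mean()

# 14-day Relative Strength Index (Wilder's smoothing approximated with an EMA).
delta = close.diff()
gain = delta.clip(lower=0).ewm(alpha=1 / 14, adjust=False).mean()
loss = (-delta.clip(upper=0)).ewm(alpha=1 / 14, adjust=False).mean()
rsi = 100 - 100 / (1 + gain / loss)

print("last close:", round(close.iloc[-1], 2),
      "SMA50:", round(sma_50.iloc[-1], 2),
      "SMA200:", round(sma_200.iloc[-1], 2),
      "RSI14:", round(rsi.iloc[-1], 1))
```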
Procedia PDF Downloads 85
26759 Quantitative Assessment of Soft Tissues by Statistical Analysis of Ultrasound Backscattered Signals
Authors: Da-Ming Huang, Ya-Ting Tsai, Shyh-Hau Wang
Abstract:
Ultrasound signals backscattered from soft tissues depend mainly on the size, density, distribution, and other elastic properties of the scatterers in the interrogated sample volume. The quantitative analysis of ultrasonic backscattering is frequently implemented using a statistical approach, because backscattered signals tend to behave as random variables. Thus, statistical analysis, such as Nakagami statistics, has been applied to characterize the density and distribution of the scatterers in a sample. Yet, the accuracy of the statistical analysis can be readily affected by the received signals, which are associated with the nature of the incident ultrasound wave and the acoustic properties of the samples. Thus, in the present study, efforts were made to explore the effects of the ultrasound operational mode and the attenuation of biological tissue on the estimation of the corresponding Nakagami statistical parameter (m parameter). In vitro measurements were performed on healthy and pathological fibrotic porcine livers using different single-element ultrasound transducers and duty cycles of the incident tone burst, ranging respectively from 3.5 to 7.5 MHz and from 10 to 50%. The results demonstrated that the estimated m parameter tends to be sensitively affected by the ultrasound operational mode as well as by the tissue attenuation. Healthy and pathological tissues may be characterized quantitatively by the m parameter under fixed measurement conditions and proper calibration. Keywords: ultrasound backscattering, statistical analysis, operational mode, attenuation
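A common way to estimate the Nakagami m parameter is the moment (inverse normalized variance) estimator, sketched below on a simulated envelope; the true m value and sample size are arbitrary, and the acquisition details of the study are not modelled.

```python
import numpy as np
from scipy import stats

# Simulated envelope of backscattered echoes: Nakagami-distributed with a known
# m (shape) parameter, standing in for demodulated RF data from the liver samples.
true_m = 0.8
rng = np.random.default_rng(2)
envelope = stats.nakagami.rvs(true_m, scale=1.0, size=20000, random_state=rng)

# Moment (inverse normalized variance) estimator of the Nakagami m parameter:
# m = E[R^2]^2 / Var(R^2).
r2 = envelope ** 2
m_moment = r2.mean() ** 2 / r2.var()

# Distribution fit with scipy for comparison (location fixed at 0).
m_fit, _, _ = stats.nakagami.fit(envelope, floc=0)

print(f"moment estimate m = {m_moment:.3f}, scipy fit m = {m_fit:.3f}")
```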
Procedia PDF Downloads 323
26758 Dynamic Analysis of Differential Systems with Infinite Memory and Damping
Authors: Kun-Peng Jin, Jin Liang, Ti-Jun Xiao
Abstract:
In this work, we are concerned with the dynamic behaviors of solutions to some coupled systems with infinite memory, which consist of two partial differential equations where only one partial differential equation has damping. Such coupled systems are good mathematical models to describe the deformation and stress characteristics of some viscoelastic materials affected by temperature change, external forces, and other factors. By using the theory of operator semigroups, we give well-posedness results for the Cauchy problem for these coupled systems. Then, with the help of some auxiliary functions and lemmas, which are specially designed to overcome difficulties in the proof, we show that the solutions of the coupled systems decay to zero in a strong sense under a few basic conditions. The results of this dynamic analysis of coupled systems generalize many existing results. Keywords: dynamic analysis, coupled system, infinite memory, damping
Procedia PDF Downloads 221
26757 Multi-Stage Optimization of Local Environmental Quality by Comprehensive Computer Simulated Person as Sensor for Air Conditioning Control
Authors: Sung-Jun Yoo, Kazuhide Ito
Abstract:
In this study, a comprehensive computer simulated person (CSP) that integrates a computational human model (virtual manikin) and a respiratory tract model (virtual airway) was applied for the estimation of indoor environmental quality. Moreover, an inclusive prediction method was established by integrating computational fluid dynamics (CFD) analysis with the advanced CSP, which is combined with a physiologically-based pharmacokinetic (PBPK) model and an unsteady thermoregulation model, for analysis targeting the micro-climate around the human body and the respiratory region with high accuracy. This comprehensive method can estimate not only contaminant inhalation but also the constant interaction in contaminant transfer between the indoor space, i.e., the target area for indoor air quality (IAQ) assessment, and the respiratory zone for health risk assessment. This study focused on the usage of the CSP as an air/thermal quality sensor indoors, which means the application of the comprehensive model for the assessment of IAQ and thermal environmental quality. A demonstrative analysis was performed in order to examine the applicability of the comprehensive model to a heating, ventilation and air conditioning (HVAC) control scheme. The CSP was located at the center of a simple model room with dimensions of 3 m × 3 m × 3 m. Formaldehyde, generated from the floor material, was assumed to be the target contaminant, and flow field, sensible/latent heat and contaminant transfer analyses in the indoor space were conducted by using CFD simulation coupled with the CSP. In this analysis, thermal comfort was evaluated by thermoregulatory analysis, and respiratory exposure risks, represented by the adsorption flux/concentration at the airway wall surface, were estimated by PBPK-CFD hybrid analysis. These analysis results concerning IAQ and thermal comfort will be fed back to the HVAC control and could be used to find a suitable ventilation rate and energy requirement for the air conditioning system. Keywords: CFD simulation, computer simulated person, HVAC control, indoor environmental quality
Procedia PDF Downloads 361
26756 An Efficient Process Analysis and Control Method for Tire Mixing Operation
Authors: Hwang Ho Kim, Do Gyun Kim, Jin Young Choi, Sang Chul Park
Abstract:
Since the tire production process is very complicated, company-wide management of it is very difficult, necessitating considerable amounts of capital and labor. Thus, productivity should be enhanced and kept competitive by developing and applying effective production plans. Among the major processes of tire manufacturing, consisting of mixing (component preparation), building and curing, the mixing process is an essential and important step because the main component of the tire, called the compound, is formed at this step. The compound, a rubber mixture with various characteristics, plays its own role in the tire as a finished product. Meanwhile, scheduling the tire mixing process is similar to the flexible job shop scheduling problem (FJSSP), because various kinds of compounds have their own orders of operations, and a set of alternative machines can be used to process each operation. In addition, the setup time required for different operations may differ due to the alteration of additives. In other words, each operation of the mixing process requires a different setup time depending on the previous one, and this kind of feature, called sequence dependent setup time (SDST), is a very important issue in traditional scheduling problems such as flexible job shop scheduling problems. However, despite its importance, there exist few research works dealing with the tire mixing process. Thus, in this paper, we consider the scheduling problem for the tire mixing process and suggest an efficient particle swarm optimization (PSO) algorithm to minimize the makespan for completing all the required jobs belonging to the process. Specifically, we design a particle encoding scheme for the considered scheduling problem, including a processing sequence for compounds and machine allocation information for each job operation, and a method for generating a tire mixing schedule from a given particle. At each iteration, the coordinates and velocities of the particles are updated, and the current solution is compared with the new solution. This procedure is repeated until a stopping condition is satisfied. The performance of the proposed algorithm is validated through a numerical experiment using some small-sized problem instances representing the tire mixing process. Furthermore, we compare the solution of the proposed algorithm with that obtained by solving a mixed integer linear programming (MILP) model developed in previous research work. As a performance measure, we define an error rate which can evaluate the difference between two solutions. As a result, we show that the PSO algorithm proposed in this paper outperforms the MILP model with respect to effectiveness and efficiency. As a direction for future work, we plan to consider scheduling problems in other processes, such as building and curing. We can also extend our current work by considering other performance measures, such as weighted makespan or processing times affected by aging or learning effects. Keywords: compound, error rate, flexible job shop scheduling problem, makespan, particle encoding scheme, particle swarm optimization, sequence dependent setup time, tire mixing process
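The particle position/velocity update loop referred to in the abstract can be sketched generically as below; the inertia and acceleration coefficients are assumed values, and a simple continuous objective stands in for the paper's decoding of a particle into an operation sequence, machine assignment and makespan.

```python
import numpy as np

rng = np.random.default_rng(3)

def objective(x):                     # placeholder for "decode particle -> makespan"
    return np.sum(x ** 2, axis=1)

n_particles, dim, iters = 30, 10, 200
w, c1, c2 = 0.7, 1.5, 1.5             # inertia and acceleration coefficients (assumed)

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Standard PSO velocity and position update.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    # Update personal bests and the global best.
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("Best objective found:", pbest_val.min())
```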
Procedia PDF Downloads 265
26755 Multi-Source Data Fusion for Urban Comprehensive Management
Authors: Bolin Hua
Abstract:
In city governance, various data are involved, including city component data, demographic data, housing data and all kinds of business data. These data reflect different aspects of people, events and activities. Data generated by various systems differ in form, and their sources differ because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues which need to be resolved. Problems of data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data and space data. Metadata needs to be referred to and read when an application needs to access, manipulate and display the data. Uniform metadata management ensures the effectiveness and consistency of data in the process of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search and data delivery. Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data
Procedia PDF Downloads 393
26754 The Impact of Cognitive Load on Deceit Detection and Memory Recall in Children’s Interviews: A Meta-Analysis
Authors: Sevilay Çankaya
Abstract:
The detection of deception in children's interviews is essential for establishing statement veracity. A widely used method for deception detection is building cognitive load, which is the logic of the cognitive interview (CI), and its effectiveness for adults is well established. This meta-analysis delves into the effectiveness of inducing cognitive load as a means of enhancing veracity detection during interviews with children. Additionally, the effect of cognitive load on the total number of events children recall is assessed as a second part of the analysis. The current meta-analysis includes ten effect sizes obtained from database searches. For the effect size calculation, Hedges' g was used with a random-effects model in CMA version 2. Heterogeneity analysis was conducted to detect potential moderators. The overall result indicated that cognitive load had no significant effect on veracity outcomes (g = 0.052, 95% CI [-.006, 1.25]). However, a high level of heterogeneity was found (I² = 92%). Age, participants' characteristics, interview setting, and characteristics of the interviewer were coded as possible moderators to explain the variance. Age was a significant moderator (β = .021; p = .03, R² = 75%), but the analysis did not reveal statistically significant effects for the other potential moderators: participants' characteristics (Q = 0.106, df = 1, p = .744), interview setting (Q = 2.04, df = 1, p = .154), and characteristics of the interviewer (Q = 2.96, df = 1, p = .086). For the second outcome, the total number of events recalled, the overall effect was significant (g = 4.121, 95% CI [2.256, 5.985]); cognitive load was effective for the total number of events recalled when interviewing children. All in all, while age plays a crucial role in determining the impact of cognitive load on veracity, the surrounding context, interviewer attributes, and inherent participant traits may not significantly alter the relationship. These findings highlight the need for more focused, age-specific methods when using cognitive load measures. It may be possible to improve the precision and dependability of deceit detection in children's interviews with the help of more studies in this field. Keywords: deceit detection, cognitive load, memory recall, children interviews, meta-analysis
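The random-effects pooling and heterogeneity statistics mentioned above (Hedges' g, I²) can be reproduced in outline with a DerSimonian-Laird calculation; the per-study effect sizes and variances below are hypothetical and are not the ten effect sizes analysed in the study.

```python
import numpy as np

# Hypothetical per-study effect sizes (Hedges' g) and variances; the real analysis
# pools ten effect sizes with CMA v2 under a random-effects model.
g = np.array([0.10, -0.05, 0.30, 0.02, -0.12, 0.45, 0.08, -0.20, 0.15, 0.05])
v = np.array([0.04, 0.05, 0.03, 0.06, 0.04, 0.05, 0.03, 0.06, 0.04, 0.05])

# Fixed-effect weights and Q statistic for heterogeneity.
w_fixed = 1.0 / v
mean_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
q = np.sum(w_fixed * (g - mean_fixed) ** 2)
df = len(g) - 1
i2 = max(0.0, (q - df) / q) * 100          # I^2 as a percentage

# DerSimonian-Laird between-study variance and random-effects pooled estimate.
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)
w_re = 1.0 / (v + tau2)
g_pooled = np.sum(w_re * g) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))

print(f"pooled g = {g_pooled:.3f}, "
      f"95% CI = [{g_pooled - 1.96 * se:.3f}, {g_pooled + 1.96 * se:.3f}], "
      f"I^2 = {i2:.0f}%")
```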
Procedia PDF Downloads 55