Search results for: bare machine computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3906

666 The Methodology of Hand-Gesture Based Form Design in Digital Modeling

Authors: Sanghoon Shim, Jaehwan Jung, Sung-Ah Kim

Abstract:

As digital technology develops, studies on the TUI (Tangible User Interface), which links the physical environment to the virtual environment through the computer by way of the human senses, are being actively conducted. In addition, computer-aided design techniques have advanced tremendously, enabling optimized decision-making through machine learning and the parallel comparison of alternatives. A complex design that responds to user requirements or performance targets can emerge from the designer's intuition, but it is difficult to actualize such an emergent design through the designer's ability alone. Ancillary tools such as Gaudí's Sandbag can reinforce and evolve ideas that emerge from designers. With the advent of many commercial tools that support 3D objects, designers' intentions are easily reflected in their designs, but the degree of that reflection depends on proficiency with the design tools. This study builds an environment in which form can be shaped with the designer's own fingers, the most basic instrument, in the initial phase of designing complex building types. A Leap Motion sensor recognizes the designer's hand motions, which are converted into digital information to realize an environment that can be linked in real time in virtual reality (VR). The implemented design can also be linked in real time with Rhino™, a 3D authoring tool, and its plug-in Grasshopper™. As a result, it is possible to design in a tactile, intuitive way using the TUI, which can serve as a tool for assisting designer intuition.

Keywords: design environment, digital modeling, hand gesture, TUI, virtual reality

Procedia PDF Downloads 368
665 The Research of the Relationship between Triathlon Competition Results with Physical Fitness Performance

Authors: Chen Chan Wei

Abstract:

The purpose of this study was to investigate the impact of a 1500m swim, a 10000m run, VO2 max, and body fat on Olympic-distance triathlon competition performance. The subjects were thirteen college triathletes with endurance training, with an average age, height, and weight of 20.61±1.04 years (mean ± SD), 171.76±8.54 cm, and 65.32±8.14 kg, respectively. All subjects completed tests of the 1500m swim, 10000m run, VO2 max, and body fat, and participated in an Olympic-distance triathlon competition. First, the 1500m swim test was taken in a standardized 50m pool with a depth of 2m, and the 10000m run test on a standardized 400m track. After three days, VO2 max was tested with the MetaMax 3B and body fat was measured with a DEXA machine. After two weeks, all 13 subjects competed in the Olympic-distance triathlon at the 2016 New Taipei City Asian Cup. The relationships between the 1500m swim, 10000m run, VO2 max, and body fat tests and Olympic-distance triathlon performance were evaluated using Pearson's product-moment correlation. The results show that the 10000m run and body fat had significant positive correlations with Olympic-distance triathlon performance (r=.830 and .768), whereas VO2 max had a significant negative correlation (r=-.735). In conclusion, to improve non-drafting Olympic-distance triathlon performance, triathletes should prioritize running over swimming training, and VO2 max can be measured to predict triathlon performance. Managing body fat can also improve Olympic-distance triathlon performance. In addition, swimming performance was not significantly correlated with overall performance, possibly because the 2016 New Taipei City Asian Cup age-group race was a non-drafting competition and the swim is the shortest component of an Olympic-distance triathlon; in a non-drafting race, therefore, swimming ability is not significantly correlated with overall performance.
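
The correlations reported above are standard Pearson product-moment coefficients. As a minimal sketch (with hypothetical numbers, not the study's data), the statistic can be computed directly:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical values: 10000m run times (min) vs. triathlon finish times (min)
run_10k = [36.5, 38.2, 35.9, 40.1, 37.4]
tri_time = [124.0, 131.5, 122.8, 138.2, 127.9]
r = pearson_r(run_10k, tri_time)   # strong positive correlation expected
```

A positive r here mirrors the study's finding: slower run times go with slower overall finishes.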

Keywords: triathletes, olympic, non-drafting, correlation

Procedia PDF Downloads 250
664 Moderation in Temperature Dependence on Counter Frictional Coefficient and Prevention of Wear of C/C Composites by Synthesizing SiC around Surface and Internal Vacancies

Authors: Noboru Wakamoto, Kiyotaka Obunai, Kazuya Okubo, Toru Fujii

Abstract:

The aim of this study is to moderate the temperature dependence of the frictional coefficient between counter surfaces and to reduce the wear of C/C composites at low temperature. To modify the C/C composites, silica (SiO2) powder was added into the phenolic resin used as the carbon precursor. The preform plate of the C/C composite precursor was prepared by the conventional filament winding method, and C/C composite plates were obtained by carbonizing the preform plate at 2200 °C under an argon atmosphere. During carbonization, silicon carbide (SiC) was synthesized around the surfaces and internal vacancies of the C/C composites. The frictional coefficient on the counter surfaces and the specific wear volumes of the C/C composites were measured with a pin-on-disk type friction test machine developed by the authors. XRD indicated that SiC was synthesized in the body of the C/C composite fabricated by the current method. The friction tests showed that the coefficient of friction of the unmodified C/C composites depended on temperature as the test condition was changed. In contrast, the frictional coefficient of the C/C composite modified with SiO2 powder remained almost constant at about 0.27 as the temperature was changed from room temperature (RT) to 300 °C. The specific wear rate decreased from 25×10-6 mm2/N to 0.1×10-6 mm2/N. Observation of the surfaces after the friction tests showed that the frictional surface of the modified C/C composites was covered with a film produced by the friction. This study found that synthesizing SiC around the surfaces and internal vacancies of C/C composites is effective in moderating the temperature dependence of the frictional coefficient and reducing the wear of C/C composites.

Keywords: C/C composites, friction coefficient, wear, SiC

Procedia PDF Downloads 345
663 TimeTune: Personalized Study Plans Generation with Google Calendar Integration

Authors: Chevon Fernando, Banuka Athuraliya

Abstract:

The purpose of this research is to provide a solution to students' time management, which often becomes an issue because students must balance study with personal commitments. "TimeTune," an AI-based study planner that arranges study timeframes by combining modern machine learning algorithms with calendar applications, is presented as a solution. The research focuses on the development of LSTM models that connect to the Google Calendar API to generate learning paths fitted to an individual student's daily life and study history. A key finding of this research is the success in building an LSTM model to predict optimal study times which, integrated with real-time data from Google Calendar, generates personalized, customized timetables automatically. The methodology encompasses Agile development practices and Object-Oriented Analysis and Design (OOAD) principles, focusing on user-centric design and iterative development. By adopting this method, students can significantly reduce the stress associated with poor study habits and time management. In conclusion, "TimeTune" represents an advanced step in personalized education technology: its application of ML algorithms and calendar integration offers students a more balanced academic and personal life and the stress reduction that comes with better-managed studies.
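
As an illustration of the calendar-integration step described above, a predicted study slot can be turned into an event body in the shape expected by the Google Calendar API's events.insert endpoint (summary plus RFC 3339 start/end times). The subject, times, and time zone below are hypothetical stand-ins for the LSTM model's output, not details from the paper:

```python
from datetime import datetime, timedelta

def study_slot_to_event(subject, start, minutes, tz="Asia/Colombo"):
    """Build a Google Calendar-style event body from a predicted study slot."""
    end = start + timedelta(minutes=minutes)
    return {
        "summary": f"Study: {subject}",
        "start": {"dateTime": start.isoformat(), "timeZone": tz},
        "end": {"dateTime": end.isoformat(), "timeZone": tz},
    }

# A hypothetical predicted slot: 90 minutes of mathematics in the evening
slot = study_slot_to_event("Mathematics", datetime(2024, 5, 6, 18, 0), 90)
```

The resulting dictionary could then be passed to the Calendar API client to place the session on the student's calendar.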

Keywords: personalized learning, study planner, time management, calendar integration

Procedia PDF Downloads 49
662 Predictive Analytics Algorithms: Mitigating Elementary School Drop Out Rates

Authors: Bongs Lainjo

Abstract:

Educational institutions and authorities mandated to run education systems in various countries need to implement curricula that take into account the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and the ability to replicate predictive models applied globally to selected elementary schools. The study was carried out by comparing classical case studies in Africa, North America, South America, Asia, and Europe. Among the reasons put forward for children dropping out is the notion that one can succeed in life without going through the education process. Such a mentality is compounded by a demanding curriculum that does not accommodate all students. The result is poor school attendance - truancy - which continually leads to dropouts. In this study, the focus is on developing a model that school administrations can systematically implement to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can still be changed so that they focus on the better future their parents desire. To deal effectively with the elementary school dropout problem, the strategies in place need to be studied, and predictive models installed in every educational system with a view to preventing an imminent dropout just before it happens. In the competency-based curricula that most advanced nations are trying to implement, education systems adopt a holistic view of learning that reduces the dropout rate.
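
A minimal sketch of the kind of classifier such a model could use, assuming hypothetical risk factors and labels (not data from the study), is logistic regression fitted by gradient descent; pupils whose predicted dropout probability crosses a threshold would be flagged for intervention:

```python
import numpy as np

# Hypothetical per-pupil risk factors: [absence rate, failed subjects, km to school]
X = np.array([[0.05, 0, 1.0], [0.30, 2, 6.0], [0.10, 1, 2.0],
              [0.45, 3, 8.0], [0.02, 0, 0.5], [0.35, 2, 7.0]])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = dropped out (illustrative labels)

# Standardize features, then fit logistic regression by gradient descent
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))   # predicted dropout probability
    w -= 0.1 * Xs.T @ (p - y) / len(y)        # log-loss gradient step
    b -= 0.1 * np.mean(p - y)

risk = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
flagged = risk > 0.5                           # pupils needing early intervention
```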

Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum

Procedia PDF Downloads 175
661 Numerical Simulation of the Flowing of Ice Slurry in Seawater Pipe of Polar Ships

Authors: Li Xu, Huanbao Jiang, Zhenfei Huang, Lailai Zhang

Abstract:

In recent years, with global warming, the sea-ice extent of the North Arctic has evidently decreased, and the Arctic channel has attracted the attention of the shipping industry. Ice crystals in the seawater of the Arctic channel, entering the ship's seawater system with the seawater, have been found to block the seawater pipes. In serious cases, cooler failure, auxiliary machine faults, and even paralysis of the ship's power system may occur. To reduce the effect of high temperature on auxiliary equipment, the seawater system draws external ice-water into the cooling cycle, so the distribution of ice crystals in the seawater pipe must be understood. Since ice slurry is a solid-liquid two-phase system, the flow of the ice-water mixture is complex and diverse. In this paper, the flow of ice slurry in a seawater pipe is simulated with fluid dynamics simulation software based on the k-ε turbulence model. As the ice packing fraction is a key factor affecting the distribution of ice crystals, its influence on the flow of the ice slurry is analyzed. The simulation results show that when the ice packing fraction is relatively large, the distribution of ice crystals in the flowing seawater is uneven, which increases the possibility of blockage. This provides a scientific forecasting method for the formation of ice blockage in seawater piping systems and is significant for the operational reliability of polar ships in the future.

Keywords: ice slurry, seawater pipe, ice packing fraction, numerical simulation

Procedia PDF Downloads 367
660 Electricity Price Forecasting: A Comparative Analysis with Shallow-ANN and DNN

Authors: Fazıl Gökgöz, Fahrettin Filiz

Abstract:

Electricity prices have sophisticated features, such as high volatility, nonlinearity, and high frequency, that make forecasting quite difficult. Yet electricity price has a volatile but non-random character, so it is possible to identify patterns in the historical data. Intelligent decision-making requires accurate price forecasting for market traders, retailers, and generation companies. So far, many shallow-ANN (artificial neural network) models have been published in the literature and have shown adequate forecasting results. In recent years, neural networks with many hidden layers, referred to as DNN (deep neural networks), have gained traction in the machine learning community. The goal of this study is to investigate the electricity price forecasting performance of shallow-ANN and DNN models for the Turkish day-ahead electricity market. The forecasting accuracy of the models is evaluated with publicly available data from that market. Historical load, price, and weather temperature data are used as the input variables for the models. The data set includes power consumption measurements gathered between January 2016 and December 2017 with one-hour resolution. Forecasting studies were carried out comparatively with shallow-ANN and DNN models for the Turkish electricity market over this period. The main contribution of this study is the comparison of different shallow-ANN and DNN models for electricity price forecasting. All models are compared by their MAE (Mean Absolute Error) and MSE (Mean Squared Error) results. The DNN models give better forecasting performance than the shallow-ANNs; the five best MAE results for the DNN models are 0.346, 0.372, 0.392, 0.402, and 0.409.
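
The two error measures used to rank the models can be computed directly; the prices below are hypothetical values, not the Turkish market data:

```python
import numpy as np

def mae(actual, forecast):
    """Mean Absolute Error between actual and forecast series."""
    return float(np.mean(np.abs(np.asarray(actual) - np.asarray(forecast))))

def mse(actual, forecast):
    """Mean Squared Error between actual and forecast series."""
    return float(np.mean((np.asarray(actual) - np.asarray(forecast)) ** 2))

# Hypothetical hourly day-ahead prices vs. a model's forecast
actual   = [42.0, 45.5, 50.2, 47.8]
forecast = [41.6, 46.0, 49.9, 48.3]
err_mae, err_mse = mae(actual, forecast), mse(actual, forecast)
```

A lower MAE/MSE pair marks the better forecaster, which is how the shallow-ANN and DNN variants are compared in the study.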

Keywords: deep learning, artificial neural networks, energy price forecasting, turkey

Procedia PDF Downloads 294
659 Vibration Transmission across Junctions of Walls and Floors in an Apartment Building: An Experimental Investigation

Authors: Hugo Sampaio Libero, Max de Castro Magalhaes

Abstract:

The perception of sound radiated from a building floor is greatly influenced by the rooms in which it is immersed and by the positions of both listener and source. The main question that remains unanswered concerns the influence of the source position on the sound power radiated by a complex wall-floor system in buildings. This research investigates vibration transmission across walls and floors in buildings, primarily through the determination of the vibration reduction index via experimental tests. Knowledge of this parameter may help in predicting noise and vibration propagation in building components. First, the physical mechanisms involved in vibration transmission across structural junctions are described, and an experimental setup is assembled to aid this investigation. The experimental tests have shown that vibration generation in the walls and floors is directly related to their size and boundary conditions. It is also shown that the vibration source position can affect the overall vibration spectrum significantly. Second, the characteristics of the noise spectra inside the rooms due to an impact source (tapping machine) are presented. Conclusions are drawn for the general trends of the vibration and noise spectra of the structural components and rooms, respectively. In summary, the aim of this paper is to investigate the vibro-acoustic behavior of building floors and walls under floor impact excitation, with the impact applied at distinct positions on the slab. The analysis highlights the main physical characteristics of the vibration transmission mechanism.

Keywords: vibration transmission, vibration reduction index, impact excitation, experimental tests

Procedia PDF Downloads 93
658 Time Series Simulation by Conditional Generative Adversarial Net

Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto

Abstract:

Generative Adversarial Nets (GAN) have proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use the Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN is able to learn different types of normal and heavy-tailed distributions, as well as the dependence structures of different time series, and to generate conditional predictive distributions consistent with the training data distributions. We also provide an in-depth discussion of the rationale behind GAN and of neural networks as hierarchical splines, to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of market risk factors. We present a real-data analysis, including a backtest, to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis for calculating VaR. CGAN can also be applied to economic time series modeling and forecasting; in this regard, we include an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
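
For reference, the Historical Simulation baseline that CGAN is compared against reduces to an empirical quantile of past P&L; a sketch with simulated heavy-tailed data (not the paper's portfolio) is:

```python
import numpy as np

def historical_var(pnl, alpha=0.99):
    """Value-at-Risk by Historical Simulation: the loss level exceeded with
    probability 1 - alpha under the empirical P&L distribution."""
    return -np.percentile(pnl, 100 * (1 - alpha))

rng = np.random.default_rng(0)
pnl = rng.standard_t(df=4, size=10_000)  # heavy-tailed daily P&L (illustrative)
var99 = historical_var(pnl, alpha=0.99)
var95 = historical_var(pnl, alpha=0.95)
```

A generative model such as CGAN replaces the raw historical sample with simulated scenarios before the same quantile is taken.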

Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series

Procedia PDF Downloads 144
657 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules

Authors: Mohsen Maraoui

Abstract:

With the aim of improving natural language processing applications, such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing of multilingual text documents based on a conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weights, where these concepts represent the content of the document. Our contribution consists of two steps. In the first step, we extract index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through the conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model, in an attempt to discover the non-taxonomic, contextual relations between the concepts of a document. These latent relations are buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. The proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus; the same statistical process is then applied regardless of the language to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.

Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing

Procedia PDF Downloads 141
656 Effect of Nitrogen-Based Cryotherapy on the Calf Muscle Spasticity in Stroke Patients

Authors: Engi E. I. Sarhan, Usama M. Rashad, Ibrahim M. I. Hamoda, Mohammed K. Mohamed

Abstract:

Background: This study aimed to determine the effect of nitrogen-based cryotherapy on calf muscle spasticity in stroke patients. Patients were selected from the outpatient neurology clinic of Al-Mansoura General Hospital, Al-Mansoura University. Subjects and methods: Thirty stroke patients of both sexes, aged 45 to 60 years, were divided randomly into two equal groups: a study group (A) that received nitrogen-based cryotherapy, a selective physical therapy program, and an ankle-foot orthosis (AFO), and a control group (B) that received the same program and AFO only. Treatment was given three times per week for four weeks in both groups. Calf muscle spasticity was assessed before and after treatment, subjectively using the Modified Ashworth Scale (MAS) and objectively by measuring the H/M ratio on an electromyography machine. Ankle dorsiflexion range of motion (ROM) was also assessed objectively using two-dimensional (2D) motion analysis. Results: After treatment, there was a highly significant improvement in the study group compared to the control group in MAS score, no significant difference between the groups in H/M ratio, and a highly significant improvement in the study group in the 2D motion analysis findings. Conclusion: This modality is considered effective in reducing spasticity of the calf muscle and improving ankle dorsiflexion of the affected limb.

Keywords: ankle foot orthosis, nitrogen-based cryotherapy, stroke, spasticity

Procedia PDF Downloads 202
655 Defining Death and Dying in Relation to Information Technology and Advances in Biomedicine

Authors: Evangelos Koumparoudis

Abstract:

The definition of death is a deep philosophical question, and no single meaning can be ascribed to it. This essay focuses on the ontological, epistemological, and ethical aspects of death and dying in view of technological progress in information technology and biomedicine. It starts with the ad hoc 1968 Harvard committee that proposed that the criterion for the definition of death be irreversible coma and then refers to the debate over the whole brain death formula, emphasizing the integrated function of the organism and higher brain formula, taking consciousness and personality as essential human characteristics. It follows with the contribution of information technology in personalized and precision medicine and anti-aging measures aimed at life prolongation. It also touches on the possibility of the creation of human-machine hybrids and how this raises ontological and ethical issues that concern the “cyborgization” of human beings and the conception of the organism and personhood based on a post/transhumanist essence, and, furthermore, if sentient AI capable of autonomous decision-making that might even surpass human intelligence (singularity, superintelligence) deserves moral or legal personhood. Finally, there is the question as to whether death and dying should be redefined at a transcendent level, which is reinforced by already-existing technologies of “virtual after-” life and the possibility of uploading human minds. In the last section, I refer to the current (and future) applications of nanomedicine in diagnostics, therapeutics, implants, and tissue engineering as well as the aspiration to “immortality” by cryonics. The definition of death is reformulated since age and disease elimination may be realized, and the criterion of irreversibility may be challenged.

Keywords: death, posthumanism, infomedicine, nanomedicine, cryonics

Procedia PDF Downloads 73
654 Effect of Fiber Orientation on the Mechanical Properties of Fabricated Plate Using Basalt Fiber

Authors: Sharmili Routray, Kishor Chandra Biswal

Abstract:

The use of corrosion-resistant fiber-reinforced polymer (FRP) reinforcement is beneficial in structures, particularly those exposed to deicing salts and/or located in highly corrosive environments. Glass, carbon, and aramid fibers are generally used for strengthening structures. Given the need for low-weight, high-strength materials, a suitable low-cost substitute is sought. Recent developments in fiber production technology allow structures to be strengthened using basalt fiber, which is made from basalt rock. Basalt fiber has a good range of thermal performance, high tensile strength, resistance to acids, good electromagnetic properties, an inert nature, and resistance to corrosion, radiation, UV light, vibration, and impact loading. This investigation focuses on the effect of fiber content and fiber orientation of basalt fiber on the mechanical properties of the fabricated composites. Specimens were prepared with unidirectional basalt fabric as the reinforcing material and epoxy resin as the matrix of the polymer composite. Different fiber orientations were considered, and fabrication was done by the hand lay-up process. The variation of the properties with an increasing number of fiber plies in the composites was also studied. Specimens were subjected to tensile testing, and the failure of the composites was examined with an INSTRON universal testing machine (SATEC) of 600 kN capacity. The average tensile strength and modulus of elasticity of the BFRP plates were determined from the test program.
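
The reported quantities follow from the standard coupon relations: tensile strength is peak load over cross-sectional area, and the chord modulus is the stress change over the strain change in the initial linear region. The dimensions and loads below are hypothetical, not the study's measurements:

```python
# Hypothetical BFRP coupon cross-section
width_mm, thickness_mm = 25.0, 2.0
area_mm2 = width_mm * thickness_mm              # 50 mm^2

peak_load_kN = 22.5                             # assumed failure load
tensile_strength_MPa = peak_load_kN * 1e3 / area_mm2   # N/mm^2 = MPa

# Chord modulus from two points on the initial linear stress-strain region
stress1, strain1 = 90.0, 0.002                  # MPa, dimensionless strain
stress2, strain2 = 270.0, 0.006
E_GPa = (stress2 - stress1) / (strain2 - strain1) / 1e3
```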

Keywords: BFRP, fabrication, Fiber Reinforced Polymer (FRP), strengthening

Procedia PDF Downloads 292
653 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection

Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew

Abstract:

Detecting email spam is an important task in the era of digital technology, one that needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable, and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific classifications of emails, helping users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework, allowing the creation of simplified, interpretable models tailored to individual emails. LIME identifies the influential terms that drive a classification result, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all the other models, achieving an accuracy of 96.59% and a precision of 99.12%.
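
The core of LIME can be sketched in a few lines: perturb the input by masking words, query the black-box classifier on each perturbation, and fit a proximity-weighted linear surrogate whose coefficients rank term influence. The classifier and its weights below are toy stand-ins, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(42)
words = ["free", "winner", "meeting", "prize", "agenda"]

def spam_score(mask):
    """Stand-in black-box classifier: spam score from which words are
    present (1) or masked out (0). Weights are hypothetical."""
    w = np.array([2.0, 1.5, -1.0, 1.8, -0.8])
    return 1.0 / (1.0 + np.exp(-(mask @ w - 1.0)))

# LIME-style local explanation around the full email (all words present)
n = 500
masks = rng.integers(0, 2, size=(n, len(words)))    # random word subsets
preds = spam_score(masks)
prox = np.exp(-(len(words) - masks.sum(axis=1)))    # weight samples near original
# Weighted least squares: fit a local linear surrogate to the black box
A = np.hstack([masks, np.ones((n, 1))]) * np.sqrt(prox)[:, None]
coef, *_ = np.linalg.lstsq(A, preds * np.sqrt(prox), rcond=None)
top_word = words[int(np.argmax(coef[:-1]))]          # most spam-driving term
```

The surrogate's positive coefficients point at spam-driving terms ("free", "prize") and its negative ones at ham-like terms ("meeting"), which is the per-email keyword ranking the visualization scheme displays.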

Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression

Procedia PDF Downloads 48
652 Analysis of the Level of Production Failures by Implementing New Assembly Line

Authors: Joanna Kochanska, Dagmara Gornicka, Anna Burduk

Abstract:

The article examines the process of implementing a new assembly line in a manufacturing enterprise in the household appliances industry. At the initial stages of the project, it was decided that one of its foundations should be the concept of lean management; hence, eliminating as many errors as possible in the first phases of the line's operation was emphasized. During the start-up of the line, all production losses were identified and documented (from serious machine failures, through any unplanned downtime, to micro-stops and quality defects). Over 6 weeks (the line start-up period), all errors resulting from problems in various areas were analyzed. These areas included, among others, production, logistics, quality, and organization. The aim of the work was to analyze the occurrence of production failures during the initial phase of starting up the line and to propose a method for determining their critical level during its full functionality. The repeatability of the production losses in various areas and at different levels at this early stage of implementation was examined using the methods of statistical process control. Based on a Pareto analysis, the weakest points were identified in order to focus improvement actions on them. The next step was to examine the effectiveness of the actions undertaken to reduce the level of recorded losses. Based on the obtained results, a method for determining the critical failure level in the studied areas is proposed. The developed coefficient can be used as an alarm when production becomes unbalanced because of an increased failure level in production and production-support processes during standardized operation of the line.
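
The Pareto step described above, selecting the "vital few" failure causes that account for roughly 80% of losses, can be sketched as follows (the counts are hypothetical, not the enterprise's data):

```python
# Hypothetical failure counts recorded during line start-up, by cause
failures = {"assembly errors": 48, "material shortages": 31, "machine faults": 12,
            "quality defects": 6, "logistics delays": 2, "other": 1}

total = sum(failures.values())
cum, vital_few = 0, []
for cause, count in sorted(failures.items(), key=lambda kv: -kv[1]):
    cum += count
    vital_few.append(cause)
    if cum / total >= 0.8:        # stop once ~80% of losses are covered
        break
```

Improvement actions are then concentrated on the causes in `vital_few`, matching the Pareto-based prioritization the study applies.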

Keywords: production failures, level of production losses, new production line implementation, assembly line, statistical process control

Procedia PDF Downloads 131
651 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, advanced in-situ monitoring systems are required to monitor process anomalies as input for further process control. Optical monitoring systems, such as CMOS cameras and NIR cameras, have proven effective at monitoring geometrical distortion and exceptional thermal distribution, and many studies and applications therefore focus on the availability of optical monitoring systems for detecting various types of defects. However, the capability of the monitoring setup itself is usually not quantified. In this study, a quantification model is presented for evaluating the capability of monitoring setups for an LPBF machine based on monitoring data acquired from a designed test artifact, and the design of the relevant test artifacts is discussed. A monitoring setup is evaluated based on its hardware properties, the location of its integration, and the light conditions, and the data processing methodology used to quantify the capability in each aspect is discussed. The minimal detectable feature size of the monitoring setup in application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies a monitoring system's performance, which makes it possible to evaluate monitoring systems with the same concept but different setups for the LPBF process and indicates directions for improving the setups.
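
One way a minimal detectable size can be estimated from resolution, as the abstract describes, is from the object-side pixel pitch; the field of view, pixel count, and two-pixel criterion below are assumptions for illustration, not the paper's setup:

```python
def min_detectable_feature(field_of_view_mm, pixels, n_pixels_required=2):
    """Estimate the smallest defect a camera setup can resolve: the object-side
    pixel pitch times the number of pixels assumed necessary for a reliable
    detection (2 is a Nyquist-style lower bound; real setups often need more)."""
    pixel_pitch = field_of_view_mm / pixels   # mm per pixel on the build plane
    return n_pixels_required * pixel_pitch

# Hypothetical CCD setup: 250 mm field of view imaged onto 2048 pixels
d_min = min_detectable_feature(250.0, 2048)
```

Such a figure gives only the geometric limit; the model in the study additionally accounts for accuracy, integration location, and lighting.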

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 197
650 Effect of B2O3 Addition on Sol-gel Synthesized 45S5 Bioglass

Authors: P. Dey, S. K. Pal

Abstract:

Ceramics or glass-ceramics that bond with nearby bone tissue and promote bone ingrowth are known as bioactive. The most extensively used glass in this context is 45S5, a silica-based bioglass explored mostly in tissue engineering as a scaffold for bone repair. Nowadays, borate-based bioglasses are increasingly used in orthopedics owing to their superior bioactivity in forming bone bonds. In the present study, an attempt has been made to observe the effect of B2O3 addition to 45S5 glass and its consequences for the thermal, mechanical, and biological properties. B2O3 was added at 1, 2.5, and 5 wt%, with a simultaneous reduction in the silica content of the 45S5 composition. The borate-based glasses were synthesized by the sol-gel route. The as-synthesized powders were thermally analyzed by DSC-TG and then calcined at 600ºC for 2 hrs. The calcined powders were pressed into pellets and sintered at 850ºC with a holding time of 2 hrs. Phase and microstructural analyses of the as-synthesized and calcined glass powders and of the sintered glass samples were carried out using XRD and FESEM, respectively. The formation of a hydroxyapatite layer was examined by immersing the sintered samples in simulated body fluid (SBF), and the mechanical properties of the sintered samples were tested with a universal testing machine (UTM). The sintered samples showed the presence of a sodium calcium silicate phase, while hydroxyapatite formed on the SBF-immersed samples. The formation of hydroxyapatite was more pronounced in the borate-based glass samples than in 45S5.

Keywords: 45S5 bioglass, bioactive, borate, hydroxyapatite, sol-gel synthesis

Procedia PDF Downloads 256
649 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches

Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez

Abstract:

Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. Simulation methods are computing-intensive and time-consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics may be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first order reliability method, FORM). The results in the present study are in good agreement with those computed with MCS. Therefore, mixing reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or when numerical schemes, the FEM or computational mechanics are employed.
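
A minimal sketch of the PEM idea, assuming Rosenblueth's classical two-point scheme for uncorrelated, symmetric variables (the abstract does not specify which PEM variant the authors use). The toy limit state and its moments are invented for illustration; a real application would replace the lambda with a call to the FEM model:

```python
import itertools
import math
from statistics import NormalDist

def pem_two_point(g, means, stds):
    """Rosenblueth's two-point estimate method: evaluate g at all 2^n
    combinations of mu_i +/- sigma_i, each with equal weight 1/2^n,
    and return the mean and standard deviation of g."""
    n = len(means)
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        vals.append(g(x))
    m1 = sum(vals) / len(vals)                 # first moment of g
    m2 = sum(v * v for v in vals) / len(vals)  # second raw moment of g
    return m1, math.sqrt(max(m2 - m1 * m1, 0.0))

# Toy limit state g = R - S (capacity minus demand), R~N(300, 30), S~N(200, 25).
mu_g, sigma_g = pem_two_point(lambda x: x[0] - x[1], [300.0, 200.0], [30.0, 25.0])
beta = mu_g / sigma_g            # FORM-style reliability index (normal fit)
pf = NormalDist().cdf(-beta)     # failure probability under the fitted normal
```

The final two lines illustrate the "mixing" step: fitting the PEM moments to a well-known distribution (here a normal) to recover a FORM-style reliability index.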

Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, monte carlo simulation

Procedia PDF Downloads 346
648 Integrated Intensity and Spatial Enhancement Technique for Color Images

Authors: Evan W. Krieger, Vijayan K. Asari, Saibabu Arigela

Abstract:

Video imagery captured for real-time security and surveillance applications is typically captured in complex lighting conditions. These less-than-ideal conditions can result in imagery with underexposed or overexposed regions. It is also typical that the video is too low in resolution for certain applications. The purpose of security and surveillance video is to allow accurate conclusions to be drawn from the images seen in the video. Therefore, if poor lighting and low-resolution conditions occur in the captured video, the ability to make accurate conclusions from the received information is reduced. We propose a solution to this problem that uses image preprocessing to improve these images before use in a particular application. The proposed algorithm integrates an intensity enhancement algorithm with a super resolution technique. The intensity enhancement portion consists of a nonlinear inverse sign transformation and an adaptive contrast enhancement. The super resolution section is a single-image super resolution technique: a Fourier phase feature based method that uses a machine learning approach with kernel regression. The proposed technique intelligently integrates these algorithms to produce a high-quality output while being more efficient than the sequential use of these algorithms. This integration is accomplished by performing the proposed algorithm on the intensity image produced from the original color image. After enhancement and super resolution, a color restoration technique is employed to obtain a color image with improved visibility.
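
The process-intensity-then-restore-color pattern described above can be sketched as follows. This is a deliberately simplified stand-in, not the authors' algorithm: a plain gamma curve replaces their nonlinear transformation and adaptive contrast stages, and the super resolution step is omitted entirely.

```python
import numpy as np

def enhance_color(img, gamma=0.6):
    """Enhance only the intensity channel with a simple nonlinear
    (gamma-like) transform, then restore color by scaling each RGB
    channel by the intensity ratio, as in the pipeline described above."""
    img = img.astype(np.float64) / 255.0
    intensity = img.mean(axis=2)                    # crude intensity image
    enhanced = intensity ** gamma                   # brightens dark regions
    ratio = enhanced / np.maximum(intensity, 1e-6)  # color-restoration factor
    out = np.clip(img * ratio[..., None], 0.0, 1.0)
    return (out * 255.0).astype(np.uint8)

dark = np.full((4, 4, 3), 40, dtype=np.uint8)  # an underexposed patch
brightened = enhance_color(dark)
```

Processing a single intensity image and restoring color afterwards is also the source of the efficiency claim in the abstract: the expensive per-pixel work runs on one channel instead of three.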

Keywords: dynamic range compression, multi-level Fourier features, nonlinear enhancement, super resolution

Procedia PDF Downloads 554
647 Quantification of Effect of Linear Anionic Polyacrylamide on Seepage in Irrigation Channels

Authors: Hamil Uribe, Cristian Arancibia

Abstract:

In Chile, water for irrigation and hydropower generation is delivered essentially through unlined earthen channels, which have high seepage losses. Traditional seepage-abatement technologies are very expensive. The goals of this work were to quantify water loss in unlined channels and to select reaches in which to evaluate the use of linear anionic polyacrylamide (LA-PAM) to reduce seepage losses. The study was carried out in the Maule Region, in the central area of Chile. Water users indicated reaches with potential seepage losses, 45 km of channels in total, whose flow varied between 1.07 and 23.6 m³ s⁻¹. According to seepage measurements, 4 channel reaches, 4.5 km in total, were selected for LA-PAM application. One to 4 LA-PAM applications were performed at rates of 11 kg ha⁻¹, taking the wetted-perimeter area as the basis of calculation. In large channels, a motorboat moving against the current was used to carry out the LA-PAM application. For the applications, a seeder machine was used to distribute the granulated polymer evenly on the water surface. Water flow was measured (StreamPro ADCP) upstream and downstream of the selected reaches to estimate seepage losses before and after LA-PAM application. Weekly measurements were made to quantify the effect of the treatment and its duration. In each case, water turbidity and temperature were measured. Channels showed variable losses of up to 13.5%. Channels showing water gains were not treated with PAM. In all cases, the LA-PAM effect was positive, reducing average losses from 8% to 3.1%. Water loss was confirmed, and it was possible to reduce seepage through LA-PAM applications provided that losses were known and correctly determined when applying the polymer. This could increase irrigation security in critical periods, especially under drought conditions.
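
The two quantities the study works with, seepage loss from paired flow measurements and the LA-PAM dose per reach, reduce to short arithmetic. The flow values and channel dimensions below are hypothetical; only the 11 kg ha⁻¹ rate and the wetted-perimeter basis come from the abstract:

```python
def seepage_loss_pct(q_in, q_out):
    """Seepage loss as a percentage of the inflow measured upstream."""
    return 100.0 * (q_in - q_out) / q_in

def lapam_dose_kg(wetted_perimeter_m, reach_length_m, rate_kg_per_ha=11.0):
    """LA-PAM mass for a reach, with the wetted-perimeter area as the
    basis of calculation (11 kg/ha, as in the study)."""
    area_ha = wetted_perimeter_m * reach_length_m / 10_000.0
    return rate_kg_per_ha * area_ha

loss_before = seepage_loss_pct(10.0, 8.65)  # hypothetical flows, m3/s
dose = lapam_dose_kg(8.0, 1000.0)           # 8 m wetted perimeter, 1 km reach
```

With these hypothetical flows the reach loses 13.5% of its inflow, matching the worst case reported in the abstract.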

Keywords: canal seepage, irrigation, polyacrylamide, water management

Procedia PDF Downloads 176
646 Utilizing the Laser Cutting Method in Men's Custom-Made Casualwear

Authors: M A. Habit, S. A. Syed-Sahil, A. Bahari

Abstract:

Laser cutting is a manufacturing process that uses a laser to cut materials. It provides and ensures extreme accuracy with a clean-cut effect; CO2 lasers dominate this application due to their good-quality beam combined with high output power. The machines come on a small scale and are limited in the sizes of material they can cut, which makes the method more appropriate for custom-made products. The same laser cutting machine is also capable of cutting fine materials such as fine silk, cotton, leather and polyester. A lack of exploration and knowledge, besides being unaware of this technology, has caused many designers not to use the laser cutting method in their collections. The objectives of this study are: 1) to identify the potential of the laser cutting technique in custom-made garments for men's casual wear; 2) to experiment with the laser cutting technique in custom-made garments; 3) to offer guidelines and a formula for men's custom-made casualwear designs with aesthetic value. In order to achieve these objectives, this research was conducted using mixed methods: interviews with two (2) local experts in the apparel manufacturing industries, interviews via telephone with five (5) local respondents who are emerging fashion designers, and questionnaires distributed to one hundred (100) respondents around the Klang Valley, to gain information about their understanding and awareness of laser cutting technology. The experiment was conducted using natural and man-made fibers. In conclusion, all of the objectives were achieved in producing custom-made men's casualwear, and the production of these attires will help to educate and to enhance innovation in fine technology. Therefore, there will be a good linkage and collaboration between design experts and manufacturing companies.

Keywords: custom-made, fashion, laser cut, men’s wear

Procedia PDF Downloads 444
645 The Impact of Technology on Physics Development

Authors: Fady Gaml Malk Mossad

Abstract:

Nowadays, distance education that makes use of internet technology is widely used all over the world to overcome geographical and time-based problems in education. Graphics, animation and other auxiliary visual aids help students to understand topics easily. In particular, theoretical courses that are quite difficult to understand, such as physics and chemistry, require visual material for students to grasp the subjects clearly. In this study, physics applications for a physics laboratory course were developed. All facilities of web-based educational technology were used so that students could avoid making mistakes in laboratory work and learn physics subjects better. Android is a mobile operating system (OS) based on the Linux kernel and currently developed by Google. With a user interface based on direct manipulation, Android is designed mainly for touchscreen mobile devices such as smartphones and tablet computers, with specialized user interfaces for televisions (Android TV), cars (Android Auto) and wrist watches (Android Wear). Today, almost everybody uses a smartphone; it has become a must-have item because it offers many benefits, including benefits for education, such as lesson summaries. This article, however, is not about lesson summaries; it is about physics practicals based on Android. We therefore explain our concept of Android-based physics practicals, in the hope that many students will come to enjoy studying physics and will always remember physics phenomena through these practicals.

Keywords: physics education, laboratory, web-based education, distance education, android, smartphone, physics practical

Procedia PDF Downloads 15
644 Prediction of Damage to Cutting Tools in an Earth Pressure Balance Tunnel Boring Machine EPB TBM: A Case Study L3 Guadalajara Metro Line (Mexico)

Authors: Silvia Arrate, Waldo Salud, Eloy París

Abstract:

The wear of cutting tools is one of the most decisive elements when planning tunneling works, programming the maintenance stops and holding the optimum stock of spare parts during the course of the excavation. Being able to predict the behavior of cutting tools can give a very competitive advantage in terms of costs and excavation performance, optimized to the needs of the TBM itself. The remarkable evolution of data science in recent years makes it possible to apply it to the key and most critical machine parameters, with the purpose of knowing how the cutting head is performing against the excavated ground. Taking Metro Line 3 of Guadalajara (Mexico) as a case study, the feasibility of using Specific Energy versus data science applied to parameters such as Torque, Penetration and Contact Force, among others, is developed to predict the behavior and status of the cutting tools. The results obtained through both techniques are analyzed and verified as a function of the wear and the field situations observed during the excavation, in order to determine their effectiveness regarding predictive capacity. In conclusion, the possibilities and improvements offered by the application of digital tools and the programming of calculation algorithms for analyzing the wear of cutting head elements, compared to purely empirical methods, allow early detection of possible damage to cutting tools, which is reflected in optimized excavation performance and a significant improvement in costs and deadlines.
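
The Specific Energy mentioned above combines exactly the parameters the abstract names (contact force, torque, penetration). A common formulation is a Teale-style sum of a thrust term and a rotary term; the version below, and all the numeric inputs, are illustrative assumptions rather than the authors' actual calculation:

```python
import math

def specific_energy(thrust_n, torque_nm, rpm, penetration_mm_per_rev, diameter_m):
    """Teale-style specific energy (MJ/m3) for a TBM: energy per unit
    volume excavated, split into a thrust term and a rotary term."""
    area = math.pi * diameter_m ** 2 / 4.0                 # excavated face, m2
    advance_m_per_s = penetration_mm_per_rev / 1000.0 * rpm / 60.0
    thrust_term = thrust_n / area                          # J/m3
    rotary_term = (2.0 * math.pi * (rpm / 60.0) * torque_nm
                   / (area * advance_m_per_s))             # J/m3
    return (thrust_term + rotary_term) / 1e6               # MJ/m3

# Hypothetical EPB operating point: 6.5 m face, 2 rpm, 10 mm/rev penetration.
se = specific_energy(thrust_n=12e6, torque_nm=4e6, rpm=2.0,
                     penetration_mm_per_rev=10.0, diameter_m=6.5)
```

A rising specific energy at constant ground conditions is the kind of signal that, cross-checked against the data-science models, would flag deteriorating cutting tools.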

Keywords: cutting tools, data science, prediction, TBM, wear

Procedia PDF Downloads 49
643 Design of UV Based Unicycle Robot to Disinfect Germs and Communicate With Multi-Robot System

Authors: Charles Koduru, Parth Patel, M. Hassan Tanveer

Abstract:

In this paper, communication within a team of robots used to sanitize a germ-contaminated environment is proposed. We introduce capabilities from a team of robots (most likely heterogeneous): a wheeled robot named ROSbot 2.0, which carries a mounted LiDAR and a Kinect sensor, and a modified prototype design of a unicycle-drive Roomba robot called the UV robot. The UV robot uses ultrasonic sensors to avoid obstacles and is equipped with an ultraviolet light system to disinfect and kill germs such as bacteria and viruses. In addition, the UV robot is equipped with disinfectant spray to target hidden objects that ultraviolet light is unable to reach. Using the sensors of the ROSbot 2.0, the robot creates a 3-D model of the environment, which is used to determine how the ultraviolet robot will disinfect the environment. Together, this proposed system is known as the RME assistive robot device, or RME system, which communicates between a navigation robot and a germ-disinfecting robot operated by a user. The RME system includes a human-machine interface that allows the user to control certain features of each robot in the RME assistive robot device. This method allows the cleaning process to be done at a more rapid and efficient pace, as the UV robot disinfects areas simply by moving around the environment while using the ultraviolet light system to kill germs. The RME system can be used in many applications, including public offices, stores, airports, hospitals and schools. The RME system will be beneficial even after the COVID-19 pandemic. Kennesaw State University will continue research in the fields of robotics, engineering and technology and play its role in serving humanity.
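
The effectiveness of the UV robot's light system reduces to a germicidal dose (fluence): irradiance at the surface multiplied by exposure time. The numbers below are purely illustrative; real inactivation doses are pathogen-specific and the abstract does not state the lamp's irradiance:

```python
def uv_dose_mj_per_cm2(irradiance_mw_per_cm2, exposure_s):
    """UV germicidal dose (fluence) = irradiance x exposure time,
    in mJ/cm2 when irradiance is in mW/cm2 and time in seconds."""
    return irradiance_mw_per_cm2 * exposure_s

def dwell_time_s(target_dose_mj_per_cm2, irradiance_mw_per_cm2):
    """Time the robot must dwell near a surface to reach a target dose."""
    return target_dose_mj_per_cm2 / irradiance_mw_per_cm2

# Hypothetical: 10 mJ/cm2 target at 0.5 mW/cm2 surface irradiance.
t = dwell_time_s(target_dose_mj_per_cm2=10.0, irradiance_mw_per_cm2=0.5)
```

A dwell-time calculation like this is what would let the navigation robot's 3-D map be translated into a disinfection path with per-surface timing.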

Keywords: multi-robot system, assistive robots, COVID-19 pandemic, ultraviolet technology

Procedia PDF Downloads 187
642 Task Scheduling and Resource Allocation in Cloud-based on AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Scheduling of tasks and the optimal allocation of resources in the cloud must cope with the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used in this field and are characterized by high processing power and storage requirements. To increase their efficiency, it is necessary to schedule the tasks properly and to select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment, based on a modified AHP algorithm, is proposed. In this method, input tasks are scheduled based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served discipline. Resources are prioritized by the criteria of main memory size, processor speed and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods are used, as these are the best choice for the algorithm and have a great impact on the ranking. The simulation results show a decrease in the average response time, turnaround time and execution time of input tasks in the proposed method compared to similar (baseline) methods.
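
The resource-prioritization step can be sketched as normalize-then-weight scoring. This is a simplified stand-in for the modified AHP (a full AHP would also derive the weights from pairwise comparisons); the VM specs and weights below are invented for illustration, and only Linear Max normalization is shown:

```python
def linear_max_normalize(values):
    """Linear Max normalization: divide every value by the column maximum."""
    m = max(values)
    return [v / m for v in values]

def rank_resources(vms, weights):
    """Score each VM as a weighted sum of its Max-normalized criteria
    (memory size, processor speed, bandwidth); higher score ranks first."""
    cols = list(zip(*vms.values()))                       # one column per criterion
    norm_cols = [linear_max_normalize(list(c)) for c in cols]
    scores = {}
    for i, name in enumerate(vms):
        scores[name] = sum(w * col[i] for w, col in zip(weights, norm_cols))
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical VMs: (memory GB, CPU GHz, bandwidth Mbps).
vms = {"vm1": (8, 2.4, 100), "vm2": (16, 3.0, 500), "vm3": (4, 3.6, 250)}
order = rank_resources(vms, weights=(0.4, 0.3, 0.3))
```

The highest-scoring VM would then be offered to the head of the first-come, first-served task queue.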

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 146
641 Mapping Structurally Significant Areas of G-CSF during Thermal Degradation with NMR

Authors: Mark-Adam Kellerman

Abstract:

Proteins are capable of exploring vast mutational spaces. This makes it difficult for protein engineers to devise rational methods to improve stability and function via mutagenesis. Deciding which residues to mutate requires knowledge of the characteristics they elicit. We probed the characteristics of residues in granulocyte-colony stimulating factor (G-CSF) using a thermal melt (from 295 K to 323 K) to denature it in a 700 MHz Bruker spectrometer. These characteristics included dynamics, micro-environmental changes experienced or induced during denaturing, and structure-function relationships. 15N-1H HSQC experiments were performed at 2 K increments along this thermal melt. We observed that dynamic residues that also undergo large changes in their microenvironment were predominantly in unstructured regions. Moreover, we were able to identify four residues (G4, A6, T133 and Q134) that we class as high-priority targets for mutagenesis, given that they all appear in the top 10% of measures for both environmental changes and dynamics (∑Δ and ∆PI). We were also able to combine these NMR observables with molecular dynamics (MD) to elucidate what appears to be an opening motion of G-CSF's binding site III. V48 appears to be pivotal to this opening motion, which also seemingly distorts the loop region between helices A and B. This observation is in agreement with previous findings that the conformation of this loop region becomes altered in an aggregation-prone state of G-CSF. Hence, we present here an approach to profiling the characteristics of residues in order to highlight their potential as rational mutagenesis targets and their roles in important conformational changes. These findings not only present an opportunity to make biobetters effectively, but also open up the possibility of further understanding epistasis and of applying machine learning to residue behaviours.
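
A per-residue environmental-change measure of the kind tracked across an HSQC series is commonly computed as a weighted combined chemical-shift perturbation. The 0.14 nitrogen scaling factor below is the conventional literature value (the abstract does not state the authors' exact definition of ∑Δ), and the shift changes are invented for illustration:

```python
import math

def csp(d_h_ppm, d_n_ppm, n_scale=0.14):
    """Combined 1H/15N chemical-shift perturbation for one residue,
    using the conventional down-weighting of the 15N dimension."""
    return math.sqrt(d_h_ppm ** 2 + (n_scale * d_n_ppm) ** 2)

# Hypothetical shift changes (ppm) over the 295-323 K melt.
shifts = {"G4": (0.08, 0.50), "A6": (0.05, 0.90), "V48": (0.01, 0.10)}
perturbations = {res: csp(dh, dn) for res, (dh, dn) in shifts.items()}
most_perturbed = max(perturbations, key=perturbations.get)
```

Ranking residues by such a measure, together with a dynamics measure, is what yields a shortlist of mutagenesis targets like the four named above.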

Keywords: protein engineering, rational mutagenesis, NMR, molecular dynamics

Procedia PDF Downloads 255
640 Multiaxial Fatigue Analysis of a High Performance Nickel-Based Superalloy

Authors: P. Selva, B. Lorraina, J. Alexis, A. Seror, A. Longuet, C. Mary, F. Denard

Abstract:

Over the past four decades, the fatigue behavior of nickel-based alloys has been widely studied. However, in recent years, significant advances in the fabrication process leading to grain size reduction have been made in order to improve fatigue properties of aircraft turbine discs. Indeed, a change in particle size affects the initiation mode of fatigue cracks as well as the fatigue life of the material. The present study aims to investigate the fatigue behavior of a newly developed nickel-based superalloy under biaxial-planar loading. Low Cycle Fatigue (LCF) tests are performed at different stress ratios so as to study the influence of the multiaxial stress state on the fatigue life of the material. Full-field displacement and strain measurements as well as crack initiation detection are obtained using Digital Image Correlation (DIC) techniques. The aim of this presentation is first to provide an in-depth description of both the experimental set-up and protocol: the multiaxial testing machine, the specific design of the cruciform specimen and performances of the DIC code are introduced. Second, results for sixteen specimens related to different load ratios are presented. Crack detection, strain amplitude and number of cycles to crack initiation vs. triaxial stress ratio for each loading case are given. Third, from fractographic investigations by scanning electron microscopy it is found that the mechanism of fatigue crack initiation does not depend on the triaxial stress ratio and that most fatigue cracks initiate from subsurface carbides.

Keywords: cruciform specimen, multiaxial fatigue, nickel-based superalloy

Procedia PDF Downloads 296
639 Design and Manufacture of a Hybrid Gearbox Reducer System

Authors: Ahmed Mozamel, Kemal Yildizli

Abstract:

Due to mechanical energy losses, and the competitive drive to minimize these losses and increase machine efficiency, the need for contactless gearing systems has arisen. In this work, one stage of a mechanical planetary gear transmission integrated with one stage of a magnetic planetary gear is designed as a two-stage hybrid gearbox system. The internal energy of the permanent magnets, in the form of the magnetic field, is used to create meshing between contactless magnetic rotors, in order to give the system inherent protection against overloading and to decrease the mechanical loss of the transmission by eliminating friction losses. Classical methods, such as the analytical and tabular methods and the theory of elasticity, are used to calculate the planetary gear design parameters. The finite element method (ANSYS Maxwell) is used to predict the behavior of the magnetic gearing system; the concentric magnetic gearing system has been modeled and analyzed using the 2D finite element method. In addition, the design and manufacturing processes of the prototype components of the gearbox system (a planetary gear, a concentric magnetic gear, the shafts and the bearing selection) are investigated. The output force, output moment, output power and efficiency of the hybrid gearbox system are evaluated experimentally. The viability of applying a magnetic force to transmit mechanical power through a non-contact gearing system is presented. The experimental test results show that the system is capable of operating continuously within the speed range of 400 rpm to 3000 rpm, with a reduction ratio of 2:1 and a maximum efficiency of 91%.
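
The experimental efficiency figure follows from measured torques and speeds via P = Tω. The input torque below is hypothetical; only the 2:1 reduction and the 91% peak efficiency come from the abstract:

```python
import math

def gearbox_efficiency(torque_in_nm, rpm_in, torque_out_nm, rpm_out):
    """Overall efficiency as output power over input power,
    with P = torque * angular speed (omega = 2*pi*rpm/60)."""
    p_in = torque_in_nm * rpm_in * 2.0 * math.pi / 60.0
    p_out = torque_out_nm * rpm_out * 2.0 * math.pi / 60.0
    return p_out / p_in

# Hypothetical test point: 2:1 reduction at 3000 rpm input.
eta = gearbox_efficiency(torque_in_nm=10.0, rpm_in=3000.0,
                         torque_out_nm=18.2, rpm_out=1500.0)
```

At an ideal 2:1 reduction the output torque would be exactly double the input; the shortfall (18.2 N·m instead of 20 N·m here) is what the efficiency measures.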

Keywords: hybrid gearbox, mechanical gearboxes, magnetic gears, magnetic torque

Procedia PDF Downloads 154
638 Towards Law Data Labelling Using Topic Modelling

Authors: Daniel Pinheiro Da Silva Junior, Aline Paes, Daniel De Oliveira, Christiano Lacerda Ghuerren, Marcio Duran

Abstract:

The Courts of Accounts are institutions responsible for overseeing and pointing out irregularities in Public Administration expenses. They have a high demand for processes to be analyzed, whose decisions must be grounded in the applicable laws. Despite the large number of existing processes, several cases report similar subjects. Thus, previous decisions on already analyzed processes can be a precedent for current processes that refer to similar topics. Identifying similar topics is an open, yet essential, task for identifying similarities between processes. Since the actual number of topics is considerably large, it is tedious and error-prone to identify topics using a purely manual approach. This paper presents a tool based on Machine Learning and Natural Language Processing to assist in building a labeled dataset. The tool relies on Topic Modelling with Latent Dirichlet Allocation to find the topics underlying a document, followed by the Jensen-Shannon distance metric to generate a probability of similarity between pairs of documents. Furthermore, in a case study with a corpus of decisions of the Rio de Janeiro State Court of Accounts, it was noted that data pre-processing plays an essential role in modeling relevant topics. Also, the combination of topic modeling and a distance metric calculated over document representations in the generated topic space has proved useful in helping to construct a labeled base of similar and non-similar document pairs.
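
The distance step described above compares two documents by their topic mixtures. A minimal sketch, with the Jensen-Shannon distance implemented directly and the per-document topic distributions invented for illustration (a real pipeline would obtain them from the fitted LDA model):

```python
import math

def jensen_shannon(p, q):
    """Jensen-Shannon distance (square root of the base-2 JS divergence)
    between two topic-probability distributions; 0 = identical, 1 = disjoint."""
    m = [(pi + qi) / 2.0 for pi, qi in zip(p, q)]
    def kl(a, b):
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return math.sqrt((kl(p, m) + kl(q, m)) / 2.0)

# Hypothetical per-document topic mixtures from an LDA model (3 topics).
doc_a = [0.70, 0.20, 0.10]
doc_b = [0.60, 0.30, 0.10]   # similar subject to doc_a
doc_c = [0.05, 0.10, 0.85]   # different subject
d_similar = jensen_shannon(doc_a, doc_b)
d_dissimilar = jensen_shannon(doc_a, doc_c)
```

Thresholding this distance is one way to turn document pairs into the similar / non-similar labels the tool helps construct.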

Keywords: courts of accounts, data labelling, document similarity, topic modeling

Procedia PDF Downloads 180
637 Computer Aided Shoulder Prosthesis Design and Manufacturing

Authors: Didem Venus Yildiz, Murat Hocaoglu, Murat Dursun, Taner Akkan

Abstract:

The shoulder joint is a more complex structure than the hip or knee joints. In addition to the overall complexity of the shoulder joint, two different factors contribute to the insufficient outcomes of shoulder replacement: shoulder prosthesis design is far from fully developed, and it is difficult to place these prostheses due to the shoulder anatomy. The glenohumeral joint is the most complex joint of the human shoulder. There are various treatments for shoulder failure, such as total shoulder arthroplasty and reverse total shoulder arthroplasty. Due to its design, which is reversed with respect to normal shoulder anatomy, reverse total shoulder arthroplasty has different physiological and biomechanical properties. The post-operative success of this arthroplasty depends on an improved design of the reverse total shoulder prosthesis, and design success can be increased through biomechanical and computational analyses. In this study, data of both shoulders of a human subject with a right-side fracture were collected by a 3D Computed Tomography (CT) machine in DICOM format. These data were transferred to 3D medical image processing software (Mimics, Materialise, Leuven, Belgium) to reconstruct the bone geometry of the patient's left and right shoulders. The resulting 3D geometric model of the fractured shoulder was used to construct a reverse total shoulder prosthesis in the 3-matic software. Finite element (FE) analysis was conducted to compare the intact shoulder and the prosthetic shoulder in terms of stress distribution and displacements. A physiological body-weight reaction force of 800 N was applied. The resulting values of the FE analysis were compared for both shoulders. The analysis of the performance of the reverse shoulder prosthesis could enhance knowledge of prosthetic design.

Keywords: reverse shoulder prosthesis, biomechanics, finite element analysis, 3D printing

Procedia PDF Downloads 157