Search results for: probabilistic classification vector machines
2099 Design and Performance of a Large Diameter Shaft in Old Alluvium
Authors: Tamilmani Thiruvengadam, Ramasthanan Arulampalam
Abstract:
This project comprises laying approximately 1.8 km of 400 mm, 1200 mm and 2400 mm diameter sewer pipes using pipe-jacking machines along Mugliston Park, Buangkok Drive, and Buangkok Link. The works include an estimated 14 circular shafts with depths ranging from 10.0 m to 29.0 m. Cast in-situ circular shafts will be used for the temporary shaft excavation. The geology is predominantly backfill and old alluvium, with weak material encountered in between. Where very soft clay, F1 material or weak soil is expected, ground improvement will be carried out outside the shaft, followed by a cast in-situ concrete ring wall within the improved soil zone. This paper presents the design methodology, analysis and results of temporary shafts for micro-TBM launching and for constructing permanent manholes. It also compares instrumentation readings with the values predicted by the analysis.
Keywords: circular shaft, ground improvement, old alluvium, temporary shaft
Procedia PDF Downloads 287
2098 Impact of Network Workload between Virtualization Solutions on a Testbed Environment for Cybersecurity Learning
Authors: Kevin Fernagut, Olivier Flauzac, Erick M. G. Robledo, Florent Nolot
Abstract:
The adoption of modern lightweight virtualization often comes with new threats and network vulnerabilities. This paper assesses them with a different approach: studying the behavior of a testbed built with tools such as Kernel-based Virtual Machine (KVM), Linux Containers (LXC) and Docker by performing stress tests on a platform where students experiment simultaneously with cyber-attacks, in order to observe the impact on the campus network and to find the best solution for cybersecurity learning. Interesting outcomes comparing these technologies can be found in the literature. It is, however, difficult to find results on the effects on the global network where the experiments are carried out. Our work shows that other physical hosts and the faculty network were impacted while performing these trials. The problems found are discussed, as well as security solutions and the adoption of new network policies.
Keywords: containerization, containers, cybersecurity, cyberattacks, isolation, performance, virtualization, virtual machines
Procedia PDF Downloads 150
2097 Fire Safety Engineering of Wood Dust Layer or Cloud
Authors: Marzena Półka, Bożena Kukfisz
Abstract:
This paper presents an analysis of dust explosion hazards in the process industries. It covers selected testing methods of dust explosibility and presents two of them according to the experimental standards used by the Department of Combustion and Fire Theory at The Main School of Fire Service in Warsaw. The article presents values of the maximum acceptable surface temperature (MAST) of machines operating in the presence of a dust cloud, and of chosen dust layers with thicknesses of 5 and 12.5 mm. The comparative analysis points to the conclusion that the minimum ignition temperature of the layer (MITL) and the minimum ignition temperature of the dust cloud (MTCD) depend on the granularity of the substance. Increasing the thickness of the dust layer reduces its minimum ignition temperature. Increasing the thickness of the dust layer at the same time extends flameless combustion and delays ignition.
Keywords: fire safety engineering, industrial hazards, minimum ignition temperature, wood dust
Procedia PDF Downloads 319
2096 Enhancing Engineering Students Educational Experience: Studying Hydrostatic Pumps Association System in Fluid Mechanics Laboratories
Authors: Alexandre Daliberto Frugoli, Pedro Jose Gabriel Ferreira, Pedro Americo Frugoli, Lucio Leonardo, Thais Cavalheri Santos
Abstract:
Laboratory classes in engineering courses are essential for students to integrate theory with practical reality by handling equipment and observing experiments. In researching physical phenomena, students can learn about the complexities of science. Over the past years, universities in developing countries have been reducing the course load of engineering courses in accordance with cost-cutting agendas. Quality education is an object of study for researchers and requires educators and educational administrators able to demonstrate that their institutions can provide great learning opportunities at reasonable costs. Didactic test benches are indispensable equipment in educational activities related to the study of turbo-hydraulic pumps and pumping facilities, which have a high cost and require long class time due to measurement and equipment adjustment times. In order to overcome the aforementioned obstacles, and aligned with the professional objectives of an engineer, GruPEFE - UNIP (Research Group in Physics Education for Engineering - Universidade Paulista) has developed a multi-purpose stand for the discipline of fluid mechanics which allows the study of velocity and flow meters, head losses, and pump association. In this work, results obtained by the association in series and in parallel of hydraulic pumps are presented and discussed, mainly analyzing the repeatability of the experimental procedures and their agreement with theory. For the association in series, two identical pumps were used, connecting the discharge of one pump to the suction of the next, allowing the fluid to receive the power of all machines in the association. The characteristic curve of the set is obtained from the curves of each of the pumps by adding the heads corresponding to the same flow rates. The same pumps were associated in parallel. In this association, the discharge piping is common to the two machines together.
The characteristic curve of the set was obtained by adding, for each value of H (head), the flow rates of each pump. For the tests, the input and output pressures of each pump were measured. For each arrangement there were three sets of measurements, varying the flow rate in the range from 6.0 to 8.5 m³/h. For the two associations, the results showed excellent repeatability, with variations of less than 10% between sets of measurements, and also good agreement with theory. This variation agrees with the instrumental uncertainty. Thus, the results validate the use of the fluids bench designed for didactic purposes. As future work, a digital acquisition system is being developed, using differential sensors of extremely low pressure (approximately 2 to 2000 Pa) with an Arduino microcontroller.
Keywords: engineering education, fluid mechanics, hydrostatic pumps association, multi-purpose stand
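The series and parallel composition rules described above can be sketched numerically. This is an illustrative model only: it assumes a quadratic pump characteristic H(Q) = H0 - kQ², and the coefficients H0 and K below are hypothetical values, not the bench's measured data.

```python
H0 = 20.0   # hypothetical shutoff head [m]
K = 0.15    # hypothetical head-loss coefficient [m/(m^3/h)^2]

def head_single(q):
    """Head [m] delivered by one pump at flow rate q [m^3/h]."""
    return H0 - K * q * q

def head_series(q):
    """Series association of two identical pumps: heads add at the same flow rate."""
    return 2.0 * head_single(q)

def flow_parallel(h):
    """Parallel association of two identical pumps: flow rates add at the same head."""
    q_single = ((H0 - h) / K) ** 0.5
    return 2.0 * q_single

q = 7.0  # within the paper's tested range of 6.0 to 8.5 m^3/h
h1 = head_single(q)                 # ~12.65 m for one pump
h2 = head_series(q)                 # ~25.3 m in series at the same flow
q2 = flow_parallel(h1)              # ~14.0 m^3/h in parallel at the same head
```

The two rules mirror the text exactly: in series, heads are summed at equal flow; in parallel, flows are summed at equal head.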
Procedia PDF Downloads 220
2095 The Residual Efficacy of Etofenprox WP on Different Surfaces for Malaria Control in the Brazilian Legal Amazon
Authors: Ana Paula S. A. Correa, Allan K. R. Galardo, Luana A. Lima, Talita F. Sobral, Josiane N. Muller, Jessica F. S. Barroso, Nercy V. R. Furtado, Ednaldo C. Rêgo., Jose B. P. Lima
Abstract:
Malaria is a public health problem in the Brazilian Legal Amazon. Among the integrated approaches for anopheline control, Indoor Residual Spraying (IRS) remains one of the main tools in the basic strategy applied in the Amazonian states, where the National Malaria Control Program currently uses one of the insecticides of the pyrethroid class, Etofenprox WP. Understanding the residual efficacy of insecticides on different surfaces is essential to determine the spray cycles, in order to maintain rational use and to avoid product waste. The aim of this study was to evaluate the residual efficacy of Etofenprox - VECTRON ® 20 WP on surfaces of unplastered cement (UC) and unpainted wood (UW) on panels, in the field, and in a semi-field evaluation in Brazil's Amapa State. The evaluation criterion used was the cone bioassay test, following the World Health Organization (WHO) recommended method, using plastic cones and female Anopheles sp. mosquitoes. The tests were carried out on laboratory panels, in a semi-field evaluation in a "test house" built in the municipality of Macapa, and in the field in 20 houses, ten houses per surface type (UC and UW), in an endemic malaria area in the municipality of Mazagão. The residual efficacy was measured from March to September 2017, starting one day after the spraying and repeated monthly for a period of six months. The UW surface presented higher residual efficacy than the UC. In fact, the UW presented residual efficacy of the insecticide throughout the period of this study, with a mortality rate above 80% on the panels (95%), in the "test house" (86%) and in the field houses (87%). On the UC surface a decrease in mortality was observed in all the tests performed, with mortality rates of 45, 47 and 29% on panels, semi-field and field, respectively; residual efficacy ≥ 80% occurred only in the first evaluation, the bioassay 24 hours after spraying in the "test house".
Thus, only the UW surface meets the specifications of the World Health Organization Pesticide Evaluation Scheme (WHOPES) regarding the duration of effective action (three to six months). To sum up, the residual efficacy of the insecticide varied across the different surfaces on which it was sprayed. Although IRS with Etofenprox WP was efficient on UW surfaces and can be used in spraying cycles at 4-month intervals, it is important to consider the diversity of houses in the Brazilian Legal Amazon in order to implement alternatives for vector control, including the evaluation of new products or different formulation types for insecticides.
Keywords: Anopheles, vector control, insecticide, bioassay
Procedia PDF Downloads 165
2094 Possibilistic Aggregations in the Investment Decision Making
Authors: I. Khutsishvili, G. Sirbiladze, B. Ghvaberidze
Abstract:
This work proposes a fuzzy methodology to support investment decisions. While choosing among competitive investment projects, the methodology ranks the projects using a new aggregation OWA operator, AsPOWA, presented in the environment of possibility uncertainty. For the numerical evaluation of the weighting vector associated with the AsPOWA operator, a mathematical programming problem is constructed. On the basis of the AsPOWA operator, a maximum criterion for the projects' group ranking is constructed. The methodology also allows making the most profitable investments into several of the projects, using the method developed by the authors for discrete possibilistic bicriteria problems. The article provides an example of investment decision-making that explains the work of the proposed methodology.
Keywords: expert evaluations, investment decision making, OWA operator, possibility uncertainty
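As a point of reference for the aggregation step, a plain OWA operator can be sketched as below. AsPOWA is a variant of this; its specific weighting scheme and possibilistic treatment are not reproduced here, so the weights in the example are purely illustrative.

```python
def owa(values, weights):
    """Ordered Weighted Averaging: the weights are applied to the values
    after sorting them in descending order, not to fixed argument positions."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("OWA weights must sum to 1")
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Hypothetical expert scores for one project, aggregated with
# "optimism-leaning" weights that favor the highest scores.
score = owa([0.2, 0.9, 0.5], [0.5, 0.3, 0.2])  # 0.5*0.9 + 0.3*0.5 + 0.2*0.2
```

Because the weights attach to ranks rather than positions, the same weight vector interpolates between max (weight on the first rank) and min (weight on the last rank) aggregation.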
Procedia PDF Downloads 558
2093 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features
Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh
Abstract:
In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Due to the relatively simple recording of the electrocardiogram (ECG) signal, this signal is a good tool to show the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for researchers to find the best method for detecting normal signals versus abnormal ones. The data are from both genders, the recording time varies between several seconds and several minutes, and all records are labeled normal or abnormal. Due to the limited duration and positional accuracy of the ECG signal, and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. A new idea is presented in this paper: in addition to using the statistical characteristics of the signal, a return map is created and nonlinear characteristics of the HRV signal are extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features, were used to classify the normal signals from the abnormal ones.
To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and of the SVM was 0.893 and 0.947, respectively. The results of the proposed algorithm also indicated that greater use of nonlinear characteristics in classifying normal versus patient signals gave better performance. Today, research aims at quantitatively analyzing the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that the amount of these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions, and has led to the development of research in this field. The ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease; however, given its limited accuracy over time and the fact that some of its information is hidden from the viewpoint of physicians, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy, and can be used as a complementary system in treatment centers.
Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve
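To make the feature-extraction step concrete, here is a minimal sketch of time-domain HRV features computed from a list of R-R intervals. The paper does not list its exact feature set, so mean R-R, SDNN and RMSSD are illustrative textbook choices, and the interval values are made up.

```python
import math

def hrv_time_features(rr_ms):
    """Mean R-R, SDNN and RMSSD from R-R intervals given in milliseconds."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    # SDNN: sample standard deviation of all R-R intervals
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    # RMSSD: root mean square of successive differences
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return mean_rr, sdnn, rmssd

# Hypothetical R-R series after Pan-Tompkins R-wave detection
mean_rr, sdnn, rmssd = hrv_time_features([800.0, 810.0, 790.0, 805.0])
```

These linear features would then be combined with the nonlinear (return-map) features before feeding the classifier.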
Procedia PDF Downloads 262
2092 Providing a Practical Model to Reduce Maintenance Costs: A Case Study in Golgohar Company
Authors: Iman Atighi, Jalal Soleimannejad, Ahmad Akbarinasab, Saeid Moradpour
Abstract:
In the past, we could increase profit by raising product prices. But in the new decade, a competitive market does not let us increase profit by raising prices; the only way to increase profit is to reduce costs. A significant percentage of production costs are maintenance costs, and analysis of these costs can yield more profit. Most maintenance strategies, such as RCM (Reliability-Centered Maintenance), TPM (Total Productive Maintenance), and PM (Preventive Maintenance), try to reduce maintenance costs. In this paper, decreasing the maintenance costs of the Concentration Plant of Golgohar Company (GEG) was examined using MTBF (Mean Time Between Failures) and MTTR (Mean Time To Repair) analyses. These analyses showed that, instead of buying new machines and increasing costs in order to raise capacity, improving the MTBF and MTTR indexes would solve capacity problems in the best way and decrease costs.
Keywords: Golgohar Iron Ore Mining and Industrial Company, maintainability, maintenance costs, reliability-center-maintenance
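The two indexes are straightforward to compute from maintenance logs. A minimal sketch follows; the figures are hypothetical, not GEG plant data.

```python
def mtbf(total_uptime_h, n_failures):
    """Mean Time Between Failures [h]: operating time per failure."""
    return total_uptime_h / n_failures

def mttr(total_repair_h, n_failures):
    """Mean Time To Repair [h]: repair time per failure."""
    return total_repair_h / n_failures

def availability(mtbf_h, mttr_h):
    """Steady-state availability: fraction of time the machine is up."""
    return mtbf_h / (mtbf_h + mttr_h)

# Hypothetical month for one machine: 950 h running, 50 h under repair, 5 failures
a = availability(mtbf(950.0, 5), mttr(50.0, 5))  # 190 / (190 + 10) = 0.95
```

Raising MTBF or lowering MTTR both push availability up, which is the mechanism by which the paper argues capacity can be gained without buying new machines.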
Procedia PDF Downloads 302
2091 A Survey of Baseband Architecture for Software Defined Radio
Authors: M. A. Fodha, H. Benfradj, A. Ghazel
Abstract:
This paper is a survey of recent works that propose baseband processor architectures for software-defined radio. A classification of the different approaches is proposed. The performance of each architecture is also discussed, in order to clarify which approaches meet software-defined radio constraints.
Keywords: multi-core architectures, reconfigurable architectures, software defined radio, baseband processor
Procedia PDF Downloads 475
2090 Empirical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection and feature extraction, and ends with general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, and appliance features are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes state. In order to fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical tool that simulates the behaviour of people inside a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling.
In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed on low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract a power interval within which the operation of the selected appliance falls, along with a time vector of the values delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical and statistical features. Afterwards, these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real-world Reference Energy Disaggregation Dataset (REDD). We compute confusion-matrix-based performance metrics: accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques
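The DTW step mentioned above can be sketched with the classic dynamic-programming recurrence. This is a textbook implementation, not the authors' code, and it compares two 1-D power sequences directly.

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences,
    O(len(a) * len(b)) time and memory."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # best of: insertion, deletion, match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A time-stretched copy of a power step matches its template exactly,
# which is why DTW suits appliance cycles of varying duration.
assert dtw_distance([0, 0, 5, 5], [0, 5]) == 0
```

This elasticity in the time axis is what makes DTW a natural fit for matching appliance signatures whose on/off cycles vary in duration between activations.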
Procedia PDF Downloads 82
2089 Determining the Direction of Causality between Creating Innovation and Technology Market
Authors: Liubov Evstigneeva
Abstract:
In this paper an attempt is made to establish causal nexuses between innovation and international trade in Russia. The topicality of this issue stems from the necessity of choosing policy instruments for economic modernization and the transition to innovative development. A vector autoregression (VAR) model and the Granger test are applied to Russian monthly data from 2005 until the second quarter of 2015. Both lagged import and export at the national level cause innovation; the latter starts to stimulate foreign trade only at a remote lag. In comparison to the aggregate data, the results by patent category are more diverse. Importing technologies from foreign countries stimulates patent activity, while innovations created in Russia are a Granger cause only of imports to the Commonwealth of Independent States.
Keywords: export, import, innovation, patents
Procedia PDF Downloads 321
2088 A Monte Carlo Fuzzy Logistic Regression Framework against Imbalance and Separation
Authors: Georgios Charizanos, Haydar Demirhan, Duygu Icen
Abstract:
Two of the most impactful issues in classical logistic regression are class imbalance and complete separation. These can result in model predictions leaning heavily towards the majority class of the binary response variable, or in over-fitting. Fuzzy methodology offers key solutions for handling these problems. However, most studies propose transforming the binary responses into a continuous format limited to [0,1]. This is called the possibilistic approach within fuzzy logistic regression. This approach is closer to straightforward regression, since a logit-link function is not utilized and fuzzy probabilities are not generated. In contrast, we propose a method of fuzzifying binary response variables that allows for the use of the logit-link function, and hence a probabilistic fuzzy logistic regression model with the Monte Carlo method. The fuzzy probabilities are then classified by selecting a fuzzy threshold. Different combinations of fuzzy and crisp inputs, outputs, and coefficients are explored, aiming to understand which of these perform better under different conditions of imbalance and separation. We conduct numerical experiments using both synthetic and real datasets to demonstrate the performance of the fuzzy logistic regression framework against seven crisp machine learning methods. The proposed framework shows better performance irrespective of the degree of imbalance and the presence of separation in the data, while the considered machine learning methods are significantly impacted.
Keywords: fuzzy logistic regression, fuzzy, logistic, machine learning
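A minimal sketch of the fuzzification idea: jittering crisp 0/1 responses into membership degrees strictly inside (0, 1) so that the logit link stays defined. The uniform spread used here is an assumption for illustration; the paper's actual membership construction and Monte Carlo scheme may differ.

```python
import math
import random

random.seed(0)

def fuzzify(y, spread=0.05):
    """Illustrative Monte Carlo fuzzification of a crisp binary response:
    map it to a membership degree strictly inside (0, 1), near its crisp value."""
    eps = 1e-6
    if y == 1:
        return random.uniform(1.0 - spread, 1.0 - eps)
    return random.uniform(eps, spread)

def logit(p):
    """Logit link; defined because fuzzified responses avoid exactly 0 and 1."""
    return math.log(p / (1.0 - p))

ys = [0, 1, 1, 0]
memberships = [fuzzify(y) for y in ys]
z = [logit(m) for m in memberships]                   # continuous working response
preds = [1 if m >= 0.5 else 0 for m in memberships]   # fuzzy threshold at 0.5
```

Repeating the draw many times and refitting gives the Monte Carlo ensemble; the fuzzy threshold (0.5 here) is the tunable classification boundary the abstract refers to.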
Procedia PDF Downloads 74
2087 Smart Web Services in the Web of Things
Authors: Sekkal Nawel
Abstract:
The Web of Things (WoT), the integration of smart technologies from the Internet or a network into the Web architecture or applications, is becoming more complex, larger, and more dynamic. The WoT involves various elements such as sensors, devices, networks, protocols, data, functionalities, and architectures that perform services for stakeholders. These services operate in the context of the interaction between stakeholders and the WoT elements. Such context is becoming a key information source whose data are of various natures and uncertain, leading to complex situations. In this paper, we are interested in the development of intelligent Web services. The key ingredients of this "intelligent" notion are context diversity, the need for a semantic representation to manage complex situations, and the capacity to reason with uncertain data. In this perspective, we introduce a multi-layered architecture based on a generic intelligent Web service model dealing with various contexts, which proactively predicts future situations and reactively responds to real-time situations in order to support decision-making. For semantic context data representation, we use PR-OWL, a probabilistic ontology based on Multi-Entity Bayesian Networks (MEBN). PR-OWL is flexible enough to represent complex, dynamic, and uncertain contexts, which are key requirements for developing intelligent Web services. A case study of intelligent plant watering was carried out using the proposed architecture to show the role of proactive and reactive contextual reasoning in the WoT.
Keywords: smart web service, web of things, context reasoning, proactive, reactive, multi-entity Bayesian networks, PR-OWL
Procedia PDF Downloads 71
2086 Advanced Technologies and Algorithms for Efficient Portfolio Selection
Authors: Konstantinos Liagkouras, Konstantinos Metaxiotis
Abstract:
In this paper we present a classification of the various technologies applied to the solution of the portfolio selection problem, according to the discipline and the methodological framework followed. We provide a concise presentation of the emerging categories and try to identify which methods are considered obsolete and which lie at the heart of the debate. On top of that, we provide a comparative study of the different technologies applied to efficient portfolio construction, and we suggest potential paths for future work that lie at the intersection of the presented techniques.
Keywords: portfolio selection, optimization techniques, financial models, stochastic, heuristics
Procedia PDF Downloads 432
2085 Analyzing Impacts of Road Network on Vegetation Using Geographic Information System and Remote Sensing Techniques
Authors: Elizabeth Malebogo Mosepele
Abstract:
Road transport has become increasingly common in the world; people rely on road networks for transportation on a daily basis. However, the environmental impact of roads on the surrounding landscapes extends their effects even further. This study investigates the impact of the road network on natural vegetation. It provides baseline knowledge regarding roadside vegetation, and should be helpful in the future for the conservation of biodiversity along road verges and for the improvement of road verges. The general hypothesis of this study is that the amount and condition of roadside vegetation can be explained by road network conditions. Remote sensing techniques were used to analyze vegetation condition. A Landsat 8 OLI image was used to assess vegetation cover condition. An NDVI image was generated and used as the base from which land cover classes were extracted, comprising four categories: healthy vegetation, degraded vegetation, bare surface, and water. The classification of the image was achieved using the supervised classification technique. Road networks were digitized from Google Earth. For observed data, transect-based quadrats of 50 × 50 m were surveyed next to road segments for vegetation assessment. Vegetation condition was related to the road network, with a multinomial logistic regression confirming a significant relationship between vegetation condition and the road network. The null hypothesis formulated was that 'there is no variation in vegetation condition as we move away from the road.' Analysis of vegetation condition revealed degraded vegetation in close proximity to a road segment and healthy vegetation as the distance increases away from the road. The chi-squared value was compared with the critical value of 3.84, at the significance level of 0.05, to determine the significance of the relationship.
Given that the chi-squared value was 395.5004, the null hypothesis was rejected: there is significant variation in vegetation condition as the distance increases away from the road. The conclusion is that the road network plays an important role in the condition of vegetation.
Keywords: chi-squared, geographic information system, multinomial logistic regression, remote sensing, roadside vegetation
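Both computations in this abstract can be sketched in a few lines. NDVI is the standard normalized difference of near-infrared and red reflectance (for Landsat 8 OLI these are bands 5 and 4), and 3.84 is the chi-squared critical value with 1 degree of freedom at α = 0.05; the reflectance values below are illustrative, and the statistic is the one reported above (395.5004 as printed).

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

CHI2_CRIT = 3.84  # chi-squared critical value, 1 d.f., alpha = 0.05

def reject_null(chi2_stat, critical=CHI2_CRIT):
    """True if the observed statistic exceeds the critical value."""
    return chi2_stat > critical

# Healthy canopy reflects strongly in NIR and absorbs red,
# pushing NDVI toward 1; bare surfaces sit near 0.
v = ndvi(0.50, 0.10)
significant = reject_null(395.5004)  # far above 3.84, so the null is rejected
```
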
Procedia PDF Downloads 432
2084 Utilizing Grid Computing to Enhance Power Systems Performance
Authors: Rafid A. Al-Khannak, Fawzi M. Al-Naima
Abstract:
Power load is one of the most important controlling factors: it determines power demand and illustrates power usage, shaping the power market. Hence, power load forecasting is the parameter that facilitates understanding and analyzing all these aspects. In this paper, power load forecasting is solved in the MATLAB environment by constructing a neural network for the power load, to find an accurate simulated solution with minimum error. The aim of this paper is an algorithm developed to carry out the load forecasting application with a faster technique. The algorithm enables the MATLAB power application to be executed by multiple machines in a Grid computing system, and thus to be accomplished in much less time, at lower cost, and with high accuracy and quality. Grid computing, the modern distributed computing technology, has been used to enhance the performance of power applications by utilizing idle and willing Grid contributors and by sharing computational power resources.
Keywords: DeskGrid, Grid Server, idle contributor(s), grid computing, load forecasting
Procedia PDF Downloads 476
2083 A Prediction Model of Adopting IPTV
Authors: Jeonghwan Jeon
Abstract:
With the advent of IPTV, in fierce competition with the existing broadcasting system, predicting how widely the IPTV service will be adopted has emerged as an important issue. This paper aims to suggest a prediction model for IPTV adoption using Classification and Ranking Belief Simplex (CaRBS). A simplex plot method of representing data allows a clear visual representation of the degree of support from the variables to the prediction of the objects. CaRBS is applied to survey data on IPTV adoption.
Keywords: prediction, adoption, IPTV, CaRBS
Procedia PDF Downloads 412
2082 Advanced Combinatorial Method for Solving Complex Fault Trees
Authors: José de Jesús Rivero Oliva, Jesús Salomón Llanes, Manuel Perdomo Ojeda, Antonio Torres Valle
Abstract:
Combinatorial explosion is a problem common to both predominant methods for solving fault trees: the Minimal Cut Set (MCS) approach and the Binary Decision Diagram (BDD). High memory consumption impedes the complete solution of very complex fault trees. Only approximate, non-conservative solutions are possible in these cases, using truncation or other simplification techniques. The paper proposes a method (CSolv+) for solving complex fault trees without any possibility of combinatorial explosion. Each individual MCS is immediately discarded after its contribution to the basic events' importance measures and to the Top-gate Upper Bound Probability (TUBP) has been accounted for. An estimation of the Top-gate Exact Probability (TEP) is also provided. Therefore, running on a computer cluster, CSolv+ guarantees the complete solution of complex fault trees. It was successfully applied to 40 fault trees from the Aralia fault tree database, performing the evaluation of the top-gate probability, the 1000 most significant MCSs (SMCS), and the Fussell-Vesely, RRW and RAW importance measures for all basic events. The highly complex fault tree nus9601 was solved with truncation probabilities from 10⁻²¹ to 10⁻²⁷ just to limit the execution time. The solution corresponding to 10⁻²⁷ evaluated 3,530,592,796 MCSs in 3 hours and 15 minutes.
Keywords: system reliability analysis, probabilistic risk assessment, fault tree analysis, basic events importance measures
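The TUBP referred to above is the standard min-cut-set upper bound on the top-gate probability. A sketch follows, under the usual assumption that the minimal cut sets can be treated as independent events; the probabilities in the example are illustrative.

```python
def tubp(mcs_probs):
    """Top-gate Upper Bound Probability: 1 - prod(1 - P(MCS_i)).
    Treats the minimal cut sets as independent events, which makes
    the result an upper bound on the exact top-gate probability."""
    complement = 1.0
    for p in mcs_probs:
        complement *= (1.0 - p)
    return 1.0 - complement

# Two cut sets with probabilities 0.1 and 0.2
p_top = tubp([0.1, 0.2])  # 1 - 0.9 * 0.8 = 0.28
```

Because each MCS contributes to this running product (and to the importance measures) independently of the others, it can be discarded immediately after being processed, which is what lets CSolv+ avoid holding the full MCS list in memory.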
Procedia PDF Downloads 45
2081 Combined Machine That Fertilizes Evenly under Plowing on Slopes and Planning an Experiment
Authors: Qurbanov Huseyn Nuraddin
Abstract:
The results of scientific research on a machine that applies an equal amount of mineral fertilizer under the soil, in order to increase grain productivity in mountain farming and obtain quality grain, are substantiated. The average yield of the crop depends on the nature of the distribution of fertilizers in the soil. Therefore, the study of effective, energy-saving methods for the application of mineral fertilizers is a pressing task of modern agriculture. Depending on the type and variety of plants in mountain farming, there is an optimal rate of mineral fertilizer. Applying an equal amount of fertilizer to the soil is one of the conditions that increases the efficiency of the field. One of the main agro-technical indicators of the work of mineral fertilizing machines is ensuring an equal distribution of mineral fertilizers in the field. Taking into account the above-mentioned issues, a combined plough has been improved in our laboratory.
Keywords: combined plough, mineral fertilizers, sprinkle fluently, fertilizer rate, cereals
Procedia PDF Downloads 73
2080 Tibial Plateau Fractures during Covid-19 in a Trauma Unit. Impact of Lockdown and the Pressures on the Healthcare Provider
Authors: R. Gwynn, P. Panwalkar, K. Veravalli , M. Tofighi, R. Clement, A. Mofidi
Abstract:
The aim of this study was to assess the impact of Covid-19 and lockdown on the incidence, injury pattern, and treatment of tibial plateau fractures in a combined rural and urban population in Wales. Methods: A retrospective study was performed to identify tibial plateau fractures in the 15-month period of Covid-19 lockdown and the 15-month period immediately before lockdown. Patient demographics, injury mechanism, injury severity (based on Schatzker classification), associated injuries, treatment methods, and outcomes of fractures in the Covid-19 period were studied. Results: The incidence of tibial plateau fracture was 9 per 100,000 during Covid-19 and 8.5 per 100,000 before lockdown, both similar to previous studies. The average age was 52, and the female-to-male ratio was 1:1 in both the control and study groups. High-energy injury was seen in only 20% of patients in the Covid-19 group and 35% in the control group (χ²=12, p<0.025). 14% of the Covid-19 population sustained other injuries, as opposed to 16% in the control group (χ²=0.09, p>0.95). Lower-severity isolated lateral condyle fractures (Schatzker 1-3) were seen in 40% of Covid-19 fractures; this was 60% in the control population. Higher-severity bicondylar and shaft fractures (Schatzker 5-6) were seen in 60% of the Covid-19 group and 35% of the control group (χ²=7.8, p<0.02). The mode of treatment was not affected by Covid-19. The complication rate was low in spite of the higher number of complex fractures and the impact of the Covid-19 pandemic. Conclusion: Associated injuries were similar in spite of a significantly lower-energy mechanism of injury. Tibial plateau fractures were unexpectedly worse, based on Schatzker classification, in the Covid-19 period as compared to the control group. This was especially relevant for medial condyle and shaft fractures and was postulated to be caused by a reduction in bone density due to lack of vitamin D and reduced activity. The mode of treatment and the outcome of care for tibial plateau fractures were not affected by Covid-19.
Keywords: Covid-19, knee, tibial plateau fracture, trauma
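The group comparisons reported above can be reproduced in outline with the Pearson chi-square statistic for a 2x2 contingency table. The group sizes below are assumptions chosen only to illustrate the calculation; the abstract reports percentages and χ² values but not the raw counts.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Assumed counts: high- vs low-energy injuries in 100 Covid-19-period patients
# (20% high energy) and 100 control patients (35% high energy)
stat = chi_square_2x2(20, 80, 35, 65)
print(round(stat, 2))  # prints 5.64
```

With the study's actual (larger) group sizes, the same proportions would yield a larger statistic, consistent with the reported χ²=12.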
Procedia PDF Downloads 125
2079 Gas While Drilling (GWD) Classification in Betara Complex; An Effective Approach to Optimize Future Candidate of Gumai Reservoir
Authors: I. Gusti Agung Aditya Surya Wibawa, Andri Syafriya, Beiruny Syam
Abstract:
The Gumai Formation, which acts as a regional seal for the Talang Akar Formation, is one of the most prolific reservoirs in the South Sumatra Basin and the primary exploration target in this area. Marine conditions were eventually established during the continuation of the transgression sequence, leading to open-marine facies deposition in the Early Miocene. Marine clastic deposits, in which calcareous shales, claystones and siltstones are interbedded with fine-grained calcareous and glauconitic sandstones, dominate the lithology targeted as the hydrocarbon reservoir. Until now, the main objective of PetroChina's exploration and production in the Betara area has been the Lower Talang Akar Formation only. Successful testing in some exploration wells, which flowed gas and condensate from the Gumai Formation, opened the opportunity to establish a new reservoir objective in the Betara area. The limitation of conventional wireline log data in the Gumai interval generates a technical challenge in terms of geological approach. The utilization of the Gas While Drilling indicator was initiated with the objective of determining the next Gumai reservoir candidates, capable of increasing Jabung hydrocarbon discoveries. This paper describes how the Gas While Drilling indicator is processed to separate potential from non-potential zones by cut-off analysis. Validation performed by correlation and comparison with well log, Drill Stem Test (DST), and Reservoir Performance Monitor (RPM) data succeeded in observing the Gumai reservoir in the Betara Complex. After integrating all of the data, we were able to generate a Betara Complex potential map overlaid with the reservoir characterization distribution, as part of a risk assessment of potential zone presence. Mud log utilization and geophysical data successfully covered the geological challenges in this study.
Keywords: Gumai, gas while drilling, classification, reservoir, potential
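The cut-off analysis described above reduces, in outline, to thresholding a total-gas log and merging consecutive flagged samples into candidate intervals. The sketch below is a minimal illustration with an assumed gas log and cut-off value, not the actual Betara workflow or its cut-offs.

```python
def classify_zones(depth_gas, cutoff):
    """Label each depth sample 'potential' if its gas reading meets the cut-off."""
    return [(depth, "potential" if gas >= cutoff else "non-potential")
            for depth, gas in depth_gas]

def potential_intervals(labels):
    """Merge consecutive potential samples into (top, base) depth intervals."""
    intervals, start = [], None
    for depth, tag in labels:
        if tag == "potential" and start is None:
            start = depth
        elif tag != "potential" and start is not None:
            intervals.append((start, depth))
            start = None
    if start is not None:
        intervals.append((start, labels[-1][0]))
    return intervals

# Hypothetical total-gas log: (depth in m, total gas in arbitrary units)
log = [(1500, 2.0), (1505, 8.5), (1510, 9.1), (1515, 1.2), (1520, 7.8)]
labels = classify_zones(log, cutoff=5.0)
print(potential_intervals(labels))  # prints [(1505, 1515), (1520, 1520)]
```

In practice the flagged intervals would then be validated against DST and RPM data, as the abstract describes.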
Procedia PDF Downloads 355
2078 A Numerical Method for Diffusion and Cahn-Hilliard Equations on Evolving Spherical Surfaces
Authors: Jyh-Yang Wu, Sheng-Gwo Chen
Abstract:
In this paper, we present a simple, effective numerical geometric method to estimate the divergence of a vector field over a curved surface. The conservation law is an important principle in physics and mathematics; however, many well-known numerical methods for solving diffusion equations do not obey conservation laws. The method presented in this paper combines the divergence theorem with a generalized finite difference method and obeys the conservation law on discrete closed surfaces. We use a similar method to solve the Cahn-Hilliard equations on evolving spherical surfaces and observe stability in our numerical simulations.
Keywords: conservation laws, diffusion equations, Cahn-Hilliard equations, evolving surfaces
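The conservation property the authors emphasize can be seen even in a one-dimensional analogue: if every face flux is computed once and applied with opposite signs to its two neighbouring cells, the discrete total is conserved on a closed domain. The sketch below uses a periodic ring as a stand-in for a discrete closed surface; it illustrates the flux-form principle only, not the authors' generalized finite difference method.

```python
def diffuse_step(u, kappa, dt, dx):
    """One explicit finite-volume diffusion step on a closed (periodic) ring.

    Each face flux is computed once and enters the two neighbouring cells
    with opposite signs, so the discrete total sum(u) is conserved exactly,
    mirroring the divergence-theorem construction on a discrete closed surface."""
    n = len(u)
    # flux[i] crosses the face between cell i and cell i+1 (periodic wrap)
    flux = [kappa * (u[(i + 1) % n] - u[i]) / dx for i in range(n)]
    return [u[i] + dt / dx * (flux[i] - flux[i - 1]) for i in range(n)]

u = [0.0, 1.0, 4.0, 1.0, 0.0]
total0 = sum(u)
for _ in range(100):
    u = diffuse_step(u, kappa=0.1, dt=0.1, dx=1.0)
print(abs(sum(u) - total0) < 1e-9)  # prints True: the discrete total is conserved
```

A pointwise discretization of the Laplacian, by contrast, gives no such telescoping of fluxes and can drift in total mass, which is exactly the defect the paper addresses.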
Procedia PDF Downloads 496
2077 An Experimental Testbed Using Virtual Containers for Distributed Systems
Authors: Parth Patel, Ying Zhu
Abstract:
Distributed systems have become ubiquitous, and they continue their growth through a range of services. With advances in resource virtualization technology such as Virtual Machines (VM) and software containers, developers no longer require high-end servers to test and develop distributed software. Even in commercial production, virtualization has streamlined the process of rapid deployment and service management. This paper introduces a distributed systems testbed that utilizes virtualization to enable distributed systems development on commodity computers. The testbed can be used to develop new services, implement theoretical distributed systems concepts for understanding, and experiment with virtual network topologies. We show its versatility through two case studies that utilize the testbed for implementing a theoretical algorithm and developing our own methodology to find high-risk edges. The results of using the testbed for these use cases have proven the effectiveness and versatility of this testbed across a range of scenarios.
Keywords: distributed systems, experimental testbed, peer-to-peer networks, virtual container technology
Procedia PDF Downloads 146
2076 The Application on Interactivity of Light in New Media Art
Authors: Yansong Chen
Abstract:
In the age of media convergence, new media technology is constantly impacting, changing, and even reshaping the limits of art. In the technological ontology of new media art, the concept of interaction design has always been dominated by I/O (Input/Output) systems, which ignore the content of systems and kill the aura of art. Light, as a fusion medium, essentially extends certain human senses and can be either the content of the input or the effect of the output. In this paper, firstly, on the basis of a literature review, the interaction characteristics of light were studied. Secondly, starting from the discourse patterns of people and machines, people and people, and people and imagined things, we propose three light modes: object-oriented interaction, immersion interaction, and tele-presence interaction. Finally, this paper explains how to regain the aura of art through light elements in new media art and how to understand the multiple levels of 'interaction design'. In addition, new media art, especially light-based interaction art, enriches language patterns and motivates emerging art forms to become more widespread and popular, advancing its aesthetic growth.
Keywords: new media art, interaction design, light art, immersion
Procedia PDF Downloads 236
2075 Molecular Topology and TLC Retention Behaviour of s-Triazines: QSRR Study
Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević
Abstract:
Quantitative structure-retention relationship (QSRR) analysis was used to predict the chromatographic behavior of s-triazine derivatives using theoretical descriptors computed from the chemical structure. The fundamental basis of the reported investigation is to relate molecular topological descriptors to the chromatographic behavior of s-triazine derivatives obtained by reversed-phase (RP) thin layer chromatography (TLC) on silica gel impregnated with paraffin oil, with applied ethanol-water mobile phases (φ = 0.5-0.8; v/v). The retention parameter (RM0) of the 14 investigated s-triazine derivatives was used as the dependent variable, while simple connectivity indices of different orders were used as independent variables. The best QSRR model for predicting the RM0 value was obtained with the simple third-order connectivity index (³χ) in a second-degree polynomial equation. Numerical values of the correlation coefficient (r=0.915), Fisher's value (F=28.34) and root mean square error (RMSE=0.36) indicate that the model is statistically significant. In order to test the predictive power of the QSRR model, the leave-one-out cross-validation technique was applied. The parameters of the internal cross-validation analysis (r²CV=0.79, r²adj=0.81, PRESS=1.89) reflect the high predictive ability of the generated model and confirm that it can be used to predict the RM0 value. A multivariate classification technique, hierarchical cluster analysis (HCA), was applied in order to group molecules according to their molecular connectivity indices. HCA is a descriptive statistical method and one of the most frequently used techniques for classification, an important area of data processing. The HCA performed on the simple molecular connectivity indices obtained from the 2D structures of the investigated s-triazine compounds resulted in two main clusters, in which compounds were grouped according to the number of atoms in the molecule. This is in agreement with the fact that these descriptors are calculated on the basis of the number of atoms in the molecule of the investigated s-triazine derivatives.
Keywords: s-triazines, QSRR, chemometrics, chromatography, molecular descriptors
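The second-degree polynomial fit and the leave-one-out PRESS statistic used for validation can be sketched as follows. The (³χ, RM0) pairs below are invented placeholders, not the study's 14 derivatives, so the numbers do not reproduce the reported r=0.915 or PRESS=1.89.

```python
def polyfit2(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the 3x3 normal equations."""
    n = len(xs)
    s = lambda k: sum(x ** k for x in xs)          # moment sums of x
    t = lambda k: sum(y * x ** k for x, y in zip(xs, ys))
    A = [[n, s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [t(0), t(1), t(2)]
    # Gaussian elimination with partial pivoting, then back substitution
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for col in range(i, 3):
                A[r][col] -= f * A[i][col]
            b[r] -= f * b[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coef[i] = (b[i] - sum(A[i][c] * coef[c] for c in range(i + 1, 3))) / A[i][i]
    return coef  # [a, b, c]

def loo_press(xs, ys):
    """Leave-one-out PRESS: refit without each point, sum squared prediction errors."""
    press = 0.0
    for i in range(len(xs)):
        a, b, c = polyfit2(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        pred = a + b * xs[i] + c * xs[i] ** 2
        press += (ys[i] - pred) ** 2
    return press

# Hypothetical (3-chi, RM0) pairs standing in for the s-triazine data set
chi3 = [1.2, 1.5, 1.8, 2.1, 2.4, 2.7, 3.0]
rm0 = [0.9, 1.1, 1.4, 1.6, 1.9, 2.1, 2.2]
print(round(loo_press(chi3, rm0), 4))
```

A small PRESS relative to the total sum of squares is what justifies the claim of high predictive ability.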
Procedia PDF Downloads 393
2074 Translation Training in the AI Era
Authors: Min Gao
Abstract:
In the past year, the advent of large language models (LLMs) has brought about a revolution in the language service industry, making it possible to produce more satisfactory, higher-quality translations efficiently. This is groundbreaking news for commercial companies involved in language services, since much of a translator's work can now be completed by machines. However, it may be bad news for universities that provide translation training programs. They need to confront the challenges posed by AI in education by reconsidering issues such as the reform of traditional teaching methods, the translation ethics of students, and the new demands of the job market for their graduates. This article is an exploratory study of these issues based on the author's experiences in translation teaching. The research combines methods in the form of questionnaires and interviews. The findings include: (1) students may lose their motivation to learn in the AI era, but this can be compensated for by encouragement from the lecturer; (2) translation ethics are not a serious problem in schools, considering the strict policies and regulations in place; and (3) the role of translators has evolved in the new era, necessitating a reform of traditional teaching methods.
Keywords: job market of translation, large language model, translation ethics, translation training
Procedia PDF Downloads 68
2073 Process of Production of an Artisanal Brewery in a City in the North of the State of Mato Grosso, Brazil
Authors: Ana Paula S. Horodenski, Priscila Pelegrini, Salli Baggenstoss
Abstract:
The brewing industry with artisanal concepts serves a specific market with a diversified production that has been gaining ground nationally, including in the Amazon region. This growth is due to more demanding consumers with diversified tastes who want to try new types of beer and enjoy products with new aromas and flavors, as a differential from what is so widely spread by the big industrial brands. Thus, through qualitative research methods, the study aimed to investigate how the production of a craft brewery in a city in the north of the State of Mato Grosso (Brazil) is managed, providing knowledge of production processes and strategies in this industry. With the efficient use of resources, it is possible to obtain the necessary quality and provide better performance and differentiation for the company, besides analyzing the best management model. The research is descriptive, with a qualitative approach through a case study. For the data collection, a semi-structured interview was elaborated, covering three areas: characterization of the microbrewery, the artisanal beer production process, and the company's supply chain management. Production processes were also observed during technical visits. The study verified that the artisanal brewery develops preventive maintenance strategies for its inputs, machines, and equipment, so that the quality of the product and of the production process is achieved. It was observed that the distance from supply centers requires process and supply chain management to be planned over a longer horizon so that delivery of the final product is satisfactory. The production process of the brewery is composed of machines and equipment that allow control and quality of the product, and the manager states that, for the productive capacity of the industry and its consumer market, the available equipment meets the demand.
This study also highlights one of the challenges for the development of small breweries in the face of the market giants, namely the legislation, which classifies microbreweries as producers of alcoholic beverages. This causes the micro and small business segment to be taxed like a major producer, which has advantages in purchasing large batches of raw materials and receives tax incentives for being a large employer and taxpayer. It was possible to observe that the supply chain management system relies on spreadsheets and notes that are done manually, which could be simplified with a computer program to streamline procedures and reduce the risks and failures of the manual process. The control of waste and effluents produced by the industry is outsourced and meets the needs. Finally, the results showed that the industry uses preventive maintenance as a productive strategy, which allows better conditions for the production and quality of artisanal beer. Quality is directly related to the satisfaction of the final consumer, being prized and pursued throughout the production process, with the selection of better inputs, the effectiveness of the production processes and the relationship with commercial partners.
Keywords: artisanal brewery, production management, production processes, supply chain
Procedia PDF Downloads 120
2072 Performance Analysis and Energy Consumption of Routing Protocol in MANET Using Grid Topology
Authors: Vivek Kumar Singh, Tripti Singh
Abstract:
An ad hoc wireless network consists of mobile nodes that create an underlying architecture for communication without the help of traditional fixed-position routers. Ad hoc On-demand Distance Vector (AODV) is a routing protocol used in Mobile Ad hoc Networks (MANETs). The architecture must nevertheless maintain communication routes although the hosts are mobile and have limited transmission range. There are different protocols for handling routing in the mobile environment. Routing protocols used in fixed-infrastructure networks cannot be used efficiently in mobile ad hoc networks, so MANETs require different protocols. This paper presents a performance analysis of the routing protocol under various parameter patterns with the two-ray propagation model.
Keywords: AODV, packet transmission rate, pause time, ZRP, QualNet 6.1
Procedia PDF Downloads 828
2071 Julia-Based Computational Tool for Composite System Reliability Assessment
Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris
Abstract:
The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods like sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and in parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source composite system reliability evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. It presents the tool's design, validation, and effectiveness, which includes analyzing two different formulations of the Optimal Power Flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool can provide valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow
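The kind of adequacy index such a tool estimates can be illustrated with a deliberately simplified sketch: a non-sequential (state-sampling) Monte Carlo estimate of Loss-of-Load Expectation for a single load level, rather than the sequential, time-varying simulation with network constraints that CompositeSystems performs. The generating-unit data below are assumptions.

```python
import random

def mc_lole(gen_units, load, years, hours=8760, seed=1):
    """Non-sequential Monte Carlo estimate of Loss-of-Load Expectation (hours/year).

    Each hourly state samples every unit independently against its forced
    outage rate (FOR); a loss-of-load hour is counted whenever available
    capacity falls below the load."""
    rng = random.Random(seed)  # fixed seed for reproducible runs
    loss_hours = 0
    for _ in range(years * hours):
        available = sum(cap for cap, q in gen_units if rng.random() > q)
        if available < load:
            loss_hours += 1
    return loss_hours / years

# Assumed system: five 100 MW units, 2% FOR each, serving a constant 350 MW load.
# Analytically, LOLE = 8760 * P(two or more units down), roughly 33.7 h/year.
units = [(100, 0.02)] * 5
print(round(mc_lole(units, load=350, years=20), 1))
```

Sequential simulation replaces the independent hourly draws with up/down state durations, which is what makes time-varying and interdependent resources tractable, at the computational cost the abstract describes.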
Procedia PDF Downloads 74
2070 Study of Land Use Changes around an Archaeological Site Using Satellite Imagery Analysis: A Case Study of Hathnora, Madhya Pradesh, India
Authors: Pranita Shivankar, Arun Suryawanshi, Prabodhachandra Deshmukh, S. V. C. Kameswara Rao
Abstract:
Many undesirable, significant changes occur in landscapes and in the regions in the vicinity of historically important structures as impacts of anthropogenic activities over time. A better understanding of such influences, using recently developed satellite remote sensing techniques, helps in planning strategies for minimizing the negative impacts on the existing environment. In 1982, a fossilized hominid skull cap was discovered at a site located along the northern bank of the east-west flowing river Narmada in the village of Hathnora. Close to the same site, Late Acheulian and Middle Palaeolithic tools have been discovered in the immediately overlying pebbly gravel, suggesting that the 'Narmada skull' may be from the Middle Pleistocene age. Reviews of recent research on hominid remains from Late Acheulian and Middle Palaeolithic sites all over the world suggest succession and contemporaneity of cultures there, enhancing the importance of Hathnora as a rare and precious site. In this context, maximum likelihood classification using digital interpretation techniques was carried out for this study area using satellite imagery from Landsat ETM+ for the year 2006 and Landsat 8 (OLI and TIRS) for the year 2016. The overall accuracy of the Land Use Land Cover (LULC) classification of the 2016 imagery was around 77.27%, based on ground truth data. The significant reduction in the main river course and in agricultural activities, and the increase in built-up area observed in the remote sensing data analysis, are undoubtedly the outcome of human encroachment in the vicinity of this eminent heritage site.
Keywords: cultural succession, digital interpretation, Hathnora, Homo Sapiens, Late Acheulian, Middle Palaeolithic
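The reported overall accuracy of about 77.27% is simply the trace of the ground-truth confusion matrix divided by the total number of reference samples. The matrix below uses invented counts that happen to reproduce that figure; the actual per-class counts are not given in the abstract.

```python
def overall_accuracy(confusion):
    """Overall accuracy: correctly classified samples (the diagonal) over all samples."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Invented reference counts for four assumed LULC classes
# (rows = ground truth, cols = classified): water, agriculture, built-up, vegetation
cm = [[17, 1, 1, 1],
      [2, 14, 2, 2],
      [1, 3, 15, 1],
      [2, 2, 2, 22]]
print(round(100 * overall_accuracy(cm), 2))  # prints 77.27
```

Per-class (producer's and user's) accuracies and the kappa coefficient are computed from the same matrix and are commonly reported alongside the overall figure.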
Procedia PDF Downloads 172