Search results for: computer application
8284 Radiographic Predictors of Mandibular Third Molar Extraction Difficulties under General Anaesthetic
Authors: Carolyn Whyte, Tina Halai, Sonita Koshal
Abstract:
Aim: There are many methods available to assess the potential difficulty of third molar surgery. This study investigated various factors to assess whether they had a bearing on the difficulties encountered. Study design: A retrospective study was completed of 62 single mandibular third molar teeth removed under day-case general anaesthesia between May 2016 and August 2016 by 3 consultant oral surgeons. Method: Data were collected by examining the OPG radiograph of each tooth and recording depth of impaction, angulation, bony impaction, point of application in relation to the second molar, root morphology, Pell and Gregory classification, and Winter's lines. This was completed by one assessor and verified by another. Information on medical history, anxiety, ethnicity, and age was recorded. Case notes and surgical entries were examined for any difficulties encountered. Results: There were 5 cases which encountered surgical difficulties: fracture of root apices (3), which were left in situ, prolonged bleeding (1), and post-operative numbness lasting more than 6 months (1). Four of the 5 cases had a Pell and Gregory classification of (B), where the occlusal plane of the impacted tooth lies between the occlusal plane and the cervical line of the adjacent tooth. 80% of cases had the point of application in either the coronal or apical one third (1/3) in relation to the second molar. However, there was variability in all other aspects of assessment in predicting difficulty of removal. Conclusions: The cases which encountered difficulties all had at least one predictor of potential complexity, but these varied case by case.
Keywords: impaction, mandibular third molar, radiographic assessment, surgical removal
Procedia PDF Downloads 182
8283 Valorisation of Waste Chicken Feathers: Electrospun Antibacterial Nanoparticles-Embedded Keratin Composite Nanofibers
Authors: Lebogang L. R. Mphahlele, Bruce B. Sithole
Abstract:
Chicken meat is the most consumed meat in South Africa, with a per capita consumption of >33 kg yearly. Hence, South Africa produces over 250 million kg of waste chicken feathers each year, the majority of which is landfilled or incinerated. The discarded feathers cause environmental pollution and waste a natural protein resource. The valorisation of waste chicken feathers is therefore considered a more environmentally friendly and cost-effective treatment. Feathers contain 91% protein, the main component being beta-keratin, a fibrous and insoluble structural protein extensively cross-linked by disulfide bonds. Keratin is usually converted into nanofibers via electrospinning for a variety of applications. Keratin nanofiber composites have many potential biomedical applications owing to attractive features such as a high surface-to-volume ratio and very high porosity. The application of nanofibers in biomedical wound dressings requires antimicrobial properties. One approach is incorporating inorganic nanoparticles, among which silver nanoparticles have served as an important alternative antibacterial agent and have been studied against many types of microbes. The objective of this study is to combine a synthetic polymer, chicken feather keratin, and antibacterial nanoparticles to develop novel electrospun antibacterial nanofibrous composites for possible wound dressing application. Furthermore, this study will convert the two-dimensional electrospun nanofiber membrane into three-dimensional fiber networks that resemble the structure of the extracellular matrix (ECM).
Keywords: chicken feather keratin, nanofibers, nanoparticles, nanocomposites, wound dressing
Procedia PDF Downloads 133
8282 Count of Trees in East Africa with Deep Learning
Authors: Nubwimana Rachel, Mugabowindekwe Maurice
Abstract:
Trees play a crucial role in maintaining biodiversity and providing various ecological services. Traditional methods of counting trees are time-consuming, and there is a need for more efficient techniques. Deep learning makes it feasible to identify the multi-scale elements hidden in aerial imagery, and this research focuses on its application to automated tree detection and counting in both forest and non-forest areas using satellite imagery. The objective is to identify the most effective model for automated tree counting. We used different deep learning models such as YOLOv7, SSD, and U-Net, along with Generative Adversarial Networks to generate synthetic samples for training, and other augmentation techniques, including Random Resized Crop, AutoAugment, and Linear Contrast Enhancement. These models were trained and fine-tuned on satellite imagery to identify and count trees. The performance of the models was assessed through multiple trials; after training and fine-tuning, U-Net demonstrated the best performance with a validation loss of 0.1211, validation accuracy of 0.9509, and validation precision of 0.9799. This research showcases the success of deep learning in accurate tree counting through remote sensing, particularly with the U-Net model. It represents a significant contribution to the field by offering an efficient and precise alternative to conventional tree-counting methods.
Keywords: remote sensing, deep learning, tree counting, image segmentation, object detection, visualization
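The abstract does not detail the counting step, but one plausible post-processing route for a segmentation model such as U-Net is to binarize the probability map and count connected components. A minimal sketch with NumPy and SciPy; the 0.5 threshold and the toy mask are assumptions for illustration:

```python
import numpy as np
from scipy import ndimage

def count_trees(prob_map: np.ndarray, threshold: float = 0.5) -> int:
    """Count trees in a segmentation probability map (e.g., U-Net output)
    by thresholding and labeling connected components."""
    mask = prob_map > threshold            # binarize per-pixel tree scores
    _, n_components = ndimage.label(mask)  # 4-connectivity by default
    return n_components

# Toy check: three separated blobs count as three trees.
demo = np.zeros((64, 64))
demo[5:10, 5:10] = demo[20:28, 30:40] = demo[50:55, 50:60] = 1.0
print(count_trees(demo))  # -> 3
```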
Procedia PDF Downloads 80
8281 The Application and Relevance of Costing Techniques in Service Oriented Business Organisations: A Review of the Activity-Based Costing (ABC) Technique
Authors: Udeh Nneka Evelyn
Abstract:
The shortcomings of traditional costing systems, in terms of validity, accuracy, consistency, and relevance, increased the need for modern management accounting systems. Activity-Based Costing (ABC) can be used as a modern tool for planning, control, and decision making by management. Past studies on the activity-based costing (ABC) system have focused on manufacturing firms, leaving studies on service firms relatively scarce. This paper reviewed the application and relevance of activity-based costing techniques in service-oriented business organisations by employing a qualitative research method which relied heavily on a literature review of past and current relevant articles focusing on activity-based costing (ABC). Findings suggest that ABC is not only appropriate for use in a manufacturing environment; it is also well suited to service organizations such as financial institutions, the healthcare industry, and government organizations. In fact, some banking and financial institutions have been applying the concept for years under other names. One of them is unit costing, which is used to calculate the cost of banking services by determining the cost and consumption of each unit of output of the functions required to deliver the service (a minimal sketch of this calculation follows below). ABC, in very basic terms, may provide a very good payback for businesses. Some of the benefits that relate directly to the financial services industry are: identification of the most profitable customers; more accurate product and service pricing; increased product profitability; and well-organized process costs.
Keywords: profitability, activity-based costing (ABC), management accounting, manufacture
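As a rough illustration of that unit-costing idea, the cost of one service delivery is the sum over activities of the cost-driver rate times the driver units consumed. All rates and consumption figures below are hypothetical, not drawn from the paper:

```python
# Activity-based cost of one banking service: each activity has a cost-driver
# rate (activity cost pool / driver volume); a service consumes driver units.
activity_rates = {              # hypothetical cost per driver unit
    "account_inquiry": 1.20,    # per inquiry handled
    "payment_processing": 0.45, # per payment processed
    "statement_printing": 0.80, # per statement printed
}
consumption = {                 # driver units consumed by one delivery
    "account_inquiry": 2,
    "payment_processing": 5,
    "statement_printing": 1,
}
unit_cost = sum(activity_rates[a] * consumption[a] for a in consumption)
print(f"Cost per service unit: {unit_cost:.2f}")  # 2*1.20 + 5*0.45 + 0.80 = 5.45
```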
Procedia PDF Downloads 581
8280 Scalable UI Test Automation for Large-scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for that web application. The quality assurance team must execute many manual user interface test cases during development to confirm there are no regression bugs. The team automated 346 test cases; the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient workflow when introducing scalability into a traditional test automation environment, so a scripting language was adopted. The scalability implementation relies mainly on AWS serverless technology, specifically the Elastic Container Service. Scalability here means the ability to automatically provision the machines that run the test automation and to increase or decrease their number, so that test cases can run in parallel and execution time drops dramatically. Introducing scalable test automation offers more than reduced execution time: because test cases can be executed at the same time, challenging bugs such as race conditions may be detected. If API and unit tests are implemented, test strategies can be adapted more efficiently for this scalability testing. In practice, however, API and unit testing cannot cover 100% of functional testing for web applications, since they do not reach front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. The paper first describes the detailed architecture of the scalable test automation environment, then reports the actual reduction in execution time and an example of challenging-issue detection.
Keywords: AWS, elastic container service, scalability, serverless, UI automation test
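The abstract gives no implementation details, but the dispatch step of such a setup might look like the following sketch, assuming a Fargate-backed ECS cluster and a container image wrapping the Python/Selenium suite. Every name here (cluster, task definition, container, environment variables, subnet) is hypothetical:

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

def dispatch_test_shards(n_shards: int) -> None:
    """Launch one Fargate task per shard so UI test shards run in parallel."""
    for shard in range(n_shards):
        ecs.run_task(
            cluster="ui-test-cluster",            # hypothetical cluster name
            taskDefinition="selenium-runner:1",   # image wraps the test suite
            launchType="FARGATE",
            networkConfiguration={"awsvpcConfiguration": {
                "subnets": ["subnet-0abc123"],    # placeholder subnet id
                "assignPublicIp": "ENABLED",
            }},
            overrides={"containerOverrides": [{
                "name": "runner",
                "environment": [
                    {"name": "SHARD_INDEX", "value": str(shard)},
                    {"name": "SHARD_TOTAL", "value": str(n_shards)},
                ],
            }]},
        )

dispatch_test_shards(8)  # 346 cases / 8 shards ≈ 44 cases per container
```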
Procedia PDF Downloads 108
8279 Novel Fluorescent High Density Polyethylene Composites for Fused Deposition Modeling 3D Printing in Packaging Security Features
Authors: Youssef R. Hassan, Mohamed S. Hasanin, Reda M. Abdelhameed
Abstract:
Recently, innovations in packaging security features have become more important for verifying the originality of packaging in industrial applications. Luminescent 3D printing materials are promising because they provide a unique opportunity for the design and application of 3D printing. The weak emission of terbium ions in salt form, a source of green emission, prevents their use in industrial applications, so the search for a stable and highly emissive material has become essential. Nowadays, metal-organic frameworks (MOFs) play an important role in designing light-emitting materials. In this work, fluorescent high-density polyethylene (FHDPE) composite filaments with Tb-benzene 1,3,5-tricarboxylate (Tb-BTC) MOFs for 3D printing have been successfully developed. HDPE pellets were mixed with Tb-BTC and melt-extruded in a single-screw extruder. It was found that Tb-BTC dispersed uniformly in the HDPE matrix and significantly increased the crystallinity of PE, which not only maintained the good thermal properties but also improved the mechanical properties of the Tb-BTC@HDPE composites. Notably, the composite filaments emitted ultra-bright green light under a UV lamp, and the fluorescence intensity increased as the content of Tb-BTC increased. Finally, several brightly luminescent, exquisite articles could be manufactured on a fused deposition modeling (FDM) 3D printer with these new fluorescent filaments. In this context, the development of novel fluorescent Tb-BTC@HDPE composites was combined with 3D printing technology to amplify packaging security features.
Keywords: 3D printing, fluorescent, packaging, security
Procedia PDF Downloads 101
8278 Antioxidant Activity of Probiotic Lactic Acid Bacteria and Their Application in Fermented Milk Products
Authors: Vitheejongjaroen P., Jaisin Y., Pachekrepapol U., Taweechotipatr M.
Abstract:
Lactic acid bacteria (LAB) are the most common type of microorganisms used as probiotics and are known for many beneficial health effects. The antioxidant activity of LAB is associated with numerous health-protective effects. This research aimed to investigate the antioxidant activity of lactic acid bacteria isolated from Thai sour pork sausage for application in fermented milk products. Antioxidant activity determined by the DPPH (2,2-diphenyl-1-picrylhydrazyl) radical scavenging assay showed that isolate FN33-7, one of eight isolates, exhibited scavenging activity of 5-7% in intact cells, 13-16% in supernatant, and 42-48% in intracellular cell-free extract. This isolate was identified by 16S ribosomal DNA sequence analysis as Lactobacillus plantarum. The effect of milk fermented with L. plantarum FN33-7 on microbial count, pH, and syneresis was assessed during a refrigerated storage period of 28 days. The strain showed increased viability, the pH level decreased, and syneresis increased. These results are similar to dairy products fermented with commercial starter cultures. Additionally, microstructure analysis of the fermented milk by fluorescence microscopy showed that the curd structure appeared denser and less porous than that of commercial yogurt. The results of this study indicated that L. plantarum FN33-7 is a good probiotic candidate for use in cultured milk products to reduce the risk of diseases caused by oxidative stress.
Keywords: Lactobacillus plantarum, probiotics, free radical, antioxidant, oxidative stress, fermented milk products
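The scavenging percentages above follow the standard DPPH assay calculation from control and sample absorbances; a one-line sketch, with hypothetical absorbance values:

```python
def dpph_scavenging(abs_control: float, abs_sample: float) -> float:
    """DPPH radical scavenging activity (%), the standard assay formula."""
    return (abs_control - abs_sample) / abs_control * 100

# Hypothetical absorbances at 517 nm: control 0.82, cell-free extract 0.44.
print(f"{dpph_scavenging(0.82, 0.44):.1f}%")  # ≈ 46.3%, within the 42–48% range
```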
Procedia PDF Downloads 132
8277 Textile-Based Sensing System for Sleep Apnea Detection
Authors: Mary S. Ruppert-Stroescu, Minh Pham, Bruce Benjamin
Abstract:
Sleep apnea is a condition in which a person stops breathing during sleep; it can lead to cardiovascular disease, hypertension, and stroke. In the United States, approximately forty percent of overnight sleep apnea detection tests are cancelled. The purpose of this study was to develop a textile-based sensing system that acquires biometric signals relevant to cardiovascular health, transmits them wirelessly to a computer, and quantitatively assesses the signals for sleep apnea detection. Patient interviews, a literature review, and market analysis defined the need for a device that integrates ubiquitously into the patient's lifestyle. A multi-disciplinary research team of biomedical scientists, apparel designers, and computer engineers collaborated to design a textile-based sensing system that gathers ECG, SpO2, and respiration, then wirelessly transmits the signals to a computer in real time. The electronic components were assembled from existing hardware, the Health Kit, which came pre-set with ECG and SpO2 sensors. The respiration belt was purchased separately, and its electronics were built and integrated into the Health Kit motherboard. Analog ECG signals were amplified and transmitted to the Arduino™ board, where the signal was converted from analog to digital. Using textile electrodes, ECG lead II was collected, reflecting the electrical activity of the heart. Signals were collected with the subject in a sitting position at a sampling rate of 250 Hz. Because sleep apnea most often occurs in people with obese body types, prototypes were developed in men's sizes medium, XL, and XXL. To test user acceptance and comfort, wear tests were performed on 12 subjects. Results of the wear tests indicate that the knit fabric and t-shirt-like design were acceptable from both lifestyle and comfort perspectives. The airflow and respiration sensors return good signals regardless of movement intensity. Future work includes reconfiguring the hardware to a smaller size, developing the same type of garment for the female body, and further enhancing signal quality.
Keywords: sleep apnea, sensors, electronic textiles, wearables
Procedia PDF Downloads 275
8276 Kantian Epistemology in Examination of the Axiomatic Principles of Economics: The Synthetic a Priori in the Economic Structure of Society
Authors: Mirza Adil Ahmad Mughal
Abstract:
Transcendental analytics, in the Critique of Pure Reason, combines space and time as conditions of the possibility of the phenomenon from the transcendental aesthetic with the pure magnitude-intuition notion. The property of continuity, as a qualitative result of additive magnitude, brings the possibility of connecting with experience, even though only as a potential, because of the a priori necessity from assumption, as the syntheticity of the a priori task of a scientific method of philosophy given by Kant, which precludes the application of categories to something not empirically reducible to the content of such a category's corresponding possible object. This continuity, as the qualitative result of the a priori constructed notion of magnitude, lies as a fundamental assumption and property of what microeconomic theory calls 'choice rules', which combine the potentially empirical and practical budget-price pairs with preference relations. This latter result is the purely qualitative side of the otherwise autonomously quantitative nature of choice rules. The theoretical, barring the empirical, nature of this qualitative result is a synthetic a priori truth, which it should be, if at all, provided the axiomatic structure of economic theory is held to be correct. It has potentially verifiable content as its possible object in the form of quantitative price-budget pairs. Yet the object that serves the respective Kantian category is itself qualitative, namely utility. This article explores the validity of Kantian qualifications for this application of 'categories' to the economic structure of society.
Keywords: categories of understanding, continuity, convexity, psyche, revealed preferences, synthetic a priori
Procedia PDF Downloads 100
8275 Application of Value Engineering Approach for Improving the Quality and Productivity of Ready-Mixed Concrete Used in Construction and Hydraulic Projects
Authors: Adel Mohamed El-Baghdady, Walid Sayed Abdulgalil, Ahmad Asran, Ibrahim Nosier
Abstract:
This paper studies the effectiveness of applying value engineering to actual concrete mixtures. The study was conducted in the State of Qatar on a number of strategic construction projects with international engineering specifications for the 2022 World Cup. It examined the concrete mixtures of the Doha Metro project and the development of KAHRAMAA's (Qatar Electricity and Water Company) Abu Funtas Strategic Desalination Plant, in order to improve the quality and productivity of ready-mixed concrete used in construction and hydraulic projects generally. The application of value engineering to such concrete mixtures resulted in the following: i) improving the quality of concrete mixtures and increasing the durability of the buildings in which they are used; ii) reducing the waste of excess concrete materials, optimizing the use of resources, and enhancing sustainability; iii) reducing the use of cement, thus reducing CO₂ emissions, which helps protect the environment and public health; iv) reducing the actual costs of concrete mixtures and, in turn, the costs of construction projects; and v) increasing the market share and competitiveness of concrete producers. This research shows that applying the methodology of value engineering to ready-mixed concrete is an effective way to save around 5% of the total cost of concrete mixtures supplied to construction and hydraulic projects, to improve quality according to the technical requirements and the standards and specifications for ready-mixed concrete, to improve the environmental impact, and to promote sustainability.
Keywords: value management, cost of concrete, performance, optimization, sustainability, environmental impact
Procedia PDF Downloads 356
8274 Predictive Analytics in Oil and Gas Industry
Authors: Suchitra Chnadrashekhar
Abstract:
Earlier regarded as a support function within an organization, information technology has now become a critical utility for managing daily operations. Organizations are processing amounts of data that were unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. IT has given the oil and gas industry the leverage to store, manage, and process data in the most efficient way possible, thus deriving economic value in day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity, and security, and by increasing equipment utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, systems approach to asset optimization and thus have functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of the sector. The paper is supported by real-time data and an evaluation of the data for a given oil production asset using an application tool, SAS. The reason for using SAS for our analysis is that it provides an analytics-based framework to improve the uptime, performance, and availability of crucial assets while reducing unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, maintenance problems can be predicted before they happen and root causes determined, so processes can be updated for future prevention.
Keywords: hydrocarbon, information technology, SAS, predictive analytics
Procedia PDF Downloads 361
8273 Remote-Controlled In-Situ Forming Thermo-Sensitive Hydrogel Nanocomposite for Hyperthermia Therapy Application: Synthesis and Characterizations
Authors: Elbadawy A. Kamoun
Abstract:
A magnetically responsive nanocomposite hydrogel (NCH) based on composites of superparamagnetic Fe₃O₄ nanoparticles and a temperature-responsive hydrogel matrix was developed. The nanocomposite hydrogel system, based on temperature-sensitive N-isopropylacrylamide hydrogels crosslinked by poly(ethylene glycol)-400 dimethacrylate (PEG400DMA) and incorporating a chitosan derivative, was synthesized and characterized. The NCH system was synthesized by visible-light free-radical photopolymerization using a carboxylated camphorquinone-amine system, to avoid the well-known risks of UV light, especially in hyperthermia treatment. Superparamagnetic iron oxide nanoparticles were introduced into the hydrogel system through the polymerizing mixture and monomer solution. FT-IR with Raman spectroscopy and wide-angle XRD analysis were used to verify the chemical structure of the NCH and the exfoliation of the nanoparticles, respectively. The morphological structure of the NCH was investigated using SEM and TEM micrographs. The swelling response of the nanocomposite hydrogel system was evaluated under different crosslinking conditions and temperatures, for magnetic field efficiency, and for the effect of the presence of magnetic nanoparticles. Hydrolytic degradation of this system was demonstrated for in vitro application, while the in-vivo release profile is currently under investigation. Moreover, the compatibility and cytotoxicity of the photoinitiating system were investigated in our previous studies. These systems are promising polymeric candidate materials and are expected to have wide applicability in various biomedical applications.
Keywords: hydrogel nanocomposites, temperature-responsive hydrogel, superparamagnetic nanoparticles, hyperthermia therapy
Procedia PDF Downloads 280
8272 A Transformer-Based Approach for Multi-Human 3D Pose Estimation Using Color and Depth Images
Authors: Qiang Wang, Hongyang Yu
Abstract:
Multi-human 3D pose estimation is a challenging task in computer vision which aims to recover the 3D joint locations of multiple people from multi-view images. In contrast to traditional methods, which typically use only color (RGB) images as input, our approach utilizes both the color and depth (D) information contained in RGB-D images. We employ a transformer-based model as the backbone of our approach, which is able to capture long-range dependencies and has been shown to perform well on various sequence modeling tasks. Our method is trained and tested on the Carnegie Mellon University (CMU) Panoptic dataset, which contains a diverse set of indoor and outdoor scenes with multiple people in varying poses and clothing. We evaluate the performance of our model with the standard 3D pose estimation metric of mean per-joint position error (MPJPE). Our results show that the transformer-based approach outperforms traditional methods and achieves competitive results on the CMU Panoptic dataset. We also perform an ablation study to understand the impact of different design choices on the overall performance of the model. In summary, our work demonstrates the effectiveness of a transformer-based approach with RGB-D images for multi-human 3D pose estimation and has potential applications in real-world scenarios such as human-computer interaction, robotics, and augmented reality.
Keywords: multi-human 3D pose estimation, RGB-D images, transformer, 3D joint locations
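MPJPE, the metric named above, is simply the mean Euclidean distance between predicted and ground-truth joints. A minimal NumPy sketch; the array shapes and the toy offset are illustrative, not the paper's data:

```python
import numpy as np

def mpjpe(pred: np.ndarray, gt: np.ndarray) -> float:
    """Mean per-joint position error: average Euclidean distance (e.g., mm)
    between predicted and ground-truth 3D joints.

    pred, gt: arrays of shape (n_people, n_joints, 3).
    """
    return float(np.linalg.norm(pred - gt, axis=-1).mean())

# Sanity check: a uniform 10 mm offset along one axis yields MPJPE = 10.
gt = np.zeros((2, 15, 3))            # 2 people, 15 joints each
pred = gt.copy()
pred[..., 0] += 10.0
print(mpjpe(pred, gt))               # -> 10.0
```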
Procedia PDF Downloads 81
8271 A Combined Approach Based on Artificial Intelligence and Computer Vision for Qualitative Grading of Rice Grains
Authors: Hemad Zareiforoush, Saeed Minaei, Ahmad Banakar, Mohammad Reza Alizadeh
Abstract:
Quality inspection of rice (Oryza sativa L.) during its various processing stages is very important. In this research, an artificial-intelligence-based model coupled with computer vision techniques was developed as a decision support system for qualitative grading of rice grains. For the experiments, 25 samples of rice grains with different levels of percentage of broken kernels (PBK) and degree of milling (DOM) were first prepared, and their qualitative grade was assessed by experienced experts. Then, the quality parameters of the same samples were determined using a machine vision system. A grading model was developed based on fuzzy logic theory in MATLAB software to relate the qualitative characteristics of the product to its quality. In total, 25 rules based on the AND operator and a Mamdani inference system were used for qualitative grading. The fuzzy inference system consisted of two input linguistic variables, DOM and PBK, obtained from the machine vision system, and one output variable (product quality). The model output was finally defuzzified using the Center of Maximum (COM) method. To evaluate the developed model, the output of the fuzzy system was compared with the experts' assessments. The developed model was found to estimate the qualitative grade of the product with an accuracy of 95.74%.
Keywords: machine vision, fuzzy logic, rice, quality
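The study implemented the Mamdani system in MATLAB; a rough Python equivalent with scikit-fuzzy is sketched below. The universes, membership functions, and the two sample rules are assumptions for illustration (the paper's actual 25-rule base is not reproduced), and mean-of-maximum defuzzification stands in for Center of Maximum:

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Inputs from the machine vision system; universes are illustrative.
dom = ctrl.Antecedent(np.arange(0, 101, 1), "dom")   # degree of milling, %
pbk = ctrl.Antecedent(np.arange(0, 51, 1), "pbk")    # % broken kernels
grade = ctrl.Consequent(np.arange(0, 101, 1), "grade")
grade.defuzzify_method = "mom"   # mean-of-maximum ≈ Center of Maximum

# automf labels run from "poor" (low values) to "good" (high values),
# so pbk["poor"] means few broken kernels.
dom.automf(3)
pbk.automf(3)
grade["low"] = fuzz.trimf(grade.universe, [0, 0, 50])
grade["medium"] = fuzz.trimf(grade.universe, [25, 50, 75])
grade["high"] = fuzz.trimf(grade.universe, [50, 100, 100])

# Two illustrative AND-rules out of a would-be 25-rule base.
rules = [
    ctrl.Rule(dom["good"] & pbk["poor"], grade["high"]),
    ctrl.Rule(dom["poor"] & pbk["good"], grade["low"]),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["dom"], sim.input["pbk"] = 90, 3
sim.compute()
print(sim.output["grade"])   # crisp grade for a well-milled, intact sample
```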
Procedia PDF Downloads 421
8270 A Survey of Field Programmable Gate Array-Based Convolutional Neural Network Accelerators
Authors: Wei Zhang
Abstract:
With the rapid development of deep learning, neural networks and deep learning algorithms play a significant role in various practical applications. Owing to their high accuracy and good performance, Convolutional Neural Networks (CNNs) in particular have become a research hotspot in the past few years. However, network sizes are becoming increasingly large to meet the demands of practical applications, which poses a significant challenge for constructing high-performance implementations of deep learning neural networks. Many of these application scenarios also place strict requirements on the performance and power consumption of hardware devices. It is therefore particularly critical to choose a suitable computing platform for hardware acceleration of CNNs. This article surveys recent advances in Field Programmable Gate Array (FPGA)-based acceleration of CNNs. Various designs and implementations of FPGA-based accelerators across different devices and network models are reviewed, and Graphics Processing Units (GPUs), Application-Specific Integrated Circuits (ASICs), and Digital Signal Processors (DSPs) are compared to present our own critical analysis and comments. Finally, we discuss different perspectives on these acceleration and optimization methods on FPGA platforms to further explore the opportunities and challenges for future research, and we offer an outlook on the future development of FPGA-based accelerators.
Keywords: deep learning, field programmable gate array, FPGA, hardware accelerator, convolutional neural networks, CNN
Procedia PDF Downloads 129
8269 A Physiological Approach for Early Detection of Hemorrhage
Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain
Abstract:
Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum-related deaths. Early detection of hemorrhage remains the most effective strategy to reduce the mortality rate caused by traumatic injuries. In this study, we investigated physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e., the moderate simulated hemorrhage stage, resulted in significant changes in HR (85 ± 4 bpm vs. 68 ± 5 bpm, p < 0.01), ST (191 ± 10 ms vs. 253 ± 31 ms, p < 0.05), LVET (350 ± 14 ms vs. 479 ± 47 ms, p < 0.05), and DT (551 ± 22 ms vs. 683 ± 59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals for detecting moderate hemorrhage. In the future, we will analyze all the LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning
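Beat-to-beat HR, the first feature listed, is derived from the intervals between successive ECG R peaks. A minimal sketch with a naive peak detector is given below; the amplitude and spacing thresholds are assumptions, and a real pipeline would add filtering (e.g., Pan-Tompkins) first:

```python
import numpy as np
from scipy.signal import find_peaks

def beat_to_beat_hr(ecg: np.ndarray, fs: float = 250.0) -> np.ndarray:
    """Beat-to-beat heart rate (bpm) from ECG via simple R-peak detection."""
    # R peaks: at least 0.3 s apart (< 200 bpm) and above an amplitude gate.
    peaks, _ = find_peaks(ecg, distance=int(0.3 * fs),
                          height=0.6 * np.max(ecg))
    rr = np.diff(peaks) / fs      # RR intervals in seconds
    return 60.0 / rr              # instantaneous HR per beat

# Synthetic 75-bpm spike train at the study's 250 Hz for a quick check.
fs = 250.0
ecg = np.zeros(int(fs * 10))
ecg[::int(fs * 60 / 75)] = 1.0
print(beat_to_beat_hr(ecg, fs)[:3])   # ≈ [75. 75. 75.]
```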
Procedia PDF Downloads 167
8268 Global Analysis of Modern Economic Sanctions
Authors: I. L. Yakushev
Abstract:
Economic sanctions are an integral part of the foreign policy repertoire of states. Increasingly, states and international organizations resort to sanctions to address a variety of issues, from fighting corruption to preventing the use of nuclear weapons. Over time, the ways in which economic sanctions are used have changed, especially over the past two decades. In the late 1990s, recognition of the humanitarian harm of economic sanctions and the "War on Terrorism" after the events of September 11, 2001, led to serious changes in the structure and mechanisms of their application. Our answers to questions about how these coercive tools work, when they are applied, what consequences they have, and when they succeed are still shaped by research conducted in the second half of the 20th century. The conclusions drawn from past cases of sanctions may not be fully applicable to current sanctions policy. In the second half of the 20th century, most cases of sanctions involved the United States and covered restrictions on international trade. Over the past two decades, however, the European Union, the United Nations, and China have also been major initiators of sanctions. Modern sanctions include targeted and financial restrictions and are applied against individuals, organizations, and companies. Changing the senders, targets, stakeholders, and economic instruments used in sanctions policy has serious implications for effectiveness and results. The regulatory and bureaucratic infrastructure necessary to implement and comply with modern economic sanctions has become more robust. This evolution of sanctions has given the scholarly community an opportunity to study new questions of coercion and to return to old ones. The economic sanctions research program should be developed so that it remains relevant for understanding the application of modern sanctions and their consequences.
Keywords: global analysis, economic sanctions, targeted sanctions, foreign policy, domestic policy, United Nations, European Union, USA, economic pressure
Procedia PDF Downloads 61
8267 Concentrated Whey Protein Drink with Orange Flavor: Protein Modification and Formulation
Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh
Abstract:
The application of whey protein in the drink industry to enhance the nutritional value of products is important, but the gelation of the protein during thermal treatment and shelf life limits its application. The main goal of this research was therefore to manufacture a highly concentrated whey protein orange drink with an appropriate shelf life. Whey protein was hydrolyzed from 5% to 30% in 5% intervals (six stages), and then the thermal stability of samples with a 10% protein concentration was tested under acidic conditions (T = 90 °C, pH 4.2, 5 minutes) and neutral conditions (T = 120 °C, pH 6.7, 20 minutes). Furthermore, to study the shelf life of heat-treated samples over 4 months at 4 and 24 °C, time-sweep rheological tests were performed. Under neutral conditions, the 5-20% hydrolyzed samples gelled during thermal treatment, whereas under acidic conditions this happened only in the 5-10% hydrolyzed samples. This phenomenon could be related to differences in the hydrodynamic radius and zeta potential of samples with different levels of hydrolysis under acidic and neutral conditions. To study the gelation of the heat-resistant protein solutions during shelf life, time-sweep analyses were run over 4 months at 7-day intervals. Crossover was observed for all heat-resistant neutral samples at both storage temperatures, while it was not seen in the heat-resistant acidic samples with degrees of hydrolysis of 25% and 30% at 4 and 20 °C. It could be concluded that the latter samples were stable during heat treatment and 4 months of storage, which makes them a good choice for manufacturing high-protein drinks. A Scheffé polynomial model and numerical optimization were employed for modeling and optimizing the high-protein orange drink formula. The Scheffé model significantly predicted the overall acceptance index of the sensory analysis (p < 0.05). The coefficient of determination (R²) of 0.94, the adjusted coefficient of determination (R²adj) of 0.90, the insignificance of the lack-of-fit test, and an F value of 64.21 showed the accuracy of the model. Moreover, the coefficient of variation (CV) was 6.8%, which suggests the replicability of the experimental data. The desirability function reached 0.89, which indicates the high accuracy of the optimization. The optimum formulation was found to be: modified whey protein solution (65.30%), natural orange juice (33.50%), stevia sweetener (0.05%), orange peel oil (0.15%), and citric acid (1%). It is worth mentioning that this study produced an appropriate model for the application of whey protein in the drink industry without bitter flavor or gelation during heat treatment and shelf life.
Keywords: crossover, orange beverage, protein modification, optimization
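A Scheffé mixture polynomial is an ordinary regression on the component proportions without an intercept, optionally with pairwise blending terms. The sketch below fits a reduced form with statsmodels; all proportions and acceptance scores are hypothetical, not the study's data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical mixture runs: proportions of whey solution (w), orange juice
# (j), and minor ingredients (m) sum to 1; y is the overall acceptance score.
df = pd.DataFrame({
    "w": [0.70, 0.65, 0.60, 0.74, 0.67, 0.62, 0.72, 0.64],
    "j": [0.28, 0.32, 0.38, 0.22, 0.30, 0.36, 0.25, 0.33],
    "m": [0.02, 0.03, 0.02, 0.04, 0.03, 0.02, 0.03, 0.03],
    "y": [7.1, 7.8, 7.5, 6.9, 8.2, 7.6, 7.3, 8.0],
})
# Reduced Scheffé quadratic: linear blending terms (no intercept) plus one
# pairwise term capturing whey-juice synergy or antagonism.
model = smf.ols("y ~ w + j + m + w:j - 1", data=df).fit()
print(model.params)
print(f"R² = {model.rsquared:.3f}")
```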
Procedia PDF Downloads 62
8266 Secure Automatic Key SMS Encryption Scheme Using Hybrid Cryptosystem: An Approach for One Time Password Security Enhancement
Authors: Pratama R. Yunia, Firmansyah, I., Ariani, Ulfa R. Maharani, Fikri M. Al
Abstract:
Nowadays, although the role of SMS as a means of communication has been largely replaced by online applications such as WhatsApp and Telegram, SMS is indisputably still used for certain important communication needs. Among them is sending one-time passwords (OTPs) as an authentication medium for various online applications, from chatting and shopping to online banking. However, SMS usage hardly guarantees the security of transmitted messages: messages transmitted between base transceiver stations (BTS) are still in plaintext, making them extremely vulnerable to eavesdropping, especially when a message is confidential, as an OTP is. One solution to this problem is an SMS application that provides security services for each transmitted message. In response, this study designed an automatic-key SMS encryption scheme as a means to secure SMS communication. The proposed scheme allows SMS sending that is automatically encrypted with constantly changing keys (automatic key update), automatic key exchange, and automatic key generation. As its security method, the proposed scheme applies cryptographic techniques with a hybrid cryptosystem mechanism. As a proof of concept, a client-to-client SMS encryption application was developed on the Java platform with AES-256 as the encryption algorithm, RSA-768 as the public and private key generator, and SHA-256 as the message hashing function. The result of this study is a secure automatic-key SMS encryption scheme using a hybrid cryptosystem which can guarantee the security of every transmitted message and thus become a reliable solution for sending confidential messages through SMS, although it still has weaknesses in terms of processing time.
Keywords: encryption scheme, hybrid cryptosystem, one time password, SMS security
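The paper's implementation is in Java; the sketch below shows the same hybrid pattern in Python with the cryptography package, as an illustration rather than the authors' code. It assumes a 2048-bit RSA key (the paper's 768-bit keys are considered too weak today) and uses AES-GCM, whose built-in authentication tag plays the integrity role the paper assigns to SHA-256 hashing:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver's RSA key pair (2048-bit here; the paper used RSA-768).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: fresh AES-256 session key per message -> "automatic key update".
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"OTP: 493021", None)
wrapped_key = public_key.encrypt(session_key, oaep)
# Transmit (wrapped_key, nonce, ciphertext) in the SMS payload.

# Receiver: unwrap the session key with RSA, then decrypt with AES-GCM.
key = private_key.decrypt(wrapped_key, oaep)
print(AESGCM(key).decrypt(nonce, ciphertext, None))  # b'OTP: 493021'
```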
Procedia PDF Downloads 130
8265 Pragmatic Development of Chinese Sentence Final Particles via Computer-Mediated Communication
Authors: Qiong Li
Abstract:
This study investigated under which conditions computer-mediated communication (CMC) can promote pragmatic development. The focal features were four Chinese sentence-final particles (SFPs): a, ya, ba, and ne. They occur frequently in Chinese and function as mitigators to soften the tone of speech. However, L2 acquisition of SFPs is difficult, suggesting the necessity of additional exposure to, or explicit instruction on, Chinese SFPs. This study follows this line and explores two research questions: (1) Is CMC combined with data-driven instruction more effective than CMC alone in promoting L2 Chinese learners' SFP use? (2) How does L2 Chinese learners' SFP use change over time compared to the production of native Chinese speakers? The study involved 19 intermediate-level learners of Chinese enrolled at a private American university. They were randomly assigned to two groups: (1) the control group (N = 10), which was exposed to SFPs through CMC alone, and (2) the treatment group (N = 9), which was exposed to SFPs via CMC and data-driven instruction. Learners interacted with native speakers on given topics through text-based CMC over Skype. Both groups went through six 30-minute CMC sessions on a weekly basis, with a one-week interval after the first two CMC sessions and a two-week interval after the second two CMC sessions (nine weeks in total). The treatment group additionally received data-driven instruction after the first two sessions. Data analysis focused on three indices: token frequency, type frequency, and acceptability of SFP use. Token frequency was operationalized as the raw occurrence of SFPs per clause. Type frequency was the range of SFPs. Acceptability was rated by two native speakers using a rating rubric. The results showed that the treatment group made noticeable progress over time on all three indices; their production of SFPs approximated the native-like level. In contrast, the control group improved only slightly on token frequency, and only certain SFPs (a and ya) reached native-like use. Potential explanations for the group differences were discussed with respect to the properties of Chinese SFPs and the roles of CMC and data-driven instruction. Though CMC provided the learners with opportunities to notice and observe SFP use, SFPs, as features of low saliency, were not easily noticed in the input. Data-driven instruction in the treatment group directed the learners' attention to these particles, which facilitated development.
Keywords: computer-mediated communication, data-driven instruction, pragmatic development, second language Chinese, sentence final particles
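The two frequency indices are straightforward to compute once chat transcripts are segmented into clauses. A minimal sketch over a toy, pinyin-romanized excerpt (the excerpt is invented, not study data):

```python
# Token frequency (SFP occurrences per clause) and type frequency (range of
# distinct SFPs used) for a toy chat excerpt.
SFPS = ("a", "ya", "ba", "ne")

clauses = [
    "ni chi fan le ma",   # no target SFP
    "zou ba",             # ba
    "hao a",              # a
    "ni ne",              # ne
]
tokens = [w for c in clauses for w in c.split() if w in SFPS]
token_freq = len(tokens) / len(clauses)   # occurrences per clause
type_freq = len(set(tokens))              # distinct SFP types used
print(token_freq, type_freq)              # 0.75 3
```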
Procedia PDF Downloads 418
8264 FloodNet: Classification for Post-Flood Scenes with a High-Resolution Aerial Imagery Dataset
Authors: Molakala Mourya Vardhan Reddy, Kandimala Revanth, Koduru Sumanth, Beena B. M.
Abstract:
Emergency response and recovery operations are severely hampered by natural catastrophes, especially floods. Understanding post-flood scenes is essential to disaster management because it facilitates quick evaluation and decision-making. To this end, we introduce FloodNet, a new high-resolution aerial image collection created especially for understanding post-flood scenes. FloodNet comprises a varied collection of high-quality aerial photos taken during and after flood events, offering comprehensive representations of flooded landscapes, damaged infrastructure, and altered topographies. The dataset covers a variety of environmental conditions and geographic regions, providing a thorough resource for training and assessing computer vision models designed to handle the complexity of post-flood scenarios. The images in FloodNet are labeled with pixel-level semantic segmentation masks, allowing a detailed examination of flood-related features, including debris, water bodies, and damaged structures. Furthermore, temporal and positional metadata improve the dataset's usefulness for longitudinal research and spatiotemporal analysis. For tasks like flood extent mapping, damage assessment, and infrastructure recovery projection, we provide baselines and evaluation metrics to promote research and development in post-flood scene understanding. Integrating FloodNet into machine learning pipelines will make it easier to create reliable algorithms that help policymakers, urban planners, and first responders make decisions both before and after floods. The FloodNet dataset aims to support advances in computer vision, remote sensing, and disaster response technologies by providing a useful resource for researchers; by tackling the particular problems presented by post-flood situations, it helps create solutions for boosting communities' resilience in the face of natural catastrophes.
Keywords: image classification, segmentation, computer vision, natural disaster, unmanned aerial vehicle (UAV), machine learning
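A standard evaluation metric for the pixel-level segmentation masks described above is mean intersection-over-union (mIoU). A minimal NumPy sketch; the class count and toy masks are illustrative, not FloodNet's label set:

```python
import numpy as np

def mean_iou(pred: np.ndarray, gt: np.ndarray, n_classes: int) -> float:
    """Mean intersection-over-union for semantic segmentation class maps."""
    ious = []
    for c in range(n_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union:                     # skip classes absent from both masks
            ious.append(inter / union)
    return float(np.mean(ious))

gt = np.array([[0, 0, 1], [0, 2, 2]])
pred = np.array([[0, 1, 1], [0, 2, 2]])
print(mean_iou(pred, gt, n_classes=3))  # (2/3 + 1/2 + 1) / 3 ≈ 0.72
```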
Procedia PDF Downloads 81
8263 Screening Deformed Red Blood Cells Irradiated by Ionizing Radiations Using Windowed Fourier Transform
Authors: Dahi Ghareab Abdelsalam Ibrahim, R. H. Bakr
Abstract:
Ionizing radiation, such as gamma radiation and X-rays, has many applications in medical diagnosis and cancer treatment. In this paper, we used the windowed Fourier transform to extract the complex image of deformed red blood cells. The real values of the complex image are used to extract the best fit of the deformed cell boundary. Male albino rats were irradiated by γ-rays from ⁶⁰Co. The rats were anesthetized with ether, and blood samples were then collected from the eye vein in heparinized capillary tubes to study the radiation-damaging effect in vivo with the proposed windowed Fourier transform. Peripheral blood films were prepared according to the Brown method and photographed using an Automatic Image Contour Analysis system (SAMICA) from ELBEK-Bildanalyse GmbH, Siegen, Germany. The SAMICA system is equipped with an electronic camera connected to a computer through a built-in interface card, and the image can be magnified up to 1200 times and displayed on the computer. The images of the peripheral blood films were then analyzed by the windowed Fourier transform method to extract the precise deformation from the best fit. Based on accurate evaluation of red blood cell deformation, diseases can be diagnosed in their early stages.
Keywords: windowed Fourier transform, red blood cells, phase wrapping, image processing
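The windowed Fourier transform takes the FFT of the signal under a sliding window, yielding complex coefficients that localize frequency content in space. A minimal 1D sketch with a Gaussian (Gabor) window is given below; a 2D version would apply the same idea along both image axes, and the window width, step, and test signal are assumptions:

```python
import numpy as np

def windowed_ft(signal: np.ndarray, sigma: float = 8.0, step: int = 4):
    """Windowed (Gabor) Fourier transform of a 1D profile, e.g., an
    intensity trace along a cell boundary. Returns a complex map of
    shape (window positions, frequencies)."""
    n = len(signal)
    t = np.arange(n)
    rows = []
    for center in range(0, n, step):
        window = np.exp(-0.5 * ((t - center) / sigma) ** 2)  # Gaussian
        rows.append(np.fft.fft(signal * window))
    return np.array(rows)

# Two-tone test signal: the local spectrum shifts with position.
t = np.arange(512)
sig = np.where(t < 256,
               np.sin(2 * np.pi * 0.05 * t),
               np.sin(2 * np.pi * 0.15 * t))
coeffs = windowed_ft(sig)
print(coeffs.shape)  # (128, 512) complex coefficients
```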
Procedia PDF Downloads 85
8262 Data Science in Military Decision-Making: A Semi-Systematic Literature Review
Authors: H. W. Meerveld, R. H. A. Lindelauf
Abstract:
In contemporary warfare, data science is crucial for the military in achieving information superiority. Yet, to the authors' knowledge, no extensive literature survey on data science in military decision-making has been conducted so far. In this study, 156 peer-reviewed articles were analysed through an integrative, semi-systematic literature review to gain an overview of the topic. The study examined to what extent the literature focuses on the opportunities or the risks of data science in military decision-making, differentiated per level of war (i.e., the strategic, operational, and tactical levels). A relatively large focus on the risks of data science was observed in the social science literature, implying that political and military policymakers are disproportionately influenced by a pessimistic view of the application of data science in the military domain. The perceived risks of data science are, however, hardly addressed in the formal science literature. This means that concerns about the military application of data science do not reach the audience that can actually develop and enhance data science models and algorithms. Cross-disciplinary research on both the opportunities and the risks of military data science can address the observed research gaps. Considering the levels of war, relatively little attention was observed for the operational level compared to the other two levels, suggesting a research gap with respect to military operational data science. Opportunities for military data science mostly arise at the tactical level; by contrast, studies examining strategic issues mostly emphasise the risks of military data science. Consequently, domain-specific requirements for military strategic data science applications are hardly articulated. Lacking such applications may ultimately lead to suboptimal strategic decisions in today's warfare.
Keywords: data science, decision-making, information superiority, literature review, military
Procedia PDF Downloads 170
8261 Multicollinearity and MRA in Sustainability: Application of the Raise Regression
Authors: Claudia García-García, Catalina B. García-García, Román Salmerón-Gómez
Abstract:
Much economic-environmental research includes the analysis of possible interactions by using Moderated Regression Analysis (MRA), a specific application of multiple linear regression analysis. This methodology analyzes how the effect of one independent variable is moderated by a second independent variable by adding a cross-product term between them as an additional explanatory variable. Due to the very specification of the methodology, the moderated factor is often highly correlated with the constitutive terms, so severe multicollinearity problems arise. Strong multicollinearity in a model has important consequences: estimator variances may be inflated; regressors may appear non-significant even though the coefficient of determination is very high; coefficients may show incorrect signs; and results become highly sensitive to small changes in the dataset. Finally, the high correlation among explanatory variables makes it difficult to isolate the individual effect of each one on the model under study. Carried over to moderated analysis, these consequences may imply that it is not worth including an interaction term that may be distorting the model. It is therefore important to manage the problem with a methodology that yields reliable results. A review of the works that applied MRA in the ten top journals of the field makes clear that multicollinearity is mostly disregarded: fewer than 15% of the reviewed works take potential multicollinearity problems into account. To overcome the issue, this work studies the application of recent methodologies to MRA; in particular, raise regression is analyzed. This methodology mitigates collinearity from a geometrical point of view: the collinearity problem arises because the variables under study are very close geometrically, so by separating the variables, the problem can be mitigated. Raise regression maintains the available information and modifies the problematic variables instead of, for example, deleting variables. Furthermore, the global characteristics of the initial model are maintained (sum of squared residuals, estimated variance, coefficient of determination, global significance test, and prediction). The proposal is applied to data from countries of the European Union, for the last year available, on greenhouse gas emissions, per capita GDP, and a dummy variable representing the topography of the country. The use of a dummy variable as the moderator is a special variant of MRA, sometimes called "subgroup regression analysis." The main conclusion of this work is that applying new techniques to the field can substantially improve the results of the analysis. In particular, raise regression mitigates severe multicollinearity problems, so the researcher is able to rely on the interaction term when interpreting the results of a particular study.
Keywords: multicollinearity, MRA, interaction, raise
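Raise regression itself is not available in mainstream Python packages, but the MRA setup and the collinearity it induces are easy to reproduce. The sketch below builds a moderated model with a dummy moderator (all data simulated, not the EU dataset used in the paper) and reports variance inflation factors for the constitutive and interaction terms:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 27                                  # e.g., EU member states
gdp = rng.normal(35, 10, n)             # per-capita GDP (hypothetical units)
flat = rng.integers(0, 2, n)            # dummy moderator: topography
ghg = 5 + 0.3 * gdp + 1.5 * flat + 0.2 * gdp * flat + rng.normal(0, 1, n)
df = pd.DataFrame({"ghg": ghg, "gdp": gdp, "flat": flat})

# MRA: the cross-product term is highly collinear with its constituents.
mra = smf.ols("ghg ~ gdp + flat + gdp:flat", data=df).fit()
print(mra.summary().tables[1])

X = pd.DataFrame({"const": 1.0, "gdp": gdp, "flat": flat,
                  "gdp_x_flat": gdp * flat})
vifs = [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])]
print(dict(zip(X.columns[1:], np.round(vifs, 1))))  # inflated VIFs expected
```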
Procedia PDF Downloads 107
8260 Bioaccessible Phenolics, Phenolic Bioaccessibility and Antioxidant Activity of Pumpkin Flour
Authors: Emine Aydin, Duygu Gocmen
Abstract:
Pumpkin flour (PF) has a long shelf life and can be used as a nutritive, functional (antioxidant properties, phenolic contents, etc.), and coloring agent in many food items, especially bakery products, sausages, instant noodles, pasta, and flour mixes. Pre-treatment before drying is one of the most important factors affecting the quality of a final powdered product. Pre-treatment, such as soaking in a bisulfite solution, ensures that the total carotenoids in raw materials rich in carotenoids, especially pumpkins, are retained in the dried product. This is due to the beneficial effect of antioxidant additives in protecting carotenoids in dehydrated plant foods: the oxygen present in the medium is scavenged by SO₂, and thus the carotene degradation caused by molecular oxygen is inhibited. In this study, pumpkin flours (PFs) were produced by two different treatments (with or without metabisulfite pre-treatment) and then dried in a freeze dryer. The phenolic contents and antioxidant activities of the pumpkin flours were determined. In addition, the bioaccessible phenolic compounds of PF were investigated using in vitro methods. Research in recent years has established that not all nutrients ingested with foodstuffs are bioavailable; bioavailability changes depending on the physical properties, chemical composition, and individual digestive capacity. Therefore, bioaccessible phenolics and phenolic bioaccessibility were also determined in this study. The phenolic contents of the samples with metabisulfite application were higher than those of the samples without it; soaking in metabisulfite solution might have a protective effect on phenolic compounds. The phenolic bioaccessibility of the pumpkin flours was investigated in order to assess pumpkin flour as a source of accessible phenolics. Higher bioaccessible phenolics (384.19 mg GAE 100 g⁻¹ DW) and phenolic bioaccessibility values (33.65 mL 100 mL⁻¹) were observed in the pumpkin flour with metabisulfite pre-treatment; metabisulfite application caused an increase in the bioaccessible phenolics of pumpkin flour. According to the results of all assays (ABTS, CUPRAC, DPPH, and FRAP), both the free and bound phenolics of pumpkin flour with metabisulfite pre-treatment had higher antioxidant activity than those of the sample without pre-treatment, possibly due to the higher phenolic contents of the metabisulfite-treated samples. As a result, metabisulfite application increased the phenolic contents, bioaccessible phenolics, phenolic bioaccessibility, and antioxidant activities of pumpkin flour. Pumpkin flour can thus be used as an alternative functional and nutritional ingredient in bakery products, dairy products (yoghurt, ice cream), soups, sauces, infant formulae, confectionery, etc.
Keywords: pumpkin flour, bioaccessible phenolics, phenolic bioaccessibility, antioxidant activity
Procedia PDF Downloads 326
8259 Variability Management of Contextual Feature Model in Multi-Software Product Line
Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz
Abstract:
The Software Product Line (SPL) paradigm is used for developing families of software products that share common and variable features. A feature model is a domain artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs can contain numbers of similar common and variable features, as in mobile phones and tablets. Reusing common and variable features from different SPL domains is a complex task due to the external relationships and constraints between features in the feature model. To increase the reusability of feature model resources from domain engineering, the commonality of features must be managed at the level of SPL application development. In this research, we propose an approach that combines multiple SPLs into a single domain and converts them into a common feature model. Extracting the common features from different feature models is more effective and reduces the cost and time to market of application development. To extract features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints (a simplified sketch is given below). With this approach, the reusability of features across multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
Keywords: software product line, feature model, variability management, multi-SPLs
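The three-step merge can be illustrated with plain Python sets, although this is a heavy simplification: real feature models also carry mandatory/optional and alternative-group relations, and all features and constraints below are invented examples:

```python
# Minimal sketch of merging two feature models. A feature model is reduced
# here to a set of features plus cross-tree constraints.
phone = {"features": {"call", "camera", "wifi", "sim", "storage"},
         "constraints": {("camera", "requires", "storage")}}
tab = {"features": {"camera", "wifi", "stylus", "storage"},
       "constraints": {("stylus", "excludes", "sim")}}

# Step 1: variation points = features not shared by every model.
common = phone["features"] & tab["features"]
variation_points = (phone["features"] | tab["features"]) - common
# Step 2: collect constraints from all source models.
constraints = phone["constraints"] | tab["constraints"]
# Step 3: combine into a single feature model.
merged = {"common": common, "variable": variation_points,
          "constraints": constraints}
print(merged)
```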
Procedia PDF Downloads 70
8258 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations
Authors: Ram Mohan, Richard Haney, Ajit Kelkar
Abstract:
Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that limit GPU(s) for High Performance Computing (HPC) can be categorized as hardware- or software-oriented in nature. Understanding how these factors affect performance is essential for developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on analyzing and understanding the intrinsic interrelationship of both hardware and software categories with computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core; its characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of linear systems of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance
Procedia PDF Downloads 3658257 The Application of Lesson Study Model in Writing Review Text in Junior High School
Authors: Sulastriningsih Djumingin
Abstract:
This study has three objectives. First, it aims at describing the ability of the second-grade students at SMPN 18 Makassar to write review text without applying the Lesson Study model. Second, it seeks to describe the ability of the second-grade students to write review text by applying the Lesson Study model. Third, it aims at testing the effectiveness of the Lesson Study model in writing review text at SMPN 18 Makassar. This research was a true experimental design with a posttest-only group design involving two groups: one control class and one experimental class. The research population was all the second-grade students at SMPN 18 Makassar, amounting to 250 students in 8 classes. The sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, consisting of 30 students. The research instruments were observation and tests. The collected data were analyzed using descriptive statistical techniques and inferential statistical techniques of the t-test type, processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) scored more than 7.5, categorized as inadequate; (2) in the experimental class, 26 (87%) students reached the score of 7.5, categorized as adequate; (3) the Lesson Study model is effective when applied to writing review text. The comparison of the abilities of the control class and the experimental class indicates that the value of t-count is greater than the value of t-table (2.411 > 1.667), meaning that the alternative hypothesis (H1) proposed by the researcher is accepted.Keywords: application, lesson study, review text, writing
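The decision rule quoted in the abstract (t-count of 2.411 against a t-table value of 1.667, which matches the one-tailed critical value for roughly 58 degrees of freedom) corresponds to an independent two-sample t-test. The sketch below reproduces that computation with invented placeholder scores, not the study's data.

```python
# Independent two-sample t-test of the kind reported in the abstract.
# The score lists are hypothetical placeholders, not the study's data.
from scipy import stats

control = [7.0, 6.5, 7.5, 6.0, 7.2, 6.8] * 5        # 30 hypothetical scores
experimental = [8.0, 7.6, 8.2, 7.9, 7.4, 8.5] * 5   # 30 hypothetical scores

t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# Decision rule from the study: reject H0 when t-count > t-table (1.667).
```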
Procedia PDF Downloads 2038256 Aging-Related Changes in Calf Muscle Function: Implications for Venous Hemodynamic and the Role of External Mechanical Activation
Authors: Bhavatharani S., Boopathy V., Kavin S., Naveethkumar R.
Abstract:
Context: Resistance training with blood flow restriction (BFR) has seen increased use in clinical rehabilitation due to the substantial benefits observed in augmenting muscle mass and strength using low loads. However, there is great variability in the training pressures applied to clinical populations, as well as in the methods used to estimate them. The aim of this study was to estimate the percentage of maximal BFR that could result from applying different methodologies based on arbitrary or individual occlusion levels, using a cuff width between 9 and 13 cm. Design: A secondary analysis was performed on the combined databases of 2 previous larger studies using BFR training. Methods: To estimate these percentages, the occlusion values needed to reach complete BFR (100% limb occlusion pressure [LOP]) were estimated by Doppler ultrasound. Seventy-five participants (age: 24.32 [4.86] y; weight: 78.51 [14.74] kg; height: 1.77 [0.09] m) were enrolled in the laboratory study for measuring LOP in the thigh, arm, or calf. Results: When arbitrary values of restriction are applied, a supra-occlusive pressure between 120% and 190% of LOP may result. Furthermore, applying 130% of the resting brachial systolic blood pressure creates an occlusive stimulus similar to 100% LOP. Conclusions: Methods using 100 mm Hg or the resting brachial systolic blood pressure could represent the safest prescriptions, as they resulted in applied pressures between 60% and 80% of LOP. One hundred thirty percent of the resting brachial systolic blood pressure could be used to indirectly estimate 100% LOP at cuff widths between 9 and 13 cm. Finally, methodologies that use standard values of 200 and 300 mm Hg far exceed LOP and may carry additional risk during BFR exercise.Keywords: lower limb rehabilitation, ESP32, pneumatics for medical, programmed rehabilitation
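The prescription arithmetic in the conclusions can be made concrete with a short worked example. The relation that 130% of resting brachial systolic blood pressure approximates 100% LOP is taken from the abstract; the participant values below are hypothetical.

```python
# Worked example of the prescription arithmetic discussed above.
# Only the "130% of resting SBP ~ 100% LOP" relation comes from the
# abstract; the participant is hypothetical.

def applied_percent_lop(cuff_pressure_mmhg, lop_mmhg):
    """Express an applied cuff pressure as a percentage of the measured LOP."""
    return 100.0 * cuff_pressure_mmhg / lop_mmhg

resting_sbp = 120.0                  # hypothetical resting brachial SBP, mm Hg
estimated_lop = 1.30 * resting_sbp   # 130% SBP ~ 100% LOP, i.e. 156 mm Hg

for cuff in (100.0, resting_sbp, 200.0, 300.0):
    pct = applied_percent_lop(cuff, estimated_lop)
    print(f"{cuff:5.0f} mm Hg -> {pct:5.1f}% LOP")
```

For this hypothetical participant, 100 mm Hg lands near 64% of LOP and the resting systolic pressure near 77%, inside the 60% to 80% range the abstract identifies as safest, while 200 and 300 mm Hg far exceed 100% LOP.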
Procedia PDF Downloads 848255 Analysis of the Cutting Force with Ultrasonic Assisted Manufacturing of Steel (S235JR)
Authors: Philipp Zopf, Franz Haas
Abstract:
Manufacturing of very hard and refractory materials like ceramics, glass, or carbide poses particular challenges for tools and machines. For this application area, the company Sauer GmbH developed ultrasonic tool holders that work in a frequency range from 15 to 60 kHz and superimpose an oscillation along the vertical axis on the common tool movement. This technique causes a structural weakening in the contact area and facilitates the machining. The force reduction of up to 30 percent achievable for these special materials, especially in drilling of carbide with diamond tools, motivated the authors to try to expand the application range of this method. To make the results evaluable, the authors decided to start with existing processes in which the positive influence of ultrasonic assistance has been proven, in order to understand the mechanism. The contrast with the grinding process the institute uses to machine the materials mentioned above could not be greater: in the first case, the tools have geometrically undefined edges; in the second case, the edges are geometrically defined. To obtain valid results, the authors investigated two manufacturing methods, drilling and milling. The main target of the investigation is to reduce the cutting force, measured with a force measurement platform underneath the workpiece. Owing to the direction of the ultrasonic assistance, the authors expect lower cutting forces and longer tool life in the drilling process. To verify the frequencies and amplitudes, an FFT analysis is performed. It shows that damping increases with the infeed rate of the tool, accompanied by a reduction in the amplitude of the cutting force.Keywords: drilling, machining, milling, ultrasonic
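The FFT check mentioned at the end of the abstract can be sketched as follows. The signal here is synthetic (a 20 kHz excitation within the tool holders' 15 to 60 kHz range, plus noise) and the sampling rate is assumed; the real analysis would run on the force-platform recording.

```python
# Sketch of the FFT analysis of a cutting-force signal; all signal
# parameters are assumed, not taken from the experiments.
import numpy as np

fs = 200_000                      # sampling rate, Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)    # 10 ms of signal
f_us = 20_000                     # assumed ultrasonic assistance frequency, Hz
force = 50 + 5 * np.sin(2 * np.pi * f_us * t) \
        + np.random.normal(0, 0.5, t.size)   # static load + oscillation + noise

spectrum = np.abs(np.fft.rfft(force)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak = freqs[np.argmax(spectrum[1:]) + 1]      # skip the DC bin
print(f"dominant component at {peak:.0f} Hz")  # ~20 kHz, the excitation
```

A drop in the amplitude of this excitation peak at higher infeed rates would correspond to the increasing damping reported above.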
Procedia PDF Downloads 274