Search results for: forming tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4870

3940 Determining Which Material Properties Resist the Tool Wear When Machining Pre-Sintered Zirconia

Authors: David Robert Irvine

Abstract:

In the dental restoration sector, there has been a shift to using zirconia. With the ever-increasing need to reduce lead times and deliver restorations faster, the zirconia is machined in its pre-sintered state instead of grinding the very hard sintered state. As with all machining, there is tool wear, and while investigating the tooling used to machine pre-sintered zirconia it became apparent that the wear rate depends more on material build-up and abrasion than on plastic deformation, as is typical in conventional metal machining. It also came to light that the tool material currently cannot be selected on the basis of wear resistance, as no such data exist. Previous works have analysed the individual wear mechanisms separately, using similar if not identical materials. In this work, the testing method used to analyse the wear was adapted from ISO 8688:1989 to use pre-sintered zirconia and the cutting conditions used in dentistry to machine it. This understanding was developed through a series of tests based on machining operations, to give the best representation of the multiple wear factors that can occur in the machining of pre-sintered zirconia, such as three-body abrasion, material build-up, surface welding, plastic deformation, tool vibration, and thermal cracking. The testing found that carbide grades with low trans-granular rupture toughness failed due to abrasion, while those with high trans-granular rupture toughness failed due to edge chipping caused by build-up or thermal effects. The results gained can assist the development of these tools and the restorative dental process. This work was completed with the aim of assisting in the selection of tool material for future tools, along with providing a deeper understanding of the properties that contribute to resistance against abrasive wear and material build-up.

Keywords: abrasive wear, cemented carbide, pre-sintered zirconia, tool wear

Procedia PDF Downloads 159
3939 An Investigation of the Integration of Synchronous Online Tools into Task-Based Language Teaching: The Example of SpeakApps

Authors: Nouf Aljohani

Abstract:

The research project described in this presentation focuses on designing and evaluating oral tasks related to students’ needs and levels to foster communication and negotiation of meaning for a group of female Saudi university students. The significance of the current research project lies in its contribution to determining the usefulness of synchronous technology-mediated interactive group discussion in improving different speaking strategies. It also explores how to optimize learning outcomes, expand the evaluation of online learning tasks, and engage students’ experience in evaluating synchronous interactive tools and tasks. The researcher used SpeakApps, a synchronous technology that allows students to practice oral interaction outside the classroom. Such a course of action was considered necessary due to low English proficiency among Saudi students. To the author's knowledge, the main factor causing poor speaking skills is that students do not have sufficient time to communicate outside English language classes. Further, speaking and listening course contents are not well designed to match the Saudi learning context. The methodology included designing speaking tasks to match the educational setting; a CALL framework for designing and evaluating tasks; participant involvement in evaluating these tasks in each online session; and an investigation of the factors that led to the successful implementation of Task-Based Language Teaching (TBLT) and the use of SpeakApps. The data were drawn from technology acceptance model surveys, a group interview, teachers’ and students’ weekly reflections, and discourse analysis of students’ interactions.

Keywords: CALL evaluation, synchronous technology, speaking skill, task-based language teaching

Procedia PDF Downloads 310
3938 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights

Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy

Abstract:

The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches, such as public or private law. On the other hand, law is a national phenomenon. The law of one nation and the legal system applied on the territory of another nation may be completely different. People who are experts in a particular field of law in one country may have insufficient expertise in the law of another country. Today, in addition to the local nature of law, international and even supranational legal rules are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools may well produce very meaningful results in terms of human rights. However, the algorithms to be used should not be developed by computer experts alone; they also need the contribution of people who are familiar with the law, values, judicial decisions, and even the social and political culture of the society to which they will provide solutions. Otherwise, even if an algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field within the discipline of law. More AI systems are already being applied in the field of law, with examples such as predicting judicial decisions, text summarization, decision support systems, and classification of documents. Algorithms for legal systems employing AI tools, especially in the field of prediction of judicial decisions and decision support systems, have the capacity to render automatic decisions in place of judges. When the judge is removed from this equation, artificial intelligence-made law, created by an intelligent algorithm on its own, emerges, whether the domain is national or international law. In this work, the aim is to make a general analysis of this new topic. Such an analysis needs both a literature survey and a perspective from computer experts' and lawyers' points of view. In some societies, the use of prediction or decision support systems may be useful to integrate international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and human rights-protective results than written or living law. In non-democratic countries, it may even be thought that direct decisions and artificial intelligence-made law would be more protective than a mere decision "support" system. Since the values of law are directed towards "human happiness or well-being", AI algorithms should always be capable of serving this purpose and be based on the rule of law, the principles of justice and equality, and the protection of human rights.

Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems

Procedia PDF Downloads 73
3937 Cyber Security in Russia: Offense, Defense and Strategy in Cyberspace

Authors: Da Eun Sung

Abstract:

In today’s world, cyber security has become an important international agenda item. As the information age has arrived, the need for cyber defense against cyber attacks is mounting, and the significance of cyber cooperation in the international community is drawing attention. Along the way, international society has agreed that the institutionalization of international norms dealing with cyber space and cyber security is more crucial than ever. Nevertheless, the West, led by the United States of America, and 'the East', composed of Russia and China, have shown conflicting views on forming international norms and principles that would regulate and ward off possible threats in cyber space. Thus, the international community has yet to reach an agreement on cyber security. In other words, the two sides' differing approaches to and understandings of principles, objects, and definitions have prevented such an agreement. Firstly, this dissertation covers Russia’s perception, strategy, and definition of cyber security through an analysis of primary sources. It then delves into the contrasting cyber security strategies of Russia and the US by comparing them. In the conclusion, it seeks possible avenues for cooperation in the field of cyber security. Russia's views, as the main counterpart to the US in this field, are especially worth examining at a time when the efforts of the US-led international community to institutionalize cyber security have met their limits and their legitimacy has been challenged.

Keywords: cyber security, cyber security strategy, international relations in cyberspace, Russia

Procedia PDF Downloads 319
3936 Development of Medical Intelligent Process Model Using Ontology Based Technique

Authors: Emmanuel Chibuogu Asogwa, Tochukwu Sunday Belonwu

Abstract:

An urgent demand for creative solutions has been created by the rapid expansion of medical knowledge, the complexity of patient care, and the requirement for more precise decision-making. The creation of a Medical Intelligent Process Model (MIPM) using ontology-based techniques appears to be a promising way to overcome this obstacle and unleash the full potential of healthcare systems. The development of the MIPM using ontology-based techniques is motivated by the lack of quick access to relevant medical information and of advanced tools for treatment planning and clinical decision-making, which ontology-based techniques can provide. The aim of this work is to develop a structured and knowledge-driven framework that leverages ontology, a formal representation of domain knowledge, to enhance various aspects of healthcare. The Object-Oriented Analysis and Design Methodology (OOADM) was adopted in the design of the system, as we desired to build a usable and evolvable application. For effective implementation of this work, the following materials, methods, and tools were used: the medical dataset for testing our model was obtained from Kaggle, and the ontology-based technique was combined with a confusion matrix, MySQL, Python, Hypertext Markup Language (HTML), Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, Dreamweaver, and Fireworks. According to test results on the new system evaluated with a confusion matrix, both the accuracy and the overall effectiveness of the medical intelligent process improved significantly, by 20% compared to the previous system. The use of the model is therefore recommended for healthcare professionals.
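
As a hedged illustration of the confusion-matrix evaluation mentioned above (the class counts and the helper function below are invented for illustration and are not taken from the study), overall accuracy is simply the share of correctly classified samples, i.e. the diagonal of the matrix divided by the total:

    import numpy as np

    def accuracy_from_confusion(cm: np.ndarray) -> float:
        """Accuracy = correctly classified samples (diagonal) / all samples."""
        return np.trace(cm) / cm.sum()

    # Hypothetical 2x2 confusion matrix (rows: actual class, columns: predicted class)
    cm = np.array([[45, 5],
                   [8, 42]])
    print(f"accuracy = {accuracy_from_confusion(cm):.2%}")  # -> 87.00%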

Keywords: ontology-based, model, database, OOADM, healthcare

Procedia PDF Downloads 78
3935 Impact on Cost of Equity of Accounting and Disclosures

Authors: Abhishek Ranga

Abstract:

The study examined the effect of accounting choice and level of disclosure on the firm’s implied cost of equity in the Indian environment. For the study, accounting choice was classified as aggressive or conservative depending upon the firm’s choice of accounting methods, accounting policies, and accounting estimates. Level of disclosure is the quantum of financial and non-financial information disclosed in the firm’s annual report, essentially in the notes to accounts section, the schedules forming part of the financial statements, and the Management Discussion and Analysis report. Regression models were developed with cost of equity as the dependent variable and accounting choice and level of disclosure as independent variables, along with selected control variables. Cost of equity was measured using the Edwards-Bell-Ohlson (EBO) valuation model, accounting choice was measured using the Modified Jones Model (MJM), and level of disclosure was measured using a disclosure index drawn essentially from Botosan's study. Results indicated a negative association between the implied cost of equity and conservative accounting choice, and also between level of disclosure and cost of equity.
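
For context, a hedged sketch of the modeling chain described above: the residual-income (EBO) relation from which the implied cost of equity is backed out, and a generic form of the regression specification (the variable abbreviations are illustrative and are not the study's exact notation):

    P_t = B_t + \sum_{i=1}^{\infty} \frac{E_t\left[\, x_{t+i} - r_e\, B_{t+i-1} \right]}{(1 + r_e)^{i}}
    \qquad \text{(solved for the implied cost of equity } r_e\text{)}

    r_{e,j} = \beta_0 + \beta_1\,\mathrm{AccChoice}_j + \beta_2\,\mathrm{Disclosure}_j + \gamma^{\top}\mathrm{Controls}_j + \varepsilon_j

where P_t is the market price of equity, B_t the book value, and x_{t+i} the forecast earnings.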

Keywords: aggressive accounting choice, conservative accounting choice, disclosure, implied cost of equity

Procedia PDF Downloads 462
3934 Intercultural Intelligence: How to Turn Cultural Difference into a Key Added Value with Tree Lighting Design Project Examples

Authors: Fanny Soulard

Abstract:

Today's work environment is more multicultural than ever: spatial limits have been blown away, encouraging the mobility of people and ideas all around the globe. Indeed, opportunities to design with culturally diverse team members, clients, or end-users have come within everyone's reach. We enjoy traveling to discover other civilizations, but when it comes to business, we often take for granted that our own work methodology will be generic enough to federate each party and cover the project's needs. This paper aims to explore why, by skipping cultural awareness, we often create misunderstandings, frustration, and even counterproductive design. Tree lighting projects successively developed by a French lighting studio, a Vietnamese lighting studio, and an Australian engineering company will be assessed from their concept stage to completion. All these case studies are based in Vietnam, where the construction market is equally led by local and international consultants. Core criteria such as lighting standard references, service scope, communication tools, internal team organization, delivery package content, key priorities, and client relationships will help to spot and list when and how cultural diversity has impacted design output and effectiveness. On the other hand, we will demonstrate through the same selected projects how intercultural intelligence tools and mindsets can not only respond positively to such situations and avoid major clashes but also turn cultural differences into a key added value that generates significant benefits for individuals, teams, and companies. By understanding the major importance of including a cultural factor within any design, intercultural intelligence will quickly turn out to be a "must-have" skill to be developed and acquired by any designer.

Keywords: intercultural intelligence, lighting design, work methodology, multicultural diversity

Procedia PDF Downloads 95
3933 Achieving Them Both: Business and Wellness Outcomes in Health Organizations – the 'Tip' Laser Intervention

Authors: Shosh Kazaz, Shmuel Banai, Vered Zilberberg

Abstract:

Optimizing high business performance and employees' well-being simultaneously often challenges organizations. The 'TIP' intervention enables achieving both, as the project described here demonstrates. Increasing outcomes and improving performance were the initial motivators for this explorative project, followed by a request from the head of the cardiology department: 'I know we are the best at our clinical practice, but we need to take it further and break our own glass ceiling.' Two guided interventions were conducted in two different units within the department, designed to implement advanced managerial and business-oriented tools, along with 'soft tools' based on coaching psychology and particularly wellness coaching. The department's multi-disciplinary teams were assembled, aiming to manage and lead the process: mapping the patients' flow, creating solutions, implementing, assessing, improving, and assimilating them. Approximately four months later, without additional external resources, meaningful results emerged from the teams in terms of business and performance: shortening the hospitalization length for a given procedure (from 7 to 2.1 days); increasing the availability of the catheterization laboratory by 16% daily, resulting in a profitability rise; and improving patients' journey and experience. A year later, those results are maintained. Furthermore, interviews with the participants revealed positive perceptions regarding the department; a higher sense of joyfulness, connectedness, and belonging and a better department climate were reported. Additionally, participants reported a higher sense of fulfillment, as opposed to their initial skepticism and cynicism about their ability to enhance outcomes without more resources (budget and/or manpower), experiencing a mindset change toward the possibility of leading personal and professional growth processes. These reports were supported by analyzing a set of questionnaires that the participants completed, parallel to a control group of non-participating colleagues. Although the assessment was taken a year after the completion of the project and during the third national COVID-19 quarantine, the results indicated a significant impact on several personal parameters associated with wellness, compared to the control group. The participants were higher in self-efficacy and organizational commitment; men were higher in resilience and optimism, and women were higher in well-being. In conclusion, the relatively short 'TIP' intervention integrates advanced managerial and wellness coaching tools and empowers organizational resources (Team, Individual, and Process), thereby generating multi-impact, measurable results in terms of employees' wellness parameters along with business performance and patient care.

Keywords: coaching, health and wellness, health management, leadership and well-being

Procedia PDF Downloads 183
3932 Barriers to Marital Expectation among Individuals with Hearing Impairment in Oyo State

Authors: Adebomi M. Oyewumi, Sunday Amaize

Abstract:

The study was designed to examine the barriers to marital expectations among unmarried persons with hearing impairment in Oyo State, Nigeria. A descriptive survey research design was adopted. A purposive sampling technique was used to select one hundred participants, made up of forty-four (44) males and fifty-six (56) females, all with varying degrees of hearing impairment. Eight research questions were raised and answered. The instrument used was the Marital Expectations Scale, with a reliability coefficient of 0.86. Data were analyzed using the descriptive statistics tools of frequency count and simple percentage, as well as the inferential statistics tools of t-test and ANOVA. The findings revealed a significant relationship among the main identified barriers (environmental barrier, communication barrier, hearing loss, unemployment, and poor sexuality education) to the marital expectations of unmarried persons with hearing impairment. The joint contribution of the independent variables (identified barriers) to the dependent variable (marital expectations) was significant, F = 5.842, P < 0.05, accounting for about 89% of the variance. The relative contributions of the identified barriers to the marital expectations of unmarried persons with hearing impairment were as follows: environmental barrier (β = 0.808, t = 5.176, P < 0.05), communication barrier (β = 0.533, t = 3.305, P < 0.05), hearing loss (β = 0.550, t = 2.233, P < 0.05), unemployment (β = 0.431, t = 2.102, P < 0.05), and poor sexuality education (β = 0.361, t = 1.985, P < 0.05). The environmental barrier proved to be the most potent contributor to poor marital expectations among unmarried persons with hearing impairment. It is therefore recommended that society dismantle the persistent environmental barrier through positive identification with individuals with hearing impairment. In this connection, members of society should change their negative attitudes and do away with the wrong notions about the marital ability of individuals with hearing impairment.
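
A hedged sketch of the multiple regression implied by the reported coefficients (the variable abbreviations are illustrative and are not the study's own notation):

    \mathrm{ME}_i = \beta_0 + \beta_1\,\mathrm{Env}_i + \beta_2\,\mathrm{Comm}_i + \beta_3\,\mathrm{HearLoss}_i + \beta_4\,\mathrm{Unemp}_i + \beta_5\,\mathrm{SexEd}_i + \varepsilon_i

where ME_i is the marital expectations score of respondent i and the β's correspond to the standardized coefficients reported above.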

Keywords: environmental barrier, hearing impairment, marriage, marital expectations

Procedia PDF Downloads 369
3931 Estimation of Energy Losses of Photovoltaic Systems in France Using Real Monitoring Data

Authors: Mohamed Amhal, Jose Sayritupac

Abstract:

Photovoltaic (PV) systems have emerged as one of the modern renewable energy sources widely used to produce electricity and deliver it to the electrical grid. In parallel, monitoring systems have been deployed as a key element to track energy production and to forecast production for the coming days. The reliability of PV energy production has become a crucial point in the analysis of PV systems. A deeper understanding of each phenomenon that causes a gain or a loss of energy is needed to better design, operate, and maintain PV systems. This work analyzes the current distribution of losses in PV systems, starting from the available solar energy, going through the DC side and the AC side, to the delivery point. Most of the phenomena linked to energy losses and gains are considered and modeled, based on real-time monitoring data and the datasheets of the PV system components. The order of magnitude of each loss is compared to the current literature and to commercial software. To date, the analysis of PV system performance based on a breakdown structure of energy losses and gains is not covered sufficiently in the literature, except in some software packages where the concept is very common. The novelty of the current analysis is the implementation of software tools for energy loss estimation in PV systems based on several energy loss definitions and estimation techniques. The developed tools have been validated and tested on several PV plants in France that have been operating for years. Among the major findings of the current study: first, PV plants in France show very low rates of soiling and aging; second, the distribution of other losses is comparable to the literature; third, all reported losses are correlated with operational and environmental conditions. For future work, an extended analysis of further PV plants in France and abroad will be performed.
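
As a hedged illustration of one standard way to break such losses down from monitoring data (the yield definitions below follow the common IEC 61724 convention; the plant size and energy figures are invented for illustration and are not the monitored plants' data):

    # Minimal sketch of a loss breakdown from monitoring data (illustrative values only).
    G_STC = 1.0            # reference irradiance, kW/m^2
    P_RATED_KWP = 500.0    # rated DC power of the plant, kWp (assumed)

    h_poa_kwh_m2 = 1600.0  # yearly in-plane irradiation, kWh/m^2 (assumed)
    e_dc_kwh = 680_000.0   # yearly DC energy at the inverter input (assumed)
    e_ac_kwh = 655_000.0   # yearly AC energy at the delivery point (assumed)

    y_reference = h_poa_kwh_m2 / G_STC      # reference yield Yr (kWh/kWp)
    y_array = e_dc_kwh / P_RATED_KWP        # array yield Ya (kWh/kWp)
    y_final = e_ac_kwh / P_RATED_KWP        # final yield Yf (kWh/kWp)

    capture_losses = y_reference - y_array  # DC-side (capture) losses Lc
    system_losses = y_array - y_final       # AC-side (system) losses Ls
    performance_ratio = y_final / y_reference

    print(f"Yr={y_reference:.0f}, Ya={y_array:.0f}, Yf={y_final:.0f} kWh/kWp")
    print(f"Lc={capture_losses:.0f}, Ls={system_losses:.0f} kWh/kWp, PR={performance_ratio:.2f}")

Each aggregate loss term can then be broken down further (soiling, aging, inverter efficiency, wiring, etc.) against the component datasheets, as described in the abstract.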

Keywords: energy gains, energy losses, losses distribution, monitoring, photovoltaic, photovoltaic systems

Procedia PDF Downloads 176
3930 Quality Improvement of the Sand Moulding Process in Foundries Using Six Sigma Technique

Authors: Cindy Sithole, Didier Nyembwe, Peter Olubambi

Abstract:

The sand casting process involves pattern making, mould making, metal pouring, and shake-out. Every step in the sand moulding process is critical for the production of good-quality castings. However, the waste generated during the sand moulding operation and the lack of quality are matters that drive performance inefficiencies and a lack of competitiveness in South African foundries. Defects produced in the sand moulding process become visible only in the final product (the casting), which results in increased scrap, reduced sales, and increased costs in the foundry. The purpose of this research is to propose a Six Sigma (DMAIC: Define, Measure, Analyze, Improve, Control) intervention in sand moulding foundries and to reduce the variation caused by deficiencies in the sand moulding process in South African foundries. Its objective is to create sustainability and enhance productivity in the South African foundry industry. Six Sigma is a data-driven approach to process improvement that aims to eliminate variation in business processes using statistical control methods. Six Sigma focuses on business performance improvement through quality initiatives using Ishikawa's seven basic tools of quality. The objectives of Six Sigma are to eliminate factors that adversely affect productivity, profit, and the ability to meet customers’ demands. Six Sigma has become one of the most important tools/techniques for attaining competitive advantage. Competitive advantage for sand casting foundries in South Africa means improved plant maintenance processes, improved product quality, and proper utilization of resources, especially scarce resources. Defects such as sand inclusions, flashes, and sand burn-on were identified, using the Six Sigma technique, as resulting from inefficiencies in the sand moulding process. The causes were found to be wrong mould design due to the pattern used and poor ramming of the moulding sand in the foundry. Six Sigma tools such as the voice of the customer, the fishbone diagram, the voice of the process, and process mapping were used to define the problem in the foundry and to outline the critical-to-quality elements. The SIPOC (Supplier, Input, Process, Output, Customer) diagram was also employed to ensure that the material and process parameters required for quality improvement in the foundry were achieved. The process capability of the sand moulding process was measured to understand current performance and enable improvement. The expected results of this research are reduced sand moulding process variation, increased productivity, and competitive advantage.
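
The process capability measurement mentioned above is commonly summarized through the Cp and Cpk indices; a minimal hedged sketch follows (the specification limits and sample readings are invented for illustration, not foundry data):

    import statistics

    def process_capability(samples, lsl, usl):
        """Return (Cp, Cpk) for a measured characteristic against its spec limits."""
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)              # sample standard deviation
        cp = (usl - lsl) / (6 * sigma)                 # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # actual capability (accounts for centring)
        return cp, cpk

    # Illustrative green-sand compactability readings (%) with assumed spec limits
    readings = [42.1, 43.0, 41.8, 42.5, 43.2, 42.9, 41.5, 42.7]
    cp, cpk = process_capability(readings, lsl=40.0, usl=46.0)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")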

Keywords: defects, foundries, quality improvement, sand moulding, six sigma (DMAIC)

Procedia PDF Downloads 194
3929 Photocaged Carbohydrates: Versatile Tools for Biotechnological Applications

Authors: Claus Bier, Dennis Binder, Alexander Gruenberger, Dagmar Drobietz, Dietrich Kohlheyer, Anita Loeschcke, Karl Erich Jaeger, Thomas Drepper, Joerg Pietruszka

Abstract:

Light-absorbing chromophoric systems are important optogenetic tools for biotechnological and biophysical investigations. Processes such as fluorescence or photolysis can be triggered by light absorption in chromophores and play a central role in the life sciences. Photocaged compounds belong to such chromophoric systems. Their photo-labile protecting groups enable them to release biologically active substances with high temporal and spatial resolution. The properties of photocaged compounds are determined by the characteristics of the caging group as well as those of the linked effector molecule. In our research, we work with different types of photo-labile protecting groups and various effector molecules, giving us potential access to a large library of caged compounds. Depending on the caged effector molecule, a nearly limitless number of biological systems can be addressed. Our main interest focuses on photocaging carbohydrates (e.g., arabinose) and their derivatives as effector molecules. Based on the resulting photocaged compounds, precisely controlled photoinduced gene expression will give us access to studies of numerous biotechnological and synthetic biological applications. It could be shown that the regulation of gene expression via light is possible with photocaged carbohydrates, achieving higher-order control over these processes. With the one-step cleavable photocaged carbohydrate, homogeneous expression was achieved in comparison to free carbohydrates.

Keywords: bacterial gene expression, biotechnology, caged compounds, carbohydrates, optogenetics, photo-removable protecting group

Procedia PDF Downloads 227
3928 Dynamic Degradation Mechanism of SiC VDMOS under Proton Irradiation

Authors: Junhong Feng, Wenyu Lu, Xinhong Cheng, Li Zheng, Yuehui Yu

Abstract:

The effects of proton irradiation on the properties of the gate oxide were evaluated by monitoring the static parameters (such as threshold voltage and on-resistance) and a dynamic parameter (the Miller plateau time) of a 1700 V SiC VDMOS before and after proton irradiation. The incident proton energy was 3 MeV, and the doses were 5 × 10¹² p/cm² and 1 × 10¹³ p/cm², respectively. The results show that the threshold voltage of the MOSFET exhibits a negative drift under proton irradiation: near-interface traps in the gate oxide layer are occupied by holes generated by the ionization effect of irradiation, thus forming more positive charge. The Miller plateau time TMiller is taken as the interval of the Vgs waveform from the moment Vds just starts to rise until it reaches a stable value. The degradation (lengthening) of the Miller plateau time at turn-off confirms that the capacitance Cgd has become larger, reflecting that traps are introduced into the gate oxide layer by the displacement effect caused by proton irradiation and that the interface state deteriorates. As the region most sensitive to irradiation, the gate oxide layer will have its parameters (such as thickness and type) optimized in subsequent studies.
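
For reference, a hedged first-order relation underlying this interpretation (the standard gate-charge approximation, not the paper's own derivation): during the Miller plateau the gate current mainly charges or discharges the gate-drain capacitance, so the plateau duration scales with Cgd.

    t_{\mathrm{Miller}} \approx \frac{Q_{gd}}{I_g} \approx \frac{C_{gd}\,\Delta V_{ds}}{I_g}

Hence, for a fixed gate drive current and drain voltage swing, a longer Miller plateau implies a larger Cgd.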

Keywords: SiC VDMOS, proton radiation, Miller time, gate oxide

Procedia PDF Downloads 90
3927 Charge Transport in Biological Molecules

Authors: E. L. Albuquerque, U. L. Fulco, G. S. Ourique

Abstract:

The focus of this work is the numerical investigation of the charge transport properties of the de novo-designed alpha3 polypeptide, as well as of its variants, all of them probed by genetic engineering. The theoretical framework makes use of a tight-binding model Hamiltonian, together with ab initio calculations within quantum chemistry simulation. The alpha3 polypeptide is a 21-residue peptide with three repeats of the seven-residue amino acid sequence Leu-Glu-Thr-Leu-Ala-Lys-Ala, forming an alpha-helical bundle structure. Its variants are obtained by Ala→Gln substitution at the e (5th) and g (7th) positions, respectively, of the alpha3 polypeptide amino acid sequence. Using transmission electron microscopy and atomic force microscopy, it was observed that the alpha3 polypeptide and one of its variants have the ability to form fibrous assemblies, while the other does not. Our main aim is to investigate whether or not the biased alpha3 polypeptide and its variants can also be identified by quantum charge transport measurements, through current-voltage (IxV) curves, as a pattern to characterize their fibrous assemblies. It was observed that each peptide has a characteristic current pattern that may be distinguished by charge transport measurements, suggesting that this might be a useful tool for the development of biosensors.
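
A hedged sketch of the kind of tight-binding Hamiltonian typically used in such studies (a single-channel, nearest-neighbour form with on-site energies εn per site and hopping amplitudes t between neighbours; the paper's actual parameterization may differ):

    H = \sum_{n} \varepsilon_{n}\, |n\rangle\langle n| + \sum_{n} t_{n,n+1} \left( |n\rangle\langle n+1| + |n+1\rangle\langle n| \right)

where the on-site energies and hopping terms would be fed by the ab initio quantum chemistry calculations mentioned above.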

Keywords: charge transport properties, electronic transmittance, current-voltage characteristics, biological sensor

Procedia PDF Downloads 665
3926 Determination of Material Constants and Zener-Hollomon Parameter of AA2017 Aluminium Alloy under Hot Compression Test

Authors: C. H. Shashikanth, M. J. Davidson, V. Suresh Babu

Abstract:

The formability of metals depends on a number of variables, such as strain, strain rate, and temperature. Though most metals are formable at room temperature, a few are not. To evaluate the workability of such metals at elevated temperatures, thermomechanical experiments should be carried out to determine the suitable forming temperatures and strain rates. Though a number of constitutive relations are available to correlate the material parameters with the corresponding formability at elevated temperatures, the Arrhenius-type constitutive rule has been used in this work. Thus, in the present work, the material constants A (constant), α (stress multiplier), β (constant), and n (stress exponent) of AA 2017 have been found by conducting a series of hot compression tests at temperatures of 400°C, 450°C, 500°C, and 550°C and at strain rates of 0.16, 0.18, and 0.2. True stress (σt), true strain (εt), the deformation activation energy (Q), and the Zener-Hollomon parameter (Z value) were also calculated. The results indicate that the value of ln(Z) decreases as the temperature increases and increases as the strain rate increases.
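
For reference, the standard Arrhenius-type relations behind the constants listed above (the conventional hyperbolic-sine form and the Zener-Hollomon parameter; the paper's fitted values are not reproduced here, and α is conventionally taken as the ratio of the exponential-law constant β to the power-law stress exponent n):

    Z = \dot{\varepsilon}\,\exp\left(\frac{Q}{RT}\right), \qquad
    \dot{\varepsilon} = A\left[\sinh(\alpha\sigma)\right]^{n}\exp\left(-\frac{Q}{RT}\right), \qquad
    \alpha \approx \frac{\beta}{n}

where ε̇ is the strain rate, σ the flow stress, R the gas constant, and T the absolute temperature.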

Keywords: hot compression test, aluminium alloy, flow stress, activation energy

Procedia PDF Downloads 621
3925 Characterization of Cement Mortar Based on Fine Quartz

Authors: K. Arroudj, M. Lanez, M. N. Oudjit

Abstract:

The introduction of siliceous mineral additions in cement production allows, in addition to the ecological and economic gains, an improvement in concrete performance. This improvement is mainly due to the fixing of Portlandite, released during the hydration of cement, by the fine silica, forming denser calcium silicate hydrates and therefore a more compact cementitious matrix. This research is part of the valorization of dune sand (DS) in the cement industry in Algeria. The high silica content of DS motivated us to study its effect, in the ground state, on the properties of mortars in the fresh and hardened states. For this purpose, cement pastes and mortars based on ground dune sand (fine quartz) have been analyzed, with cement replacement levels of 15%, 20%, and 25%. This substitution reduced the heat of hydration and avoids any risk of initial cracking. In addition, the grinding of the dune sand produces thin amorphous populations adsorbed at the surface of the crystalline quartz particles, which gives the ground quartz a pozzolanic character. This character results in an improvement of the mechanical strength of the mortar (66 MPa in the presence of 25% ground quartz).
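
In simplified form, the pozzolanic reaction invoked above (a schematic representation; the actual C-S-H stoichiometry is variable and is not specified by the paper):

    \mathrm{Ca(OH)_2} + \mathrm{SiO_2} + \mathrm{H_2O} \longrightarrow \mathrm{C\text{-}S\text{-}H\ (calcium\ silicate\ hydrate)}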

Keywords: mineralogical structure, pozzolanic reactivity, Quartz, mechanical strength

Procedia PDF Downloads 285
3924 Fast Aerodynamic Evaluation of Transport Aircraft in Early Phases

Authors: Xavier Bertrand, Alexandre Cayrel

Abstract:

The early phase of aircraft development is instrumental, as it really drives the potential of a new concept. Any weakness in the high-level design (wing planform, moveable surfaces layout, etc.) will be extremely difficult and expensive to recover later in the aircraft development process. Aerodynamic evaluation in this very early development phase is driven by two main criteria: a short lead time, to allow quick iterations of the geometrical design, and a high quality of the calculations, to get an accurate and reliable assessment of the current status. These two criteria are usually quite contradictory. A short lead time of a couple of hours end-to-end can be obtained with very simple tools (semi-empirical methods, for instance), although their accuracy is limited, whereas higher-quality calculations require heavier, more complex tools, which obviously need more complex inputs as well, and a significantly longer lead time. At this point, a choice has to be made between accuracy and lead time. A brand new approach has been developed within Airbus, aiming at quickly obtaining high-quality evaluations of the aerodynamics of an aircraft. This methodology is based on the joint use of surrogate modelling and a lifting line code. The surrogate modelling is used to get the wing sections' characteristics (e.g., lift coefficient vs. angle of attack), whatever the airfoil geometry, the status of the moveable surfaces (ailerons/spoilers), or the deployment of the high-lift devices. From these characteristics, the lifting line code is used to get the 3D effects on the wing, whatever the flow conditions (low/high Mach numbers, etc.). This methodology has been applied successfully to a medium-range aircraft concept.
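
As a hedged illustration of the second ingredient only (a classical Prandtl lifting-line solve, not Airbus's in-house tool), where the sectional lift-curve slope and zero-lift angle would in practice be supplied by the surrogate model; here they are plain placeholders and the wing is rectangular and untwisted for brevity:

    import numpy as np

    def lifting_line_CL(span, chord, alpha_deg, a0=2 * np.pi, alpha_l0_deg=0.0, n_modes=20):
        """Classical Prandtl lifting-line for a rectangular, untwisted wing.

        In the methodology described above, the sectional lift-curve slope `a0`
        and zero-lift angle `alpha_l0_deg` would come from the surrogate model of
        the airfoil / moveable-surface state; the constants here are placeholders.
        Returns the wing lift coefficient CL = pi * AR * A1.
        """
        theta = np.arange(1, n_modes + 1) * np.pi / (n_modes + 1)   # collocation angles
        n = np.arange(1, n_modes + 1)
        alpha_eff = np.deg2rad(alpha_deg - alpha_l0_deg)

        sin_nt = np.sin(np.outer(theta, n))                          # sin(n*theta) matrix
        # Monoplane equation: [4b/(a0 c)] sin(n*theta) + n sin(n*theta)/sin(theta) = alpha_eff
        lhs = (4 * span / (a0 * chord)) * sin_nt + n * sin_nt / np.sin(theta)[:, None]
        A = np.linalg.solve(lhs, np.full(n_modes, alpha_eff))        # Fourier coefficients A_n

        aspect_ratio = span / chord                                  # rectangular wing: AR = b/c
        return np.pi * aspect_ratio * A[0]

    # Example: 10 m span, 1 m chord (AR = 10), 5 deg angle of attack
    print(f"CL ≈ {lifting_line_CL(span=10.0, chord=1.0, alpha_deg=5.0):.3f}")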

Keywords: aerodynamics, lifting line, surrogate model, CFD

Procedia PDF Downloads 359
3923 Performance Evaluation of Refinement Method for Wideband Two-Beams Formation

Authors: C. Bunsanit

Abstract:

This paper presents a refinement method for two-beam formation with a wideband smart antenna. The refinement method for the weighting coefficients is based on fully spatial signal processing, taking the Inverse Discrete Fourier Transform (IDFT), and its simulation results are presented using MATLAB. The radiation pattern is created by multiplying the incoming signal by real weights and then summing the products. These real weighting coefficients are computed by the IDFT method; however, the range of weight values is relatively wide. Therefore, to reduce this range, the refinement method is used. The radiation pattern is controlled by five input parameters: the maximum weighting coefficient, the wideband signal, the direction of the main beam, the beamwidth, and the maximum minor-lobe level. A comparison of the simulation results obtained with the refinement method and with the IDFT alone shows that the refinement method works well for wideband two-beam formation.
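
A hedged sketch of the basic IDFT weighting step only (a Woodward-Lawson-style synthesis for a uniform linear array; the element count, spacing, and the two illustrative beam directions below are assumptions, and the paper's refinement step that constrains the weight range is not reproduced):

    import numpy as np

    N = 16                          # number of array elements (assumed)
    d_over_lambda = 0.5             # half-wavelength element spacing (assumed)

    # Sample the desired pattern at the N directions u_m = m / (N * d/lambda)
    m = np.arange(N)
    u_m = m / (N * d_over_lambda)   # u = sin(theta); AF is periodic in u with period lambda/d = 2
    desired = np.zeros(N)
    desired[[2, 6]] = 1.0           # two beams, at u = 0.25 and u = 0.75 (illustrative)

    # Weighting coefficients via the IDFT of the sampled desired pattern
    w = np.fft.ifft(desired)

    # Reconstruct the array factor AF(u) = sum_n w_n * exp(-j*2*pi*n*(d/lambda)*u)
    u = np.linspace(-1, 1, 721)
    phase = np.exp(-2j * np.pi * d_over_lambda * np.outer(np.arange(N), u))
    af = np.abs(w @ phase)
    for target in (0.25, 0.75, 0.0):
        print(f"|AF| at u = {target:+.2f}: {af[np.argmin(np.abs(u - target))]:.3f}")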

Keywords: fully spatial signal processing, beam forming, refinement method, smart antenna, weighting coefficient, wideband

Procedia PDF Downloads 226
3922 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing at larger volumes. As the components get integrated, the devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom-built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable once the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each with several unique good features, but no single tool or framework can satisfy all of the testing needs of embedded systems; hence the need for an extensible framework integrating a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods include developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges around environment setup for testing, scalability, and maintenance. A desirable strategy is certainly one that is focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during all phases of testing and across families of products. To overcome the stated challenges of the conventional method and to offer the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, has been designed, which can be deployed in embedded-system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing of different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time-to-market: it accelerates board bring-up time with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers); (2) reusability: framework components isolated from the platform-specific hardware initialization and configuration make the adaptation of test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time to market to a large extent.

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 145
3921 Exploring the Impact of ChatGPT on the English Writing Skills of a Group of International EFL Uzbek Students: A Qualitative Case Study Conducted at a Private University College in Malaysia

Authors: Uranus Saadat

Abstract:

ChatGPT, as one of the well-known artificial intelligence (AI) tools, has recently been integrated into English language education and has had several impacts on learners. Accordingly, concerns regarding the overuse of this tool among EFL/ESL learners are rising, as it could lead to several disadvantages in the development of their writing skills. The use of ChatGPT in facilitating writing skills is a novel concept that demands further study across different contexts and learners. In this study, a qualitative case study approach is applied to investigate the impact of ChatGPT on the writing skills of a group of EFL bachelor’s students from Uzbekistan studying Teaching English as a Second Language (TESL) at a private university in Malaysia. The data were collected through the triangulation of document analysis, semi-structured interviews, classroom observations, and focus group discussions, and subsequently analyzed using thematic analysis. Some of the emerging themes indicated that ChatGPT is helpful in engaging students by reducing their anxiety in class and providing them with constructive feedback and support. Conversely, other themes revealed excessive reliance on ChatGPT, resulting in a decrease in students’ creativity and critical thinking skills, memory span, and tolerance for ambiguity. The study suggests a number of strategies to alleviate its negative impacts, such as peer review activities, workshops familiarizing students with AI, and the gradual withdrawal of AI support activities. This study emphasizes the need for the cautious integration of AI into English language education to cultivate independent learners with higher-order thinking skills.

Keywords: ChatGPT, EFL/ESL learners, English writing skills, artificial intelligence tools, critical thinking skills

Procedia PDF Downloads 20
3920 A Study of The STEAM Toy Pedagogy Plan Evaluation for Elementary School

Authors: Wen-Te Chang, Yun-Hsin Pai

Abstract:

Purpose: Based on the interdisciplinary curriculum of the lower grades of elementary school and the integration of the STEAM concept, related wooden toys and pedagogy plans were developed and evaluated. The research goal was to benefit elementary school education. Design/methodology/approach: The subjects were teachers from two primary schools and students from the design departments of universities in Taipei. A total of 103 participants (male: 34, female: 69) were invited to take part in the research. The research tools were the "STEAM toy design" and the "questionnaire of the STEAM toy pedagogy plan." The STEAM toy pedagogy plans were evaluated after the activity "The interdisciplinary literacy discipline guiding study program--STEAM wooden workshop." Findings/results: (1) Factor analysis of the questionnaire indicated that the percentages for the major factors were cognition 68.61%, affection 80.18%, and technique 80.14%, with α = .936; the assessment tools were thus shown to be valid for STEAM pedagogy plan evaluation. (2) The analysis of the questionnaire investigation confirmed that the main effect of the teaching factors was not significant (affection = technique = cognition); however, the interaction between the STEAM factors was significant (F(8, 1164) = 5.51, p < .01). (3) The main effect of the six pedagogy plans was significant (climbing toy > bird toy = gondola toy > frog castanets > train toy > balancing toy), and the interaction effect between the STEAM factors also reached a significant level (F(8, 1164) = 5.51, p < .01), especially on the artistic (A/Art) aspect. Originality/value: The main achievements of the research: (1) a pedagogy plan evaluation was successfully developed; (2) the interaction effect between the STEAM and teaching factors reached a significant level; (3) the interaction effect between the STEAM factors and the pedagogy plans also reached a significant level.

Keywords: STEAM, toy design, pedagogy plans, evaluation

Procedia PDF Downloads 283
3919 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory

Authors: Roy. H. A. Lindelauf

Abstract:

Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either (a) describes the CKC with respect to one or more specific cyberattacks or (b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied abundantly in the operations research and game theory domains; think, for instance, of resource-limited interdiction actions that maximally delay the completion time of a weapons project. This presentation investigates both cooperative and non-cooperative game-theoretic covert project models and elucidates their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, and there are players, each of whom is capable of executing a subset of the tasks. Additionally, task interdependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of his scarce resources (what, where, and how to monitor) against an attacker scheduling a CKC. This study presents and compares several cooperative game-theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
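
As a hedged, toy-scale illustration of the cooperative-game machinery mentioned above (the characteristic function below is invented for illustration and is not the presentation's actual model), the Shapley value is one such solution concept: it assigns each monitored agent its average marginal contribution, which can then weight how monitoring resources are spread over agents:

    from itertools import permutations

    def shapley_values(players, value):
        """Average marginal contribution of each player over all join orders."""
        phi = {p: 0.0 for p in players}
        orders = list(permutations(players))
        for order in orders:
            coalition = set()
            for p in order:
                phi[p] += value(coalition | {p}) - value(coalition)
                coalition.add(p)
        return {p: phi[p] / len(orders) for p in players}

    # Toy "glove-like" covert-project game (illustrative): the attack step completes
    # only when both a 'developer' task and at least one 'operator' task are covered.
    def v(coalition):
        return 1.0 if "dev" in coalition and ({"op1", "op2"} & coalition) else 0.0

    print(shapley_values(["dev", "op1", "op2"], v))
    # -> 'dev' receives the largest share; monitoring effort would be weighted accordingly.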

Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques

Procedia PDF Downloads 140
3918 Climate Change Vulnerability and Capacity Assessment in Coastal Areas of Sindh Pakistan and Its Impact on Water Resources

Authors: Falak Nawaz

Abstract:

The Climate Change Vulnerability and Capacity Assessment carried out in the coastal regions of the Thatta and Malir districts underscores the potential risks and challenges that climate change poses to water resources. The study was conducted by the author using participatory rural appraisal (PRA) tools, with a particular focus on focus group discussions, direct observations, key informant interviews, and other PRA tools. The assessment delves into the specific impacts of climate change along the coastal belt, concentrating on aspects such as rising sea levels, depletion of freshwater, alterations in precipitation patterns, fluctuations in water table levels, and the intrusion of saltwater into rivers. These factors have significant consequences for the availability and quality of water resources in coastal areas, manifesting in frequent migration and alterations in agriculture-based livelihood practices. Furthermore, the assessment evaluates the adaptive capacity of communities and organizations in these coastal regions to effectively confront and alleviate the effects of climate change on water resources. It considers various measures, including infrastructure enhancements, water management practices, adjustments in agricultural approaches, and disaster preparedness, aiming to bolster adaptive capacity. The study's findings emphasize the necessity of prompt action to address the identified vulnerabilities and fortify the adaptive capacities of Sindh's coastal areas. This calls for comprehensive strategies and policies that promote sustainable water resource management, integrate climate change considerations, and provide essential resources and support to vulnerable communities.

Keywords: climate, climate change adaptation, disaster resilience, vulnerability, capacity, assessment

Procedia PDF Downloads 59
3917 Complementing Assessment Processes with Standardized Tests: A Work in Progress

Authors: Amparo Camacho

Abstract:

ABET-accredited programs must assess the development of student learning outcomes (SOs) in engineering programs. Different institutions implement different strategies for this assessment, usually designed "in house." This paper presents a proposal for including standardized tests to complement the ABET assessment model in an engineering college made up of six distinct engineering programs. The engineering college formulated a model of quality assurance in education, to be implemented throughout the six engineering programs, to regularly assess and evaluate the achievement of SOs in each program offered. The model uses diverse techniques and sources of data to assess student performance and to implement improvement actions based on the results of this assessment. The model is called the "Assessment Process Model," and it includes SOs A through K, as defined by ABET. SOs can be divided into two categories: "hard skills" and "professional skills" (soft skills). The first includes abilities such as applying knowledge of mathematics, science, and engineering and designing and conducting experiments, as well as analyzing and interpreting data. The second category, "professional skills," includes communicating effectively and understanding professional and ethical responsibility. Within the Assessment Process Model, various tools were used to assess SOs related to both "hard" and "soft" skills. The assessment tools designed included rubrics, surveys, questionnaires, and portfolios. In addition to these instruments, the engineering college decided to use tools that systematically gather consistent quantitative data. For this reason, an in-house exam was designed and implemented, based on the curriculum of each program. Even though this exam was administered during various academic periods, it is not currently considered standardized. In 2017, the engineering college included three standardized tests: one to assess mathematical and scientific reasoning and two more to assess reading and writing abilities. With these exams, the college hopes to obtain complementary information that can help better measure the development of both the hard and the soft skills of students in the different engineering programs. In the first semester of 2017, the three exams were given to three sample groups of students from the six engineering programs. Students in the sample groups were from the first, fifth, or tenth semester cohorts. At the time of submission of this paper, the engineering college has descriptive statistical data and is working with statisticians on a more in-depth and detailed analysis of the sample students' achievement on the three exams. The overall objective of including standardized exams in the assessment model is to identify more precisely the least developed SOs in order to define and implement the educational strategies necessary for students to achieve them in each engineering program.

Keywords: assessment, hard skills, soft skills, standardized tests

Procedia PDF Downloads 284
3916 Perception of Health Care Providers: A Need to Introduce Screening of Maternal Mental Health at Primary Health Care in Nepal

Authors: Manisha Singh, Padam Simkhada

Abstract:

Background: Although a mental health policy has been in place in Nepal since 1997, the implementation of the policy framework is yet to happen. The fact that mental health services are largely concentrated in urban areas and focused on treatment only gives a clear picture of the scarcity of mental health services in the country. Evidence from around the world, along with WHO’s (World Health Organization) Mental Health Gap Action Program (mhGAP), suggests that effective mental health services can be provided from Primary Health Care (PHC) centers through community-based programs without having to place a specialized health worker. However, the country is still facing the same challenges to date, with very few psychiatrists and psychologists, who are largely based in cities. Objectives: The main objectives of this study are (a) to understand the perception of health workers at the PHC level of maternal mental health, and (b) to assess the availability of mental health services at the PHC level to address maternal mental health. Methods: This study used a qualitative approach in which in-depth interviews were conducted with health workers at the primary level. "Mayadevi" rural municipality in Rupandehi District, which comprises 13 small villages, was chosen as the study site. A total of 8 health institutions, which covered all 13 sites, were included; at each, either the health post in-charge or the health worker working in maternal and child health care was interviewed. All the health posts in the study area were included in the study. The interviews were conducted in Nepali; later, they were translated into English, transcribed, and triangulated. NVivo was used for the analysis. Results: The findings show that most of the health workers understood what maternal mental health was and deemed it a public health issue. They could explain the symptoms and knew what medication to prescribe if need be. However, the majority of them failed to name the screening tools in place for maternal mental health; moreover, they had not even seen one. None of the health care centers had any provision for screening mental health status, although one of the centers prescribed medication when patients displayed symptoms of depression. The health workers believed there was a significant number of hidden cases in the community due to the stigma around mental health, and being a woman with a mental health problem makes the situation even more difficult. Nonetheless, the health workers understood the importance of having screening tools and acknowledged the need for training and support in order to provide the services from PHC. Conclusion: Community health workers can identify cases with mental health problems and prevent them from deteriorating further, but there is a need for robust training and support to build the capacity of the health workers. The use of mental health screening tools needs to be encouraged at the PHC level. Furthermore, community-based, culture-sensitive programs need to be initiated and implemented to mitigate stigma-related issues around mental health.

Keywords: maternal mental health, health care providers, screening, Nepal

Procedia PDF Downloads 127
3915 Meanings and Concepts of Standardization in Systems Medicine

Authors: Imme Petersen, Wiebke Sick, Regine Kollek

Abstract:

In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expression, metabolic pathways, and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the emerging problems of systematizing, standardizing, and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and integrated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate in science and technology studies (STS) on standardization, which rather emphasizes the dynamics, contexts, and negotiations of standard operating procedures. Based on empirical work on research consortia in Germany that explore the molecular profiles of diseases to establish systems medical approaches in the clinic, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and which consequences for knowledge production (e.g., modeling) arise from them. Hence, different concepts and meanings of standardization are explored to gain deeper insight into standard operating procedures, not only in systems medicine but also beyond.

Keywords: data, science and technology studies (STS), standardization, systems medicine

Procedia PDF Downloads 341
3914 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0

Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini

Abstract:

Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which creates challenges around data management and governance; integrating data from multiple systems and technologies is a further challenge. Despite these pains, companies are still pursuing digitalization because, by embracing advanced technologies, they can improve efficiency, quality, decision-making, and customer experience while also creating new business models and revenue streams. This paper focuses on the issue that data is stored in data silos with different schemas and structures. Conventional approaches to addressing this issue involve data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily focus on the syntax and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. In this work, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models, an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates the Asset Administration Shell technology to model and map the company’s data and utilizes a knowledge graph for data storage and exploration.
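To make the idea of semantic lifting more concrete, the following minimal sketch (not taken from the paper) uses the Python rdflib library to attach a dictionary-style semantic identifier to a raw data field and store the mapping as triples in a small knowledge graph. The namespaces, property names, and the ECLASS-style identifier are illustrative assumptions, not the authors' actual AAS model.

```python
# Minimal sketch of "semantic lifting": linking a raw column name from a data
# silo to a standardized, dictionary-style semantic identifier and storing the
# mapping as triples in a small knowledge graph. Namespaces, property names,
# and the ECLASS-style identifier below are illustrative assumptions only.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/company/")   # hypothetical company namespace
AAS = Namespace("http://example.org/aas/")      # hypothetical AAS-like vocabulary

g = Graph()
g.bind("ex", EX)
g.bind("aas", AAS)

# A raw field from a silo, e.g. a column "motor_temp" in a machine database
prop = EX["motor_temp"]
g.add((prop, RDF.type, AAS["Property"]))
g.add((prop, AAS["idShort"], Literal("motor_temp")))
g.add((prop, AAS["unit"], Literal("degC")))

# Semantic lifting: attach a dictionary identifier (an ECLASS-style code,
# made up here) so other systems can interpret the field independently of
# the local column name.
g.add((prop, AAS["semanticId"], Literal("0173-1#02-AAB123#001")))

print(g.serialize(format="turtle"))
```

Once fields from all silos are mapped in this way, queries over the graph can retrieve every field that shares a semantic identifier, regardless of its local name or source system.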

Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling

Procedia PDF Downloads 94
3913 Psychological Distress Screening in Patients with Esophageal Cancer after Esophagectomy: A Scoping Review

Authors: Erietta-Christina Arnaoutaki, Stelios-Elion Bousi, Marinos Zachiotis, Simoni Zarkada, Alexandra Chrysagi, Mamdouh Fahad Alenazi, Dimitri Aristotle Raptis

Abstract:

Objective: This review aimed to evaluate the mental health status of patients with esophageal cancer following surgical treatment, as well as the role of psychological distress screening tests in this patient population. Methods: Studies reporting psychometric screening tools used in esophageal cancer patients after esophagectomy, published before January 2024 in the PubMed, Scopus, and CENTRAL databases, were searched and analyzed. Results: Six non-randomized controlled trials, involving 1059 patients undergoing esophagectomy for esophageal cancer, were selected for inclusion in this scoping review. Among the included studies, five employed the Hospital Anxiety and Depression Scale (HADS) for anxiety and/or depression screening, while one used the MD Anderson Symptom Inventory for gastrointestinal cancer (MDASI-GI) for sadness screening. A range of time points was used to evaluate these patients: 102 patients were evaluated at 1 month, 230 at 3 months, 218 at 6 months, 653 at 12 months, and 154 at 24 months postoperatively. Analysis of data pooled from three studies employing the HADS revealed a prevalence of 19.45% for anxiety and 17.92% for depression at the 12-month follow-up, and mean scores of 3.91 (3.12) and 3.56 (3.12) for the HADS anxiety (HADS-A) and depression (HADS-D) subscales, respectively, at any time postoperatively. Conclusion: The findings highlight a neglected concern regarding the mental health of esophageal cancer survivors following surgical treatment. The use of psychometric screening tools is essential to address psychological distress and improve the quality of life of these patients.
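As a rough illustration of how a pooled prevalence of this kind can be computed, the sketch below performs a simple sample-size-weighted pooling across studies. The per-study numbers are hypothetical placeholders for illustration only; they are not the data reported in this review, and the actual pooling method used by the authors may differ.

```python
# Sample-size-weighted pooled prevalence across studies.
# The per-study counts below are hypothetical placeholders, not the
# data reported in the review.
studies = [
    {"n": 250, "cases": 45},   # hypothetical study 1
    {"n": 180, "cases": 38},   # hypothetical study 2
    {"n": 120, "cases": 21},   # hypothetical study 3
]

total_n = sum(s["n"] for s in studies)
total_cases = sum(s["cases"] for s in studies)
pooled_prevalence = 100 * total_cases / total_n
print(f"Pooled prevalence: {pooled_prevalence:.2f}% ({total_cases}/{total_n} patients)")
```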

Keywords: esophageal cancer, esophagectomy, psychological distress, anxiety, depression, psychometric tests, HADS, MDASI-GI

Procedia PDF Downloads 17
3912 An Exploratory Study Applied to Search Relationship between Humans and Universe

Authors: Mohamed Hashelaf, Ahmed Al-Osdody

Abstract:

In this paper, we focused our efforts on one of the most elusive subjects in astrophysics: the formation and evolution of the universe up to the arrival of humans. Through an in-depth exploration of the origins of the universe, an understanding of what has happened from the Big Bang until now, and an examination of the history of creation, we can address questions about the future of life, the possibility of its existence elsewhere in the universe, how we came to be, what our role in the circle of life is, and what the future of our development will be. We followed systematic steps that allowed us, first and foremost, to identify the cause of the Big Bang itself, which produced a vast cloud of cosmic dust. Then, after a period of expansion and cooling of the universe, the initial gas molecules of the cosmic cloud began to condense, forming very dense gravitational fields that, after millions of years, led to the formation of stars, galaxies, and eventually the Earth and the other planets. Finally, it became clear to us that after the Earth formed, the existence of liquid water made it possible for life to emerge, starting from bacteria all the way to the appearance of the humans we know today. But it does not stop here: if we look at and contemplate ourselves as humans, we will understand that the universe is inside us, and that is what makes us exceptional. All of this means that just as life was created on Earth, it could have formed on other planets as well. It also means that we are the universe’s key to understanding itself.

Keywords: Big Bang, cosmic dust, primary elements, universe

Procedia PDF Downloads 134
3911 Surface Activation of Carbon Nanotubes Generating a Chemical Interaction in Epoxy Nanocomposite

Authors: Mohamed Eldessouki, Ebraheem Shady, Yasser Gowayed

Abstract:

Carbon nanotubes (CNTs) are known for their high elastic properties and high surface area, which make them good candidates for reinforcing polymeric matrices. In composite materials, however, CNTs lack chemical bonding with the surrounding matrix, which limits stress transfer between the components. In this work, a chemical treatment for activating the surface of multi-walled carbon nanotubes (MWCNTs) was applied, and the effect of this functionalization on the elastic properties of the epoxy nanocomposites was studied. Functional amino groups were added to the surface of the CNTs; their content was evaluated to be about 34% of the total weight of the CNTs. The elastic modulus was found to increase by about 40% over that of the neat epoxy resin at a CNT weight fraction of 0.5%. The elastic modulus was found to decrease beyond a certain CNT concentration, which was found to be 1 wt%. Scanning electron micrographs showed the effect of the CNTs on crack propagation through the sample, with stress-concentration spots forming in the nanocomposite samples.
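For context on how filler fraction translates into stiffness gains, the sketch below evaluates the generic Halpin-Tsai micromechanics model for a CNT/epoxy system. This model and the material parameters (epoxy and CNT moduli, aspect ratio, volume fractions) are illustrative assumptions, not the model or data used in this study.

```python
# Generic Halpin-Tsai estimate of composite modulus vs. CNT volume fraction.
# All material parameters are assumed, typical-literature values chosen for
# illustration; this is not the model used in the paper.
def halpin_tsai(E_m, E_f, v_f, zeta):
    """Composite modulus for matrix E_m, filler E_f, filler volume fraction
    v_f, and shape parameter zeta (roughly 2 x filler aspect ratio)."""
    ratio = E_f / E_m
    eta = (ratio - 1.0) / (ratio + zeta)
    return E_m * (1.0 + zeta * eta * v_f) / (1.0 - eta * v_f)

E_matrix = 3.0      # GPa, typical neat epoxy (assumed)
E_cnt = 1000.0      # GPa, typical MWCNT axial modulus (assumed)
zeta = 2 * 1000     # assumed aspect ratio of ~1000

for v_f in (0.001, 0.0025, 0.005):   # illustrative volume fractions
    E_c = halpin_tsai(E_matrix, E_cnt, v_f, zeta)
    gain = 100 * (E_c / E_matrix - 1)
    print(f"v_f = {v_f:.4f}: E_c = {E_c:.2f} GPa ({gain:.0f}% over neat epoxy)")
```

Idealized predictions of this kind assume perfect dispersion and interfacial bonding and therefore typically overestimate experimentally measured gains, which is consistent with the emphasis placed here on surface functionalization to improve stress transfer.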

Keywords: carbon nanotubes functionalization, crack propagation, elastic modulus, epoxy nanocomposites

Procedia PDF Downloads 405