6502 Breathing New Life into Old Media
Authors: Dennis Schmickle
Abstract:
Introductory statement: Augmented reality (AR) can be used to breathe life into traditional graphic design media, such as posters, book covers, and album art. AR superimposes a unique image or video on a user’s view of the real world, making it more immersive and realistic than traditional 2D media. This study developed a series of projects that use both traditional and AR media to teach the fundamental principles of graphic design. The results suggest that AR can be an effective tool for teaching graphic design. Abstract: Traditional graphic design media, such as posters, book covers, and album art, could be considered “old media.” However, augmented reality (AR) can breathe life into these formats by making them more interactive and engaging for students and audiences alike. AR is a technology that superimposes a computer-generated image on a user’s view of the real world, allowing users to interact with digital content in a way that is more immersive and interactive than traditional 2D media. AR is becoming increasingly popular as more people gain access to smartphones and other devices that can support AR experiences. This study comprises a series of projects that use both traditional and AR media to teach the fundamental principles of graphic design. In these projects, students learn to create traditional design objects, such as posters, book covers, and album art. They are also required to create an animated version of their design and to use AR software to build an AR experience with which viewers can interact. The results of this study suggest that AR can be an effective and exciting tool for teaching graphic design. The students who participated in the study were able to learn the fundamental principles of graphic design, and they also developed the skills needed to create effective AR content. This study has implications for the future of graphic design education. As AR becomes more popular, it is likely to become an increasingly important tool for teaching graphic design.
Keywords: graphic design, augmented reality, print media, new media, AR, old media
Procedia PDF Downloads 68
6501 Algebraic Coupled Level Set-Volume of Fluid Method with Capillary Pressure Treatment for Surface Tension Dominant Two-Phase Flows
Authors: Majid Haghshenas, James Wilson, Ranganathan Kumar
Abstract:
In this study, an Algebraic Coupled Level Set-Volume of Fluid (A-CLSVOF) method with capillary pressure treatment is proposed for the modeling of two-phase capillary flows. The Volume of Fluid (VOF) method is one-way coupled with the Level Set (LS) function in order to improve the accuracy of the interface curvature calculation and the resulting surface tension force. The capillary pressure is determined and treated independently of the hydrodynamic pressure in the momentum balance in order to maintain consistency between cell-centered and interpolated values, resulting in a reduction in parasitic currents. In this method, both the VOF and LS functions are transported, and the new volume fraction determines the interface seed position used to reinitialize the LS field. The Hamilton-Godunov function is used with a second-order (in space and time) discretization scheme to produce a signed distance function. The performance of the methodology has been tested against common test cases in order to assess the reduction in non-physical velocities and the improvement in the interfacial pressure jump. The cases of a static drop, the non-linear Rayleigh-Taylor instability, and finally a droplet’s impact on a liquid pool were simulated, and the present method was compared with other well-known methods in terms of parasitic current reduction, interface location evolution, and overall agreement with experimental results.
Keywords: two-phase flow, capillary flow, surface tension force, coupled LS with VOF
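The curvature that drives the surface tension force in level-set methods is kappa = div(grad(phi)/|grad(phi)|). As a minimal sketch of that quantity (a standard central-difference evaluation on a 2D grid, not the paper's Hamilton-Godunov discretization), one can write:

```python
import numpy as np

def curvature(phi, h):
    """Interface curvature kappa = div(grad(phi)/|grad(phi)|) of a 2D
    level-set field phi on a grid with spacing h, via the standard
    central-difference formula (illustrative, not the paper's scheme)."""
    px, py = np.gradient(phi, h)                 # first derivatives
    pxx = np.gradient(px, h, axis=0)             # second derivatives
    pyy = np.gradient(py, h, axis=1)
    pxy = np.gradient(px, h, axis=1)
    denom = (px**2 + py**2) ** 1.5 + 1e-12       # avoid division by zero
    return (pxx * py**2 - 2.0 * px * py * pxy + pyy * px**2) / denom
```

For a signed-distance field of a circle of radius R, the result on the interface approaches 1/R, which is a quick sanity check for any implementation.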
Procedia PDF Downloads 358
6500 Secondary True to Life Polyethylene Terephthalate Nanoplastics: Obtention, Characterization, and Hazard Evaluation
Authors: Aliro Villacorta, Laura Rubio, Mohamed Alaraby, Montserrat López Mesas, Victor Fuentes-Cebrian, Oscar H. Moriones, Ricard Marcos, Alba Hernández
Abstract:
Micro- and nanoplastics (MNPLs) are emergent environmental pollutants, and information on their potential risks to human health is urgently required. One of the problems in evaluating their undesirable effects is the lack of real samples matching those resulting from the environmental degradation of plastic waste. To that end, we propose an easy method to obtain polyethylene terephthalate nanoplastics from plastic water bottles (PET-NPLs) that is, in principle, applicable to any other source of plastic goods. Extensive characterization indicates that the proposed process produces uniform samples of PET-NPLs of around 100 nm, as determined using a multi-angle and dynamic light scattering methodology. An important point to highlight is that, to avoid the metal contamination resulting from methods that use metal blades or burrs for milling, trituration, or sanding, we propose using diamond burrs to produce metal-free samples. To establish the toxicological profile of the produced PET-NPLs, we evaluated their ability to be internalized by cells, their cytotoxicity, and their ability to induce oxidative stress and DNA damage. In this preliminary approach, we detected cellular uptake, but without the induction of significant biological effects: no relevant increases in toxicity, reactive oxygen species (ROS) induction, or DNA damage (as detected with the comet assay) were observed. The use of real samples, as produced in this study, will generate relevant data for the discussion of the potential health risks associated with MNPL exposure.
Keywords: nanoplastics, polyethylene terephthalate, physicochemical characterization, cell uptake, cytotoxicity
Procedia PDF Downloads 97
6499 A Numerical Study on Micromechanical Aspects in Short Fiber Composites
Authors: I. Ioannou, I. M. Gitman
Abstract:
This study focused on the contribution of micromechanical parameters to the macro-mechanical response of short fiber composites, namely a polypropylene matrix reinforced by glass fibers. Within the framework of this paper, attention has been given to glass fiber length as a micromechanical parameter that influences the overall macroscopic behavior of the material. Three-dimensional numerical models were developed and analyzed using the concept of a Representative Volume Element (RVE), and the results of the RVE-based approach were compared with the analytical Halpin-Tsai model.
Keywords: effective properties, homogenization, representative volume element, short fiber reinforced composites
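The Halpin-Tsai comparison mentioned above rests on a well-known closed form for the longitudinal modulus of an aligned short-fiber composite. A minimal sketch follows; the glass/polypropylene moduli and fiber geometry below are illustrative assumptions, not values taken from the study:

```python
def halpin_tsai(E_f, E_m, V_f, aspect_ratio):
    """Longitudinal effective modulus by the Halpin-Tsai equations.
    zeta = 2*(l/d) is the usual shape factor for the longitudinal
    direction; E_f, E_m are fiber and matrix moduli, V_f the fiber
    volume fraction."""
    zeta = 2.0 * aspect_ratio
    eta = (E_f / E_m - 1.0) / (E_f / E_m + zeta)
    return E_m * (1.0 + zeta * eta * V_f) / (1.0 - eta * V_f)

# Assumed example values: E-glass fiber ~72 GPa, PP matrix ~1.5 GPa,
# 20% fiber volume fraction, fiber aspect ratio l/d = 20.
E_eff = halpin_tsai(72.0, 1.5, 0.2, 20.0)
```

As the aspect ratio grows, the prediction approaches the rule-of-mixtures (Voigt) bound, which is a useful check when comparing against RVE results.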
Procedia PDF Downloads 268
6498 Process of Analysis, Evaluation and Verification of the 'Real' Redevelopment of the Public Open Space at the Neighborhood’s Stairs: Case Study of Serres, Greece
Authors: Ioanna Skoufali
Abstract:
The present study addresses adaptation to climate change, closely related to the urban heat island (UHI) phenomenon. This issue is widespread and common to many urban realities, but particularly to Mediterranean cities, which are characterized by a dense urban fabric. This work on the redevelopment of open space focuses on mitigation techniques aimed at solving local problems, such as microclimatic parameters and summer thermal comfort conditions, related to urban morphology. The quantitative analysis, evaluation, and verification survey involves a methodological elaboration applied to a real case study in Serres, with the experimental support of the ENVI-met Pro V4.1 and BioMet software, developed: i) in two phases, covering the ante-operam (phase a1 # 2013) and post-operam (phase a2 # 2016) states; and ii) in scenario A (+25% green # 2017). The first study identifies the main intervention strategies, namely the application of cool pavements, the increase of green surfaces, the creation of water surfaces, and external fans; it also meets the minimum results required by the National Program 'Bioclimatic improvement project for public open space', EPPERAA (ESPA 2007-2013), for the four environmental parameters below: TAir = 1.5 °C, TSurface = 6.5 °C, CDH = 30%, and PET = 20%. In addition, the second study proposes a greater potential for improvement than the post-operam intervention by increasing the vegetation within the district towards the SW/SE. The final objective of this in-depth design is to be transferable to homogeneous cases of urban regeneration processes, with clear effects on the efficiency of microclimatic mitigation and thermal comfort.
Keywords: cool pavements, microclimate parameters (TAir, TSurface, Tmrt, CDH), mitigation strategies, outdoor thermal comfort (PET & UTCI)
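One of the metrics reported above, Cooling Degree Hours (CDH), is a simple accumulation over hourly air temperatures. A minimal sketch, assuming a 26 °C comfort base temperature (the base value is an assumption for illustration, not taken from the study):

```python
def cooling_degree_hours(hourly_temps_c, base_c=26.0):
    """Cooling Degree Hours: accumulated exceedance of hourly air
    temperature over a comfort base temperature. The 26 C default is
    an assumed base, not the study's value."""
    return sum(max(t - base_c, 0.0) for t in hourly_temps_c)
```

Comparing CDH computed from ante-operam and post-operam simulated temperature series gives the percentage reduction the study reports.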
Procedia PDF Downloads 202
6497 Enhancement of Aircraft Longitudinal Stability Using Tubercles
Authors: Muhammad Umer, Aishwariya Giri, Umaiyma Rakha
Abstract:
Mimicked from the humpback whale flipper, tubercle technology is seen to be particularly advantageous at high angles of attack. This advantage is of paramount importance for structures producing lift at high angles of attack, which makes the technology well suited to horizontal stabilizers. Selecting the horizontal stabilizer as the subject of study, and building on the advantages researchers have demonstrated on airfoils, this project aims to establish a foundation for applying the bio-mimicked technology to an existing aircraft. Using a baseline model and two tubercle-integrated configurations, the project pursues the twin aims of demonstrating the merits over the base model and of choosing the configuration that provides the best characteristics at high angles of attack. To facilitate this study, the required models are generated in SolidWorks, followed by trials in a virtual aerodynamic environment using ANSYS Fluent. Following a structured plan, the advantages are first quantified mathematically, the optimal configuration is then selected, and the final configuration is simulated at angles representative of the structure's actual operating envelope. Simulation of the baseline configuration at various angles of attack placed the stall angle at 22 degrees; the tubercle configurations are therefore simulated and compared at four angles of attack: 0, 10, 20, and 24 degrees. After identifying the optimum horizontal stabilizer configuration, the study proceeds to integration with the aircraft structure so that the results better reflect real-life application. This draws the project scope into longitudinal static stability considerations and improvements in manoeuvrability characteristics. The objective of the study is a complete overview, ready for real-life application, of the benefits obtainable from bio-mimicking tubercle technology.
Keywords: flow simulation, horizontal stabilizer, stability enhancement, tubercle
Procedia PDF Downloads 320
6496 Assessing the Macroeconomic Effects of Fiscal Policy Changes in Egypt: A Bayesian Structural Vector Autoregression Approach
Authors: Walaa Diab, Baher Atlam, Nadia El Nimer
Abstract:
Egypt faces many obvious economic challenges, and a real economic transformation is clearly needed to address them, especially after the recent decisions to float the Egyptian pound and to phase in subsidy cuts, which aim to meet the conditions for IMF support (a $12bn loan) for its economic reform program. The paper follows the post-2008 revival of interest in fiscal policy and its vital role in speeding up or slowing down economic growth; its value lies in analyzing the macroeconomic effects of fiscal policy in Egypt by applying a Bayesian SVAR approach. The Bayesian method is used because it incorporates prior information and omits no relevant information, making it well suited for rational, evidence-based decision-making. Since the study aims to identify the effects of fiscal policy shocks in Egypt, so as to help decision-makers determine the proper means of correcting the structural problems of the Egyptian economy, it should ideally cover the 1990s economic reform period; unfortunately, the available data for that period are only annual. The study therefore uses annual time series for 1991-2005 and quarterly data over 2006-2016. It uses a set of six main variables, including government expenditure and net tax revenues as the fiscal policy arms affecting real GDP, unemployment, inflation, and the interest rate. The study also assesses 'crowding out' effects by considering the impact of government spending and government revenue shocks on the composition of GDP, namely on private consumption and private investment. Last but not least, the study provides policy implications regarding the needed role of fiscal policy in Egypt in the upcoming economic reform, building on the results it draws from the previous reform program.
Keywords: fiscal policy, government spending, structural vector autoregression, taxation
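The workhorse output of such an SVAR exercise is a set of impulse responses under an identification scheme, most commonly recursive (Cholesky) ordering of the reduced-form shocks. A minimal VAR(1) impulse-response sketch under that assumption is given below; the matrices are illustrative, not estimates from Egyptian data, and the Bayesian estimation step itself is omitted:

```python
import numpy as np

def var1_irf(A, Sigma, horizons):
    """Impulse responses of a VAR(1) y_t = A y_{t-1} + u_t with
    reduced-form covariance Sigma, identified recursively via the
    Cholesky factor (a common benchmark scheme for fiscal SVARs)."""
    P = np.linalg.cholesky(Sigma)      # structural impact matrix (lower-tri)
    irf = [P]                          # horizon-0 response = impact matrix
    for _ in range(horizons - 1):
        irf.append(A @ irf[-1])        # propagate the shock through A
    return np.array(irf)               # shape (horizons, n, n)
```

Column j of each horizon matrix then traces how variable i responds over time to a one-standard-deviation structural shock j, e.g. of government spending.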
Procedia PDF Downloads 279
6495 Genetic Algorithm for In-Theatre Military Logistics Search-and-Delivery Path Planning
Authors: Jean Berger, Mohamed Barkaoui
Abstract:
Discrete search path planning in a time-constrained, uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far for computing efficient path plans in near real-time are mainly limited to providing few-move solutions. A new information-theoretic, open-loop decision model that explicitly incorporates false-alarm sensor readings is presented to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback. The decision model consists in minimizing expected entropy over a given time horizon, considering the anticipated possible observation outcomes; entropy represents a measure of uncertainty about the searched target location, and the model captures the uncertainty associated with observation events for all possible scenarios. Feedback information from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit target occupancy beliefs. For the first time, a compact belief update formulation is generalized to explicitly include false positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to solve the search path planning problem efficiently, providing near-optimal solutions for practical, realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment, progressively integrating real visit outcomes on a rolling time horizon, can easily be envisioned. Computational results show the value of the approach in comparison with alternative heuristics.
Keywords: search path planning, false alarm, search-and-delivery, entropy, genetic algorithm
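The false-positive belief update and the entropy objective described above can be sketched generically: a sensor that reports a detection with probability p_d when the target occupies the searched cell, and with false-alarm probability p_fa otherwise, yields the Bayes update below. This is a generic textbook form, not necessarily the paper's exact compact formulation:

```python
import math

def update_belief(belief, k, detected, p_d, p_fa):
    """Bayes update of target-occupancy beliefs after searching cell k
    with an imperfect sensor (detection prob p_d, false-alarm prob p_fa)."""
    like = []
    for j, b in enumerate(belief):
        if j == k:  # likelihood if the target is in the searched cell
            like.append(b * (p_d if detected else 1.0 - p_d))
        else:       # likelihood if the target is elsewhere
            like.append(b * (p_fa if detected else 1.0 - p_fa))
    z = sum(like)                    # normalizing constant
    return [l / z for l in like]

def entropy(belief):
    """Shannon entropy of the belief: the uncertainty the plan minimizes."""
    return -sum(b * math.log(b) for b in belief if b > 0.0)
```

A search plan is then scored by the expected entropy of the posterior over the anticipated observation outcomes along the path, which is the quantity the genetic algorithm minimizes.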
Procedia PDF Downloads 360
6494 Cooperative Agents to Prevent and Mitigate Distributed Denial of Service Attacks of Internet of Things Devices in Transportation Systems
Authors: Borhan Marzougui
Abstract:
The Road and Transport Authority (RTA) is moving ahead with implementing the leadership's vision of exploring all avenues that may bring better security and safety services to the community. Smart transport means using smart technologies such as the Internet of Things (IoT), a technology that continues to affirm its important role in information and transportation systems. IoT is a network of Internet-connected objects able to collect and exchange data using embedded sensors. With the growth of IoT, Distributed Denial of Service (DDoS) attacks are also growing exponentially, and they are a major, real threat to various transportation services. Current defense mechanisms are mainly passive in nature, and there is a need to develop a smart technique to handle them; indeed, new IoT devices are being recruited into botnets and accumulated for attackers' purposes. The aim of this paper is to provide a relevant understanding of the dangerous types of DDoS attack related to IoT and to provide valuable guidance for future IoT security methods. Our methodology is based on the development of a distributed algorithm that directs dedicated intelligent, cooperative agents to prevent and mitigate DDoS attacks. The proposed technique ensures preventive action when malicious packets start to be distributed through a connected node (a network of IoT devices). In addition, devices such as cameras and radio frequency identification (RFID) readers are connected within the secured network, and the data they generate are analyzed in real time by the intelligent, cooperative agents. The proposed security system is based on a multi-agent system. The obtained results have shown a significant reduction in the number of infected devices and enhanced capabilities of the different security devices.
Keywords: IoT, DDoS, attacks, botnet, security, agents
Procedia PDF Downloads 143
6493 Wasting Human and Computer Resources
Authors: Mária Csernoch, Piroska Biró
Abstract:
The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This approach has led to an extremely high number of incorrect documents, causing serious financial losses in the creating, modifying, and retrieving processes. Our research proved that there are at least two sources of this underachievement. (1) The lack of a definition of correctly edited, formatted documents: consequently, end-users do not know whether their methods and results are correct or not. They are not aware of their ignorance; indeed, they are so ignorant that their ignorance does not allow them to realize their lack of knowledge. (2) The end-users’ problem-solving methods: we have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, and these are proven less effective than deep-approach methods. Based on these findings, we have developed deep-approach methods that are based on, and adapted from, traditional programming languages. In this study, we focus on the most popular type of birotical document, the text-based document. We provide a definition of correctly edited text and, based on this definition, adapt the debugging method known from programming: before any real text editing, a thorough debugging of already existing texts and a categorization of the errors are carried out. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.
Keywords: deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources
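A toy illustration of the "debugging before editing" idea described above: a checker that scans an existing text and categorizes formatting errors, in the way a compiler reports line-numbered diagnostics. The error categories here are assumed examples for the sketch, not the authors' actual taxonomy:

```python
import re

def debug_text(text):
    """Scan plain text line by line and report (line number, category)
    pairs for common formatting errors. The three categories below are
    illustrative assumptions, not the paper's error catalogue."""
    errors = []
    for i, line in enumerate(text.splitlines(), 1):
        if "  " in line:
            errors.append((i, "multiple consecutive spaces"))
        if re.search(r"\s[,.;:!?]", line):
            errors.append((i, "space before punctuation"))
        if line != line.rstrip():
            errors.append((i, "trailing whitespace"))
    return errors
```

Running such a checker over existing documents, then discussing each reported category, mirrors the debug-then-edit sequence the study advocates.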
Procedia PDF Downloads 382
6492 The Impact of Client Leadership, Building Information Modelling (BIM) and Integrated Project Delivery (IPD) on Construction Project: A Case Study in UAE
Authors: C. W. F. Che Wan Putra, M. Alshawi, M. S. Al Ahbabi, M. Jabakhanji
Abstract:
The construction industry is a multi-disciplinary and multi-national industry with an important role to play within the overall economy of any country, yet it faces major challenges to improved performance. Particularly lacking is the ability to capture the large amounts of information generated during the life-cycle of projects and to make it available, in the right format, so that professionals can evaluate alternative solutions based on life-cycle analysis. The fragmented nature of the industry is the main reason behind the unavailability and poor utilisation of project information. The failure to adequately engage clients and manage their requirements contributes adversely to construction budget and schedule overruns. This is a difficult task, particularly if clients are not continuously and formally involved in the design and construction process, which means the design intent is left to designers who may not always satisfy clients’ requirements. Client leadership is strongly recognised as a driver of change through better collaboration between project stakeholders. However, one of the major challenges is that collaboration operates under conventional procurement methods, which hugely limit the stakeholders’ roles and responsibilities and thus the required level of collaboration. Research was conducted on a typical project in the UAE: a qualitative study, including semi-structured interviews with project partners, to discover the real reasons behind the project's delays. The case study also investigated the real causes of the problems and whether they can be adequately addressed by BIM and IPD. Special focus was placed on client leadership and the role the client can play in eliminating or minimising these problems. It was found that part of the 'key elements' from which the problems arise can be attributed to client leadership, the collaborative environment, and BIM.
Keywords: client leadership, building information modelling (BIM), integrated project delivery (IPD), case study
Procedia PDF Downloads 323
6491 Structural Analysis of Polymer Thin Films at Single Macromolecule Level
Authors: Hiroyuki Aoki, Toru Asada, Tomomi Tanii
Abstract:
The properties of a spin-cast film of a polymer material differ from those of the bulk material because the polymer chains are frozen in a non-equilibrium state by the rapid evaporation of the solvent. However, there has been little information on the un-equilibrated conformation and dynamics in a spin-cast film at the single-chain level. Real-space observation of individual chains would provide direct information for discussing the morphology and dynamics of single polymer chains, and the recent development of super-resolution fluorescence microscopy methods allows the conformational analysis of single polymer chains. In the current study, the conformation of a polymer chain in a spin-cast film was examined by super-resolution microscopy. Poly(methyl methacrylate) (PMMA) with a molecular weight of 2.2 x 10^6 was spin-cast onto a glass substrate from toluene and chloroform. For super-resolution fluorescence imaging, a small amount of PMMA labeled with a rhodamine spiroamide dye was added. The radius of gyration (Rg) was evaluated from the super-resolution fluorescence image of each PMMA chain. The root-mean-square Rg was 48.7 and 54.0 nm in the spin-cast films prepared from the toluene and chloroform solutions, respectively. On the other hand, the chain dimension in the bulk state (a thermally annealed 10-μm-thick sample) was observed to be 43.1 nm. This indicates that a PMMA chain in a spin-cast film takes an expanded conformation compared to the unperturbed chain, and that the chain dimension depends on the solvent quality. In a good solvent, the PMMA chain takes an expanded conformation due to the excluded volume effect, and the rapid solvent evaporation freezes the chain before it can relax from the un-equilibrated expanded conformation to an unperturbed one.
Keywords: chain conformation, polymer thin film, spin-coating, super-resolution optical microscopy
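The radius of gyration reported above has a direct estimator from super-resolution localization coordinates: the root-mean-square distance of the localizations from their centroid. A generic sketch of that estimator follows (not the authors' exact analysis pipeline, which may weight localizations or correct for localization precision):

```python
import math

def radius_of_gyration(points):
    """Rg of a single chain from 2D localization coordinates (x, y),
    e.g. in nm: root-mean-square distance from the centroid."""
    n = len(points)
    cx = sum(p[0] for p in points) / n        # centroid x
    cy = sum(p[1] for p in points) / n        # centroid y
    return math.sqrt(
        sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in points) / n
    )
```

Applying this per-chain and then taking the root-mean-square over chains gives the ensemble value quoted in the abstract.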
Procedia PDF Downloads 287
6490 Fault Prognostic and Prediction Based on the Importance Degree of Test Point
Authors: Junfeng Yan, Wenkui Hou
Abstract:
Prognostics and Health Management (PHM) is a technology to monitor equipment status and predict impending faults. It is used to predict potential faults, provide fault information, and track trends of system degradation by capturing characteristic signals, so the detection of characteristic signals is very important, and the selection of test points plays a central role in it. Traditionally, a dependency model is used to select the test points containing the most detection information; however, for large, complicated systems, the dependency model is sometimes not easily built, and the greater difficulty lies in calculating the matrix. Starting from this premise, the paper provides a highly effective method to select test points without a dependency model, based on the signal flow model: a diagnosis model built on failure modes, which focuses on the system's failure modes and the dependency relationship between test points and faults. In the signal flow model, fault information can flow from the beginning to the end, and the location and structure information of every test point and module can be found. We break the signal flow model up into serial and parallel parts to obtain the final relationship function between the system's testability or prediction metrics and the test points. Further, through partial derivative operations, we can obtain every test point's importance degree in determining the testability metrics, such as the undetected rate, false alarm rate, and untrusted rate. This contributes to installing test points according to the real requirements and also provides a solid foundation for Prognostics and Health Management. Judging by its effect in practical engineering applications, the method is very efficient.
Keywords: false alarm rate, importance degree, signal flow model, undetected rate, untrusted rate
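The importance degree via partial derivatives can be illustrated on a toy testability model in which a fault goes undetected only if every covering test point misses it. Both the model and the numeric differentiation below are illustrative assumptions, not the paper's signal-flow formulation:

```python
def undetected_rate(fault_probs, coverage):
    """Fraction of fault likelihood missed by every test point.
    coverage[i] lists the detection probabilities of the test points
    that observe fault i (a simple independence model, assumed here)."""
    missed = 0.0
    for p_f, dets in zip(fault_probs, coverage):
        miss = 1.0
        for d in dets:
            miss *= (1.0 - d)   # fault i escapes each covering test
        missed += p_f * miss
    return missed

def importance(fault_probs, coverage, i, j, eps=1e-6):
    """Importance degree of test point j for fault i: numeric partial
    derivative of the undetected rate w.r.t. its detection probability."""
    bumped = [list(d) for d in coverage]
    bumped[i][j] += eps
    return (undetected_rate(fault_probs, bumped)
            - undetected_rate(fault_probs, coverage)) / eps
```

The more negative the derivative, the more that test point reduces the undetected rate, which is the sense in which it is "important" for test-point placement.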
Procedia PDF Downloads 377
6489 The Angiogenic Activity of α-Mangostin in the Development of Zebrafish Embryo In Vivo
Authors: Titis Indah Adi Rahayu
Abstract:
Angiogenesis is the process of generating new capillaries from pre-existing blood vessels. VEGFA, a major regulator of angiogenesis, binds and activates two tyrosine kinase receptors, VEGFR1 (Flt-1) and VEGFR2 (Flk-1/KDR), which regulate pathological and physiological angiogenesis; disruption of VEGFA and VEGFR2 regulation leads to many diseases. Studies of α-mangostin (a xanthone derivative) as an anti-oxidant and anti-inflammatory agent have appeared recently, and both activities relate to the vasculature; however, the effect of α-mangostin on blood vessel formation in healthy tissue in vivo has not been studied. The zebrafish is a powerful model for studying angiogenesis, with many advantages as a viable whole-animal model for screening small molecules that affect blood vessel formation. The aim of this study is therefore to evaluate the angiogenic activity of α-mangostin in healthy tissue in vivo in the zebrafish embryo, in relation to blood vessel patterning. Blood vessel patterning is highly characteristic in the developing zebrafish embryo, and the subintestinal vessels (SIV) can be stained and visualized microscopically as a primary screen for effects of α-mangostin on angiogenesis. The zebrafish embryos were divided into two groups. In group one, embryos at 1 dpf were exposed for 4 days to α-mangostin at 2 µM, 4 µM, 6 µM, 8 µM, and 10 µM, whereas in group two, larvae at 4 dpf were exposed to α-mangostin at 1.75 µM, 2.3 µM, 2.9 µM, 3.8 µM, and 5 µM for 2 days. DMSO served as the control for each group. The expression levels of vegfa and vegfr2 were quantified by real-time qPCR, and the patterning of the SIV was then analyzed via alkaline phosphatase staining. The results show that the expression of vegfa and vegfr2 was repressed in the group exposed to α-mangostin from days 1-4, whereas it was increased in the group exposed from days 4-6. The results were then compared to the alkaline phosphatase staining of the SIV under a stereo microscope, which indicates that α-mangostin does not disturb the patterning of SIV formation in zebrafish.
Keywords: angiogenesis, Danio rerio, α-mangostin, SIV, vegfa, vegfr2
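Relative expression from real-time qPCR of the kind reported above is conventionally computed with the 2^-ΔΔCt (Livak) method. A minimal sketch with placeholder Ct values (the study's actual Ct data are not reproduced here, and a reference gene is assumed):

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method: Ct of the
    target gene (e.g. vegfa) is normalized to a reference gene in both
    treated and control samples, then the two are compared."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)
```

A fold change below 1 corresponds to repression of the target gene relative to control, as seen in the 1-4 dpf exposure group, and above 1 to increased expression, as in the 4-6 dpf group.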
Procedia PDF Downloads 342
6488 Anti-Obesity Activity of Garcinia xanthochymus: Biochemical Characterization and In vivo Studies in High Fat Diet-Rat Model
Authors: Mahesh M. Patil, K. A. Anu-Appaiah
Abstract:
Overweight and obesity are serious medical problems, increasing in prevalence and affecting millions worldwide. Investigators have been trying for decades to articulate the burden of obesity and its related risk factors. In answer to this problem, we propose new therapeutic anti-obesity compounds from the Garcinia xanthochymus fruit; there is, however, little published scientific information on non-hydroxycitric-acid Garcinia species. Our findings include a biochemical characterization of the fruit and in vivo toxicity and bio-efficacy studies of G. xanthochymus in a high-fat-diet Wistar rat model. We observed that the Garcinia pericarp is a rich source of organic acids, polyphenols, and mono- (40.63%) and poly-unsaturated fatty acids (16.45%; omega-3: 10.02%). Toxicological studies showed that Garcinia is safe, with a no-observed-adverse-effect level up to 400 mg/kg/day. Body weight and food intake were significantly (P<0.05) reduced over 13 weeks in rats treated by oral gavage with sonicated Garcinia powder, and subcutaneous fat was significantly (P<0.05) reduced in the Garcinia-treated rats. Hepatocytes significantly (P<0.05) overexpressed sterol regulatory element-binding protein 2, liver X receptor-α, liver X receptor-β, lipoprotein lipase, and monoacylglycerol lipase, while fatty acid-binding protein 1 and peroxisome proliferator-activated receptor-α were downregulated, as assessed by real-time qPCR. Our research is currently focused on adipocyte obesity-related gene expression, the effect of Garcinia on 3T3 adipocyte cell lines, and a high-fat-diet-induced mouse model. These in vivo pre-clinical data suggest that G. xanthochymus may have clinical utility for the treatment of obesity; however, further studies are required to establish its potency.
Keywords: Garcinia xanthochymus, anti-obesity, high fat diet, real time qPCR
Procedia PDF Downloads 252
6487 Delivering Safer Clinical Trials; Using Electronic Healthcare Records (EHR) to Monitor, Detect and Report Adverse Events in Clinical Trials
Authors: Claire Williams
Abstract:
Randomised controlled Trials (RCTs) of efficacy are still perceived as the gold standard for the generation of evidence, and whilst advances in data collection methods are well developed, this progress has not been matched for the reporting of adverse events (AEs). Assessment and reporting of AEs in clinical trials are fraught with human error and inefficiency and are extremely time and resource intensive. Recent research conducted into the quality of reporting of AEs during clinical trials concluded it is substandard and reporting is inconsistent. Investigators commonly send reports to sponsors who are incorrectly categorised and lacking in critical information, which can complicate the detection of valid safety signals. In our presentation, we will describe an electronic data capture system, which has been designed to support clinical trial processes by reducing the resource burden on investigators, improving overall trial efficiencies, and making trials safer for patients. This proprietary technology was developed using expertise proven in the delivery of the world’s first prospective, phase 3b real-world trial, ‘The Salford Lung Study, ’ which enabled robust safety monitoring and reporting processes to be accomplished by the remote monitoring of patients’ EHRs. This technology enables safety alerts that are pre-defined by the protocol to be detected from the data extracted directly from the patients EHR. Based on study-specific criteria, which are created from the standard definition of a serious adverse event (SAE) and the safety profile of the medicinal product, the system alerts the investigator or study team to the safety alert. Each safety alert will require a clinical review by the investigator or delegate; examples of the types of alerts include hospital admission, death, hepatotoxicity, neutropenia, and acute renal failure. 
This is achieved in near real-time; safety alerts can be reviewed along with any additional information available to determine whether they meet the protocol-defined criteria for reporting or withdrawal. This active surveillance technology reduces the resource burden of the more traditional methods of AE detection for investigators and study teams and can help eliminate reporting bias. Integration of multiple healthcare data sources enables much more complete and accurate safety data to be collected as part of a trial and can also provide an opportunity to evaluate a drug's safety profile long-term, in post-trial follow-up. By utilising this robust and proven method for safety monitoring and reporting, patient cohorts at much higher risk can be enrolled into trials, thus promoting inclusivity and diversity. Broadening eligibility criteria and adopting more inclusive recruitment practices in the later stages of drug development will increase the ability to understand the medicinal product's risk-benefit profile across the patient population that is likely to use the product in clinical practice. Furthermore, this ground-breaking approach to AE detection not only provides sponsors with better-quality safety data for their products but also reduces the resource burden on the investigator and study teams. With the data taken directly from the source, trial costs are reduced, minimal data validation is required, and near real-time reporting enables safety concerns and signals to be detected more quickly than in a traditional RCT.
Keywords: more comprehensive and accurate safety data, near real-time safety alerts, reduced resource burden, safer trials
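The protocol-defined alert logic described above can be sketched as a simple rule table evaluated against EHR-derived values. This is a minimal illustrative sketch: the field names, thresholds, and alert set are invented assumptions, not the actual proprietary system.

```python
# Hypothetical sketch of protocol-defined safety-alert screening against EHR
# data. Field names and thresholds are illustrative assumptions only.

ALERT_RULES = {
    # alert name -> predicate over a patient EHR record (a plain dict here)
    "hospital_admission": lambda r: r.get("admitted", False),
    "neutropenia": lambda r: r.get("neutrophils_10e9_per_L", 10.0) < 1.5,
    "acute_renal_failure": lambda r: r.get("creatinine_umol_per_L", 0.0) > 354,
    "hepatotoxicity": lambda r: r.get("alt_iu_per_L", 0.0) > 3 * 40,  # >3x ULN
}

def screen_record(record):
    """Return the list of safety alerts triggered by one EHR record.

    Each triggered alert would then go to the investigator or delegate for
    clinical review against the protocol's SAE reporting criteria.
    """
    return [name for name, rule in ALERT_RULES.items() if rule(record)]

patient = {"neutrophils_10e9_per_L": 0.9, "creatinine_umol_per_L": 400}
print(screen_record(patient))  # -> ['neutropenia', 'acute_renal_failure']
```

In a real deployment the predicates would be generated from the protocol and the product's safety profile rather than hard-coded.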
Procedia PDF Downloads 85
6486 Partitioning of Non-Metallic Nutrients in Lactating Crossbred Cattle Fed Buffers
Authors: Awadhesh Kishore
Abstract:
The goal of the study was to determine how different non-metallic nutrients are partitioned from feed in various physiological contexts and how buffer addition in ruminant nutrition affects these processes. Six lactating crossbred dairy cows (374±14 kg LW) were selected and divided into three groups on the basis of their phenotypic and productive features. Two treatments, T1 and T2, were randomly assigned to one animal from each group. Animals under T1 and T2 were moved to T2 and T1, respectively, after 30 days. Only T2 received buffers containing magnesium oxide and sodium bicarbonate at 0.02 and 0.01% of LW (the actual amounts were equivalent to 75.3±4.0 and 37.7±2.0 g/d, respectively). T1 served as the control. Wheat straw and berseem formed the base diet, while wheat grain and mustard cake made up the concentrate mixture. Following a 21-day feeding period, metabolic and milk production trials were carried out for seven consecutive days. Urine volume was determined from its calorific value using the Kearl equation. Chemical analyses were performed to determine the levels of nitrogen, carbohydrates, calories, and phosphorus in samples of feed, waste, buffer, mineral mixture, water, feces, urine, and milk. The data were analyzed statistically. Notable results included decreased partitioning of nitrogen and carbohydrates from feed to feces, increased partitioning of calories to milk and body storage, and increased partitioning of carbohydrates to body storage. Phosphorus balance was significantly better in T2. The application of buffers in ruminant diets was found to increase the output of calories in milk, as well as the amount of calories and carbohydrates stored in the body, while decreasing the amount of nitrogen in feces. As a result, introducing buffers to the feed of crossbred dairy cattle may be advised.
Keywords: cattle, magnesium oxide, non-metallic nutrients, partitioning, sodium bicarbonate
Procedia PDF Downloads 58
6485 [Keynote Speaker]: Some Similarity Considerations for Design of Experiments for Hybrid Buoyant Aerial Vehicle
Authors: A. U. Haque, W. Asrar, A. A Omar, E. Sulaeman, J. S. M. Ali
Abstract:
Buoyancy force applied on deformable symmetric bodies can be estimated by using the Archimedes principle. Such bodies, like ellipsoidal bodies, have a high volume-to-surface ratio and are isometrically scaled for mass, length, area, and volume to follow the square-cube law. For scaling up such bodies, it is worthwhile to find the scaling relationships between the other physical quantities that represent the thermodynamic, structural, and inertial response, etc. Thus, dimensionless similarities for finding an allometric scale can be developed by using the Buckingham π theorem, which utilizes the physical dimensions of the important parameters. Based on this, the physical dependencies of the buoyancy system are reviewed to find the set of physical variables for deformable bodies of revolution filled with an expandable gas like helium. Due to changes in atmospheric conditions, this gas changes its volume, and this change can affect the stability of elongated bodies on the ground as well as in the air. Special emphasis was given to the existing similarity parameters which can be used in the design of experiments for such bodies, whose shape is affected by external forces like drag, surface tension, and kinetic loads acting on the surface. All these similarity criteria are based on non-dimensionalization, which also needs to be considered for scaling up such bodies.
Keywords: Buckingham pi theorem, similitude, scaling, buoyancy
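The square-cube law invoked above can be illustrated numerically for an isometrically scaled ellipsoidal body: volume scales with the cube of the scale factor while surface area scales with its square, so the volume-to-surface ratio grows with size. The semi-axis values and the use of Thomsen's surface-area approximation below are assumptions for demonstration only.

```python
# Square-cube law check for an isometrically scaled ellipsoid.
# Surface area uses the common Thomsen approximation (p ~ 1.6075);
# exact values differ slightly, but the scaling exponents are exact.
import math

def ellipsoid_volume(a, b, c):
    return 4.0 / 3.0 * math.pi * a * b * c

def ellipsoid_surface(a, b, c, p=1.6075):
    s = ((a * b) ** p + (a * c) ** p + (b * c) ** p) / 3.0
    return 4.0 * math.pi * s ** (1.0 / p)

a, b, c = 3.0, 1.0, 1.0   # semi-axes of a baseline elongated envelope
k = 2.0                   # isometric scale factor

v_ratio = ellipsoid_volume(k * a, k * b, k * c) / ellipsoid_volume(a, b, c)
s_ratio = ellipsoid_surface(k * a, k * b, k * c) / ellipsoid_surface(a, b, c)
print(round(v_ratio, 6), round(s_ratio, 6))  # volume scales k^3, surface k^2
```

Because mass (and buoyant lift) follows volume while drag and skin loads follow area, this mismatch is exactly why scaled-up buoyant bodies need dimensionless similarity parameters rather than naive geometric scaling.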
Procedia PDF Downloads 376
6484 Chemical Warfare Agent Simulant by Photocatalytic Filtering Reactor: Effect of Operating Parameters
Authors: Youcef Serhane, Abdelkrim Bouzaza, Dominique Wolbert, Aymen Amin Assadi
Abstract:
Throughout history, the use of chemical weapons has not been exclusive to combat between army corps; some of these weapons are also found in highly targeted intelligence operations (political assassinations), organized crime, and terrorist organizations. To improve the speed of action, important technological devices have been developed in recent years, in particular in the field of protection and decontamination techniques, to better protect against and neutralize a chemical threat. In order to assess certain protective or decontaminating technologies, or to improve medical countermeasures, tests must be conducted. In view of the great toxicity of real chemical warfare agents, simulants can be used, chosen according to the desired application. Here, we present an investigation of the use of a photocatalytic filtering reactor (PFR) for highly contaminated environments containing diethyl sulfide (DES). This target pollutant is used as a simulant of a CWA, namely yperite (mustard gas). The influence of the inlet concentration, up to high DES concentrations (1200 ppmv, i.e., 5 g/m³ of air), has been studied. The conversion rate was also monitored under different relative humidities and different flow rates (respiratory flow; standards ISO/DIS 8996 and NF EN 14387 + A1). In order to understand the efficacy of pollutant neutralization by the PFR, a kinetic model based on the Langmuir–Hinshelwood (L–H) approach and taking into account the mass transfer step was developed. This allows us to determine the adsorption and kinetic degradation constants free of the influence of mass transfer. The obtained results confirm that this compact reactor configuration is an extremely promising way to use photocatalysis for the treatment of highly contaminated environments containing real chemical warfare agents.
They could also lead to an individual protection device (an autonomous cartridge for a gas mask).
Keywords: photocatalysis, photocatalytic filtering reactor, diethyl sulfide, chemical warfare agents
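The Langmuir–Hinshelwood form referred to above is commonly written r = kKC/(1 + KC), where K is the adsorption constant and k the kinetic degradation constant. The sketch below uses invented constants, not the study's fitted values; it simply shows the characteristic saturation of the rate toward k at high inlet concentration.

```python
# Illustrative Langmuir-Hinshelwood rate law: r = k * K * C / (1 + K * C).
# k and K below are hypothetical, for demonstration only.

def lh_rate(C, k, K):
    """Degradation rate for inlet concentration C (arbitrary units)."""
    return k * K * C / (1.0 + K * C)

k, K = 2.0, 0.01   # hypothetical kinetic and adsorption constants
for C in (10.0, 100.0, 1200.0):
    print(C, round(lh_rate(C, k, K), 4))
# at low C the rate is ~first-order in C; at high C it saturates toward k
```

Fitting k and K against conversion data measured at several inlet concentrations (with mass transfer accounted for separately, as in the abstract) is what lets the intrinsic kinetics be isolated.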
Procedia PDF Downloads 105
6483 Rock Property Calculation for Determining Hydrocarbon Zones Based on Petrophysical Principles and Sequence Stratigraphic Correlation in Blok M
Authors: Muhammad Tarmidzi, Reza M. G. Gani, Andri Luthfi
Abstract:
The purpose of this study is to identify rock zones containing hydrocarbons by calculating rock properties, including shale volume, total porosity, effective porosity, and water saturation. The rock properties are identified from the GR log, resistivity log, neutron log, and density log. Zoning is based on the sequence stratigraphic markers, namely the sequence boundary (SB), transgressive surface (TS), and flooding surface (FS), obtained by correlating ten well logs in Blok "M". The sequence stratigraphic correlation yields eight zones: two LST zones, three TST zones, and three HST zones. The rock property calculation in each zone shows that the two LST zones contain hydrocarbons. The LST-1 zone has an average shale volume (Vsh) of 25%, average total porosity (PHIT) of 14%, average effective porosity (PHIE) of 11%, and average water saturation of 0.83. The LST-2 zone has an average shale volume (Vsh) of 19%, average total porosity (PHIT) of 21%, average effective porosity (PHIE) of 17%, and average water saturation of 0.82.
Keywords: hydrocarbon zone, petrophysics, rock property, sequence stratigraphy
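The property chain described above (shale volume from the GR log, shale-corrected effective porosity, and water saturation) can be sketched as follows. The abstract does not state which saturation model was used, so Archie's equation is shown as a common stand-in; all log readings and parameters (GR endpoints, shale porosity, a, m, n, Rw, Rt) are hypothetical, not values from Blok "M".

```python
# Minimal petrophysical chain: Vsh -> PHIE -> Sw. All inputs hypothetical.

def shale_volume(gr, gr_clean, gr_shale):
    # linear gamma-ray index, clamped to [0, 1]
    v = (gr - gr_clean) / (gr_shale - gr_clean)
    return max(0.0, min(1.0, v))

def effective_porosity(phit, vsh, phi_shale=0.10):
    # total porosity corrected for the shale fraction's bound porosity
    return max(0.0, phit - vsh * phi_shale)

def archie_sw(rt, phie, rw=0.05, a=1.0, m=2.0, n=2.0):
    # Archie's equation: Sw = (a * Rw / (phi^m * Rt)) ** (1/n)
    return (a * rw / (phie ** m * rt)) ** (1.0 / n)

vsh = shale_volume(gr=60.0, gr_clean=20.0, gr_shale=180.0)   # -> 0.25
phie = effective_porosity(phit=0.14, vsh=vsh)                # -> 0.115
sw = archie_sw(rt=10.0, phie=phie)
print(round(vsh, 3), round(phie, 3), round(sw, 3))
```

A low computed Sw in a zone with adequate effective porosity is what flags a candidate hydrocarbon zone, as in the LST zones above.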
Procedia PDF Downloads 327
6482 Models of Environmental Crack Propagation of Some Aluminium Alloys (7xxx)
Authors: H. A. Jawan
Abstract:
This review describes models of environment-related crack propagation in aluminum alloys (7xxx) developed during the last few decades. Knowledge of the effects of different factors on the susceptibility to SCC permits valuable mechanisms of crack advancement to be proposed. A reliable cracking mechanism makes it possible to propose the optimum chemical composition and thermal treatment conditions, resulting in the microstructure most suitable for the real environmental conditions and stress state.
Keywords: microstructure, environmental, propagation, mechanism
Procedia PDF Downloads 418
6481 Inverse Scattering for a Second-Order Discrete System via Transmission Eigenvalues
Authors: Abdon Choque-Rivero
Abstract:
The Jacobi system with the Dirichlet boundary condition is considered on a half-line lattice when the coefficients are real-valued. The inverse problem of recovering the coefficients from various data sets containing the so-called transmission eigenvalues is analyzed. The Marchenko method is utilized to solve the corresponding inverse problem.
Keywords: inverse scattering, discrete system, transmission eigenvalues, Marchenko method
Procedia PDF Downloads 144
6480 A Framework of Dynamic Rule Selection Method for Dynamic Flexible Job Shop Problem by Reinforcement Learning Method
Authors: Rui Wu
Abstract:
In the volatile modern manufacturing environment, new orders occur randomly at any time, while pre-emptive methods are infeasible. This calls for a real-time scheduling method that can produce a reasonably good schedule quickly. The dynamic Flexible Job Shop problem is an NP-hard scheduling problem that hybridizes the dynamic Job Shop problem with the Parallel Machine problem. A Flexible Job Shop contains different work centres; each work centre contains parallel machines that can process certain operations. Many algorithms, such as genetic algorithms or simulated annealing, have been proposed to solve static Flexible Job Shop problems. However, the time efficiency of these methods is low, and they are not feasible for a dynamic scheduling problem. Therefore, a dynamic rule selection scheduling system based on reinforcement learning is proposed in this research, in which the dynamic Flexible Job Shop problem is divided into several parallel machine problems to decrease its complexity. Firstly, features of the jobs, machines, work centres, and flexible job shop are selected to describe the status of the dynamic Flexible Job Shop problem at each decision point in each work centre. Secondly, a reinforcement learning framework using a double-layer deep Q-learning network is applied to select proper composite dispatching rules based on the status of each work centre. Then, based on the selected composite dispatching rule, an available operation is selected from the waiting buffer and assigned to an available machine in each work centre. Finally, the proposed algorithm is compared with well-known dispatching rules on the objectives of mean tardiness, mean flow time, mean waiting time, and mean percentage of waiting time in the real-time Flexible Job Shop problem.
The simulation results show that the proposed framework has reasonable performance and time efficiency.
Keywords: dynamic scheduling problem, flexible job shop, dispatching rules, deep reinforcement learning
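The rule-selection idea can be sketched with a much smaller stand-in: a tabular Q-learner choosing among three classic dispatching rules (SPT, EDD, FIFO) on a toy single-machine queue, rewarded by negative tardiness. The real study uses a double-layer deep Q-network, composite rules, and richer work-centre features; everything below is an illustrative simplification.

```python
# Toy "learn which dispatching rule to apply" sketch. Hypothetical stand-in
# for the paper's double-layer DQN: tabular Q-values over a crude state
# (queue length), one machine, reward = negative tardiness of the chosen job.
import random

RULES = ["SPT", "EDD", "FIFO"]          # candidate dispatching rules

def apply_rule(rule, queue):
    """Pick the next job (proc_time, due_date, arrival) from the queue."""
    if rule == "SPT":
        return min(queue, key=lambda j: j[0])
    if rule == "EDD":
        return min(queue, key=lambda j: j[1])
    return min(queue, key=lambda j: j[2])  # FIFO

def episode(q_table, eps=0.1, alpha=0.2, seed=None):
    rng = random.Random(seed)
    queue = [(rng.randint(1, 9), rng.randint(5, 30), t) for t in range(6)]
    clock, tardiness = 0, 0.0
    while queue:
        state = min(len(queue), 5)               # crude state: queue length
        q = q_table.setdefault(state, {r: 0.0 for r in RULES})
        rule = (rng.choice(RULES) if rng.random() < eps
                else max(q, key=q.get))          # epsilon-greedy selection
        job = apply_rule(rule, queue)
        queue.remove(job)
        clock += job[0]
        reward = -max(0, clock - job[1])         # negative tardiness
        q[rule] += alpha * (reward - q[rule])    # 1-step update (no bootstrap)
        tardiness -= reward
    return tardiness

q_table = {}
results = [episode(q_table, seed=i) for i in range(200)]
print("mean tardiness over last 50 episodes:",
      round(sum(results[-50:]) / 50, 2))
```

The same loop structure (observe status, select a rule, dispatch, receive a delay-based reward, update the value estimate) carries over when the table is replaced by a deep Q-network over job/machine/work-centre features.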
Procedia PDF Downloads 108
6479 Responsibility to Protect in Practice: Libya and Syria
Authors: Guram Esakia, Giorgi Goguadze
Abstract:
The following paper overviews the concept of R2P, a new dimension in the field of International Relations. The paper contains a general description of the concept and its advantages and disadvantages. We also compare R2P with 'humanitarian intervention', trying to draw a clear division between these two approaches to conflict resolution. R2P in real action is also discussed: successful in Libya, and so far failed in Syria. The essay does not claim to be part of a scientific chain and is based on personal judgment as well as information gathered from various scholars and UN resolutions.
Keywords: the concept of R2P, humanitarian intervention, Libya, Syria
Procedia PDF Downloads 278
6478 Use of Numerical Tools Dedicated to Fire Safety Engineering for the Rolling Stock
Authors: Guillaume Craveur
Abstract:
This study shows the opportunity to use numerical tools dedicated to fire safety engineering for rolling stock. Indeed, some regulatory requirements can now be demonstrated by using numerical tools. The first part of this study presents the use of an evacuation modelling tool to satisfy the evacuation time criteria for rolling stock. The buildingEXODUS software is used to model and simulate the evacuation of rolling stock. Firstly, in order to demonstrate the reliability of this tool for calculating the complete evacuation time, a comparative study was carried out between a real test and simulations done with buildingEXODUS. Multiple simulations are performed to capture the stochastic variations in egress times. Then, a new study is done to calculate the complete evacuation time of a train with the same geometry but a different interior architecture. The second part of this study shows some applications of Computational Fluid Dynamics. This work presents a multi-scale validation of numerical simulations of standardized tests with the Fire Dynamics Simulator software developed by the National Institute of Standards and Technology (NIST). This work first addresses the cone calorimeter test, described in the standard ISO 5660, in order to characterize the fire reaction of materials. The aim of this process is to readjust measurement results from the cone calorimeter test in order to create a data set usable at the seat scale. In the second step, the modelling concerns the fire seat test described in the standard EN 45545-2. The data set obtained through the validation of the cone calorimeter test was used in the fire seat test. In the third step, after checking the data obtained for the seat from the cone calorimeter test, a larger-scale simulation with a real part of a train is achieved.
Keywords: fire safety engineering, numerical tools, rolling stock, multi-scale validation
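The stochastic-variation step mentioned above (running many evacuation simulations and summarizing the distribution of egress times) can be sketched as a toy Monte Carlo experiment. The time distributions below are invented assumptions, not buildingEXODUS output; only the repeat-and-summarize pattern is the point.

```python
# Toy Monte Carlo over stochastic egress times: run many evacuations,
# then report the mean and a high percentile, as one would when comparing
# a simulated distribution against a single real evacuation test.
import random
import statistics

def one_run(rng):
    """One stochastic evacuation run: total egress time in seconds."""
    walking = rng.gauss(45.0, 5.0)    # aisle walking under congestion
    queueing = rng.gauss(30.0, 8.0)   # queueing at the door
    return max(0.0, walking + queueing)

rng = random.Random(42)
times = sorted(one_run(rng) for _ in range(500))
print(f"mean = {statistics.mean(times):.1f} s, "
      f"95th percentile = {times[474]:.1f} s")
```

Reporting a percentile rather than a single run is what makes the comparison with a real test robust to run-to-run variation.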
Procedia PDF Downloads 303
6477 Artificial Neural Network and Satellite Derived Chlorophyll Indices for Estimation of Wheat Chlorophyll Content under Rainfed Condition
Authors: Muhammad Naveed Tahir, Wang Yingkuan, Huang Wenjiang, Raheel Osman
Abstract:
Numerous models are used in prediction and decision-making processes, but most of them are linear, and linear models reach their limitations with non-linearity in data; accurate estimation is therefore difficult. Artificial Neural Networks (ANNs) have found extensive acceptance in addressing the modeling of the complex real world in non-linear environments, as they have more general and flexible functional forms than traditional statistical methods. The link between information technology and agriculture will become firmer in the near future. Monitoring crop biophysical properties non-destructively can provide a rapid and accurate understanding of a crop's response to various environmental influences. Crop chlorophyll content is an important indicator of crop health and therefore of crop yield. In recent years, remote sensing has been accepted as a robust tool for site-specific management by detecting crop parameters at both local and large scales. The present research combined an ANN model with satellite-derived chlorophyll indices from LANDSAT 8 imagery for real-time wheat chlorophyll estimation. Cloud-free LANDSAT 8 scenes were acquired (Feb-March 2016-17) at the same time as the ground-truthing campaign for chlorophyll estimation using the SPAD-502. Different vegetation indices were derived from the LANDSAT 8 imagery using ERDAS Imagine (v.2014) software for chlorophyll determination, including the Normalized Difference Vegetation Index (NDVI), Green Normalized Difference Vegetation Index (GNDVI), Chlorophyll Absorption Ratio Index (CARI), Modified Chlorophyll Absorption Ratio Index (MCARI), and Transformed Chlorophyll Absorption Ratio Index (TCARI). For ANN modeling, MATLAB and SPSS (ANN) tools were used; the Multilayer Perceptron (MLP) in MATLAB provided very satisfactory results.
For training the MLP, 61.7% of the data were used; 28.3% were used for validation, and the remaining 10% were used to evaluate and validate the ANN model results. For error evaluation, the sum of squares error and the relative error were used. The ANN model summary showed a sum of squares error of 10.786 and an average overall relative error of 0.099. The MCARI and NDVI were revealed to be the more sensitive indices for assessing wheat chlorophyll content, with the highest coefficients of determination, R² = 0.93 and 0.90, respectively. The results suggest that the use of high-spatial-resolution satellite imagery for the retrieval of crop chlorophyll content with an ANN model provides an accurate, reliable assessment of crop health status at a larger scale, which can help in managing crop nutrition requirements in real time.
Keywords: ANN, chlorophyll content, chlorophyll indices, satellite images, wheat
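Two computational pieces of the workflow can be sketched directly: the index calculation (NDVI is the standard (NIR − Red)/(NIR + Red) ratio) and the two error measures quoted for the ANN. The reflectance values and SPAD readings below are invented for illustration, not data from the study.

```python
# NDVI from band reflectances, plus the two quoted error measures
# (sum of squares error, mean relative error). All values hypothetical.

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def sum_of_squares_error(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred))

def mean_relative_error(y_true, y_pred):
    return sum(abs(t - p) / t for t, p in zip(y_true, y_pred)) / len(y_true)

spad_true = [42.0, 38.5, 45.2, 40.1]   # hypothetical SPAD-502 readings
spad_pred = [41.2, 39.3, 44.0, 41.0]   # hypothetical ANN predictions
print(round(ndvi(0.45, 0.08), 3))                        # healthy vegetation
print(round(sum_of_squares_error(spad_true, spad_pred), 2))
print(round(mean_relative_error(spad_true, spad_pred), 4))
```

For Landsat 8 specifically, NIR and Red correspond to surface reflectance from bands 5 and 4; the other indices (GNDVI, CARI, MCARI, TCARI) are analogous band-ratio formulas.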
Procedia PDF Downloads 146
6476 Assessing the Social Impacts of Regional Services: The Case of a Portuguese Municipality
Authors: A. Camões, M. Ferreira Dias, M. Amorim
Abstract:
In recent years, the social economy has increasingly been seen as a viable means to address social problems. Social enterprises, as well as public projects and initiatives targeted at social purposes, offer organizational models characterized by heterogeneity, flexibility, and adaptability to 'real-world' problems. Despite the growing popularity of social initiatives, decision makers still face a paucity of available models and tools to adequately assess their sustainability and impacts, notably the nature of their contribution to economic growth. This study was carried out at the local level by analyzing the social impact initiatives and projects promoted by the Municipality of Albergaria-a-Velha (Câmara Municipal de Albergaria-a-Velha, CMA), a municipality of 25,000 inhabitants in the central region of Portugal. This work focuses on the challenges related to the qualifications and employability of citizens, which stand out as key concerns in the Portuguese economy, particularly expressive in the context of small-scale cities and inland territories. The study offers a characterization of the Municipality and its socio-economic structure and challenges, followed by an exploratory analysis of data from multiple sources, collected from the CMA's documental sources as well as from privileged informants. The purpose is to conduct a detailed analysis of the CMA's social projects, aimed at characterizing their potential impact on the qualifications and employability of the citizens of the Municipality. The study encompasses a discussion of the socio-economic profile of the municipality, notably its asymmetries, and an analysis of the social projects and initiatives, as well as of data derived from inquiries of the actors involved in the implementation of the social projects and their beneficiaries. Finally, the results obtained with the Better Life Index are included.
This study makes it possible to ascertain whether what is implicit in the literature corresponds to what is experienced in reality.
Keywords: measurement, municipalities, social economy, social impact
Procedia PDF Downloads 134
6475 Constructivist Design Approaches to Video Production for Distance Education in Business and Economics
Authors: C. von Essen
Abstract:
This study outlines and evaluates a constructivist design approach to the creation of educational video on postgraduate business degree programmes. Many online courses are tapping into the educational affordances of video, as this form of online learning has the potential to create rich, multimodal experiences. And yet, in many learning contexts, video is still being used to transmit instruction to passive learners rather than to promote learner engagement and knowledge creation. Constructivism posits the notion that learning is shaped as students make connections between their experiences and ideas. This paper pivots on the following research question: how can we design educational video in ways which promote constructivist learning and stimulate analytic viewing? By exploring and categorizing over two thousand educational videos created since 2014 for over thirty postgraduate courses in business, economics, mathematics, and statistics, this paper presents and critically reflects on a taxonomy of video styles and features. It links the pedagogical intent of video (be it concept explanation, skill demonstration, feedback, real-world application of ideas, community creation, or the cultivation of course narrative) to specific presentational characteristics such as visual effects, including diagrammatic and real-life graphics and animations, commentary and sound options, chronological sequencing, interactive elements, and presenter set-up. The findings of this study inform a framework which captures the pedagogical, technological, and production considerations instructional designers and educational media specialists should be conscious of when planning and preparing video. More broadly, the paper demonstrates how learning theory and technology can coalesce to produce informed and pedagogically grounded instructional design choices.
This paper reveals how crafting video in a more conscious and critical manner can produce powerful new educational design.
Keywords: educational video, constructivism, instructional design, business education
Procedia PDF Downloads 236
6474 Exploration of RFID in Healthcare: A Data Mining Approach
Authors: Shilpa Balan
Abstract:
Radio Frequency Identification, popularly known as RFID, is used to automatically identify and track tags attached to items. This study focuses on the application of RFID in healthcare, where its adoption is crucial for patient safety and inventory management. Data from RFID tags are used to identify the locations of patients and inventory in real time. Medical errors are thought to be a prominent cause of loss of life and injury, and a major advantage of RFID in the healthcare industry is the reduction of such errors. The healthcare industry has generated huge amounts of data; by discovering patterns and trends within these data, big data analytics can help improve patient care and lower healthcare costs. The growing number of research publications leading to innovations in RFID applications shows the importance of this technology. This study explores the current state of RFID research in healthcare using a text mining approach; no study has yet examined this topic using a data mining approach. Related articles on RFID were collected from healthcare journals and news sources, covering the years 2000 to 2015. Significant keywords on the topic of focus were identified and analyzed using open-source data analytics software such as RapidMiner. Such analytical tools help extract pertinent information from massive volumes of data. The main benefits of adopting RFID technology in healthcare are found to include tracking medicines and equipment, upholding patient safety, and improving security. The real-time tracking features of RFID allow for enhanced supply chain management. By using big data productively, healthcare organizations can gain significant benefits; big data analytics in healthcare enables improved decisions by extracting insights from large volumes of data.
Keywords: RFID, data mining, data analysis, healthcare
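The keyword-frequency step of such a text-mining approach can be sketched in a few lines. The tiny corpus and stopword list below are invented for illustration; the study itself used RapidMiner over articles from 2000 to 2015.

```python
# Minimal keyword-frequency sketch of the text-mining step: tokenize,
# drop stopwords, count. Corpus and stopwords are invented examples.
from collections import Counter
import re

STOPWORDS = {"the", "of", "in", "and", "to", "a", "is", "for", "with"}

corpus = [
    "RFID tags enable tracking of medicines and equipment in real time",
    "Patient safety improves with RFID tracking in the hospital",
    "RFID adoption reduces medical errors and improves inventory tracking",
]

counts = Counter(
    w for doc in corpus
    for w in re.findall(r"[a-z]+", doc.lower())
    if w not in STOPWORDS
)
print(counts.most_common(3))
```

On a real corpus this simple count is usually followed by stemming, bigram extraction, or TF-IDF weighting before trend analysis.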
Procedia PDF Downloads 233
6473 Tsunami Wave Height and Flow Velocity Calculations Based on Density Measurements of Boulders: Case Studies from Anegada and Pakarang Cape
Authors: Zakiul Fuady, Michaela Spiske
Abstract:
Inundation events, such as storms and tsunamis, can leave onshore sedimentary evidence like sand deposits or large boulders. These deposits store indirect information on the related inundation parameters (e.g., flow velocity, flow depth, wave height). One tool to reveal these parameters is inverse models that use the physical characteristics of the deposits to infer the magnitude of inundation. This study used boulders of the 2004 Indian Ocean Tsunami from Thailand (Pakarang Cape) and from a historical tsunami event that inundated the outer British Virgin Islands (Anegada). For the largest boulder found at Pakarang Cape, with a volume of 26.48 m³, the required tsunami wave height is 0.44 m and the storm wave height is 1.75 m (for a bulk density of 1.74 g/cm³). At Pakarang Cape, the highest required tsunami wave height is 0.45 m and the highest storm wave height is 1.8 m, for transporting a 20.07 m³ boulder. On Anegada, the largest boulder, with a diameter of 2.7 m, is a single coral head (species Diploria sp.) with a bulk density of 1.61 g/cm³, and requires a minimum tsunami wave height of 0.31 m and a storm wave height of 1.25 m. The highest required tsunami wave height on Anegada is 2.12 m for a boulder with a bulk density of 2.46 g/cm³ (volume 0.0819 m³), and the highest storm wave height is 5.48 m (volume 0.216 m³) for the same bulk density; the coral type is limestone. Generally, the higher the bulk density, volume, and weight of the boulders, the higher the minimum tsunami and storm wave heights required to initiate transport. Transporting the largest boulder at Pakarang Cape requires a flow velocity of 4.05 m/s according to Nott's equation (2003) and 3.57 m/s according to Nandasena et al. (2011), whereas on Anegada both equations require 3.41 m/s to transport the boulder with a 2.7 m diameter. Thus, boulder equations need to be handled with caution because they make many assumptions and simplifications.
Moreover, the physical boulder parameters, such as density and volume, need to be determined carefully to minimize errors.
Keywords: tsunami wave height, storm wave height, flow velocity, boulders, Anegada, Pakarang Cape
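The flow-velocity inversions cited above balance hydrodynamic forces against the boulder's submerged weight. The sketch below uses a deliberately simplified sliding threshold (drag versus friction acting on the submerged weight of a cube-shaped boulder); it is not the exact Nott (2003) or Nandasena et al. (2011) equation, which also include lift and shape coefficients, and all coefficient values here are assumptions.

```python
# Simplified sliding-initiation threshold for a submerged boulder:
# 0.5 * rho_w * Cd * A * u^2 >= mu * (rho_s - rho_w) * V * g
# NOT the exact published equations; all parameter values are assumptions.
import math

def threshold_velocity(volume, rho_s, rho_w=1025.0, cd=1.95,
                       mu=0.7, g=9.81):
    """Minimum flow velocity (m/s) to slide a boulder of volume (m^3) and
    bulk density rho_s (kg/m^3), idealized as a cube of side volume**(1/3)
    with frontal area side**2."""
    side = volume ** (1.0 / 3.0)
    area = side ** 2
    submerged_weight = (rho_s - rho_w) * volume * g
    return math.sqrt(2.0 * mu * submerged_weight / (rho_w * cd * area))

# largest Pakarang Cape boulder: 26.48 m^3, bulk density 1.74 g/cm^3
print(round(threshold_velocity(26.48, 1740.0), 2))
```

Even this crude balance shows the key sensitivity noted in the abstract: the threshold velocity rises with bulk density (through the submerged weight), which is why careful density measurement matters.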
Procedia PDF Downloads 238