Search results for: free software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7970

6140 Real-Time Demonstration of Visible Light Communication Based on Frequency-Shift Keying Employing a Smartphone as the Receiver

Authors: Fumin Wang, Jiaqi Yin, Lajun Wang, Nan Chi

Abstract:

In this article, we demonstrate a visible light communication (VLC) system over an 8-meter free-space transmission link, based on a commercial LED and a receiver connected to the audio interface of a smartphone. The signal is in the FSK modulation format. The successful experimental demonstration validates the feasibility of the proposed system for future wireless communication networks.
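As a rough illustration of the receiver side, the sketch below performs non-coherent FSK demodulation on audio-band samples by comparing the energy at the two tone frequencies in each symbol window; the sample rate, tone frequencies, and symbol duration are illustrative assumptions, not parameters reported in the abstract.

```python
# Minimal sketch (assumptions: 44.1 kHz audio sampling, 1 kHz / 2 kHz tones,
# 10 ms symbols). These parameters are illustrative, not from the paper.
import numpy as np

FS = 44_100            # audio interface sample rate (assumed)
F0, F1 = 1_000, 2_000  # FSK tone frequencies for bits 0 / 1 (assumed)
SYM = int(0.010 * FS)  # samples per symbol (10 ms, assumed)

def tone_energy(x, f):
    """Energy of x at frequency f via correlation with a complex exponential."""
    n = np.arange(len(x))
    return np.abs(np.dot(x, np.exp(-2j * np.pi * f * n / FS))) ** 2

def demodulate(samples):
    """Non-coherent FSK demodulation: pick the stronger tone per symbol window."""
    bits = []
    for k in range(0, len(samples) - SYM + 1, SYM):
        win = samples[k:k + SYM]
        bits.append(1 if tone_energy(win, F1) > tone_energy(win, F0) else 0)
    return bits
```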

Keywords: visible light communication, smartphone communication, frequency shift keying, wireless communication

Procedia PDF Downloads 374
6139 Initializing E-Classroom in a Multigrade School in the Philippines

Authors: Karl Erickson I. Ebora

Abstract:

Science and technology are two inseparable terms which bring wonders to all aspects of life such as education, medicine, food production and even the environment. In education, technology has become an integral part as it brings many benefits to the teaching-learning process. However, in the Philippines, one of the developing countries, resources are scarce and not all schools enjoy the fruits brought by technology. Much of this ordeal affects multigrade instruction. These schools are often the last priority in resource allocation since they have a limited number of students. In fact, it is not surprising that these schools do not have even a single computer unit, much less a computer laboratory. This paper sought to present a plan on how public schools would receive their e-classroom. Specifically, this paper sought to answer questions such as the level of school readiness in terms of facilities and equipment; the attitude of the respondents towards the use of the e-classroom; the level of teachers' familiarity with different e-classroom software; and the interventions planned by the school to make it e-classroom ready. After gathering and analysing the necessary data, this paper came up with the following conclusions: in terms of facilities and equipment, Guisguis Talon Elementary School (Main), though a multigrade school, is ready to receive an e-classroom. The respondents show a positive disposition towards technology utilization in teaching, as they strongly agree that technology plays an essential role in the teaching-learning process. They also strongly agree that technology is a good motivator; it makes teaching and learning more interesting and effective; it makes teaching easy; and it enhances students' learning. Additionally, teacher-respondents in Guisguis Talon Elementary School (Main) show familiarity with software. They are very familiar with MS Word, MS Excel, MS PowerPoint, and internet and email. Moreover, they are very familiar with basic e-classroom computer operations and basic application software. They are very familiar with MS Office and can do simple editing and formatting; access and save information from CD/DVD, external hard drives, USB and the like; and browse different search engines and educational sites effectively, downloading and uploading files. Likewise, respondents strongly agree with the interventions undertaken by the school to make it e-classroom ready. They strongly agree that funding and support are needed by the school; that stakeholders should be encouraged to consider donating equipment; that the school and community should try to mobilize their resources in order to help the school; that teachers should be provided with training in order for them to be technologically competent; and that principals and administrators should motivate their teachers to undergo continuous professional development.

Keywords: e-classroom, multi-grade school, DCP, classroom computers

Procedia PDF Downloads 193
6138 Study of the ν_4 Fundamental Band of the ¹²CD₄ Molecule

Authors: Kaarour Abdelkrim, Ouardi Okkacha, Meskine Mohamed

Abstract:

In this study, the ν_4 fundamental band of the ¹²CD₄ molecule has been studied by high-resolution infrared spectroscopy. Using the XTDS and SPEVIEW software and the tensorial formalism developed by the ICB (Laboratoire Interdisciplinaire de Bourgogne), several lines have been assigned and fitted with an acceptable standard deviation. This analysis allowed us to calculate several parameters of the ¹²CD₄ molecule.

Keywords: XTDS, SPEVIEW, tetrahedral tensorial formalism, rovibrational band

Procedia PDF Downloads 321
6137 Recent Advances in Data Warehouse

Authors: Fahad Hanash Alzahrani

Abstract:

This paper describes some recent advances in the quickly developing area of data storage and processing based on Data Warehouses and Data Mining techniques, covering the software, hardware, data mining algorithms and visualisation techniques that share common features across the specific problems and tasks of their implementation.

Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing

Procedia PDF Downloads 395
6136 Nanoemulsion Formulation of Ethanolic Extracts of Propolis and Its Antioxidant Activity

Authors: Rachmat Mauludin, Dita Sasri Primaviri, Irda Fidrianny

Abstract:

Propolis contains several antioxidant compounds which can be used in topical applications to protect skin against free radicals, prevent skin cancer and skin aging. A previous study showed that 70% ethanolic extract of propolis (EEP) provided the greatest antioxidant activity. Since EEP has very low solubility in water, the extract was prepared as a nanoemulsion (NE). Nanoemulsion was chosen as the cosmetic dosage form owing to its properties, namely decreasing the risk of skin irritation, increasing penetration, prolonging the time the preparation remains on the skin, and improving stability. Propolis was extracted by reflux and concentrated using a rotary evaporator. EEP was characterized with several tests such as phytochemical screening, density, and antioxidant activity using the DPPH method. Optimization of the total surfactant, co-surfactant, oil, and the amount of EEP that can be included in the NE was required to obtain the best NE formulation. The evaluations included organoleptic observation, globule size, polydispersity index, morphology using TEM, viscosity, pH, centrifugation, stability, freeze-and-thaw testing, radical scavenging activity using the DPPH method, and a primary irritation test. The extract yield was 11.12% of raw propolis and contained steroids/triterpenoids, flavonoids, and saponins based on phytochemical screening. EEP had a DPPH scavenging activity of 61.14% and an IC50 of 0.41629 ppm. The best NE formulation consisted of 26.25% Kolliphor RH40, 8.75% glycerine, 5% rice bran oil, and 3% EEP. The NE was transparent, with a globule size of 21.9 nm, a polydispersity index of 0.338, and a pH of 5.67. Based on TEM morphology, the NE was almost spherical, with a particle size below 50 nm. NE propolis proved to be physically stable after a 63-day stability test at 25 °C, centrifugation for 30 min at 13,000 rpm, and 6 cycles of freeze-and-thaw testing without separation. NE propolis reduced 58% of the DPPH free radical, similar to the antioxidant activity of the original extract. The antioxidant activity of NE propolis remained relatively stable after storage for 6 weeks. NE propolis was proven to be safe by a primary irritation test, with a primary irritation index (OECD) of 0. The best formulation for NE propolis contained 26.25% Kolliphor RH40, 8.75% glycerine, 5% rice bran oil, and 3% EEP, with a globule size of 21.9 nm and a polydispersity index of 0.338. NE propolis was stable and had antioxidant activity similar to EEP.
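For context, the DPPH figures quoted above follow from the standard radical-scavenging calculation; the sketch below shows that calculation and an IC50 estimate by interpolation. It is a generic illustration, not the authors' exact protocol, and the example absorbances are hypothetical.

```python
# Sketch of the standard DPPH calculations: percent scavenging from control and
# sample absorbances, and IC50 by linear interpolation over a dilution series.
import numpy as np

def scavenging_percent(a_control: float, a_sample: float) -> float:
    """% DPPH radical scavenging = (A_control - A_sample) / A_control * 100."""
    return (a_control - a_sample) / a_control * 100.0

def ic50(concentrations_ppm, scavenging_percents) -> float:
    """Concentration giving 50% scavenging, by linear interpolation."""
    c = np.asarray(concentrations_ppm, dtype=float)
    s = np.asarray(scavenging_percents, dtype=float)
    order = np.argsort(s)
    return float(np.interp(50.0, s[order], c[order]))

# Example with hypothetical absorbance readings:
# scavenging_percent(0.85, 0.33)  -> ~61.2 %
```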

Keywords: propolis, antioxidant, nanoemulsion, irritation test

Procedia PDF Downloads 296
6135 Diffusion MRI: Clinical Application in Radiotherapy Planning of Intracranial Pathology

Authors: Pomozova Kseniia, Gorlachev Gennadiy, Chernyaev Aleksandr, Golanov Andrey

Abstract:

In clinical practice, and especially in stereotactic radiosurgery planning, the significance of diffusion-weighted imaging (DWI) is growing. This makes software that can quickly process and reliably visualize diffusion data, and that is equipped with tools for analysing it for different tasks, essential. We are developing the «MRDiffusionImaging» software in standard C++. The subject part has been moved to separate class libraries and can be used on various platforms. The user interface uses Windows WPF (Windows Presentation Foundation), a technology for building Windows applications with access to all components of the .NET 5 or .NET Framework platform ecosystem. One of its important features is the declarative markup language XAML (eXtensible Application Markup Language), with which one can conveniently create, initialize and set properties of objects with hierarchical relationships. Graphics are generated using the DirectX environment. The MRDiffusionImaging software package has been implemented for processing diffusion magnetic resonance imaging (dMRI) and allows loading and viewing images sorted by series. An algorithm for "masking" dMRI series based on T2-weighted images was developed using a deformable surface model to exclude tissues that are not related to the area of interest from the analysis. An algorithm for distortion correction using deformable image registration based on autocorrelation of local structure has also been developed. The maximum voxel dimension was 1.03 ± 0.12 mm. In an elementary brain volume, the diffusion tensor is geometrically interpreted using an ellipsoid, which is an isosurface of the probability density of a molecule's diffusion. For the first time, non-parametric intensity distributions, neighborhood correlations, and inhomogeneities are combined in one algorithm for segmentation of white matter (WM), grey matter (GM), and cerebrospinal fluid (CSF). A tool for calculating the coefficient of average diffusion and the fractional anisotropy has been created, on the basis of which quantitative maps can be built for solving various clinical problems. Functionality has been created that allows clustering and segmenting images to individualize the clinical volume of radiation treatment and further assess the response (median Dice score = 0.963 ± 0.137). White matter tracts of the brain were visualized using two algorithms: deterministic (fiber assignment by continuous tracking) and probabilistic, using the Hough transform. The proposed algorithms test candidate curves in each voxel, assigning to each one a score computed from the diffusion data, and then select the curves with the highest scores as the potential anatomical connections. White matter fibers were visualized using a Hough transform tractography algorithm. In the context of functional radiosurgery, it is possible to reduce the volume of the internal capsule receiving 12 Gy from 0.402 cc to 0.254 cc. «MRDiffusionImaging» will improve the efficiency and accuracy of diagnostics and stereotactic radiotherapy of intracranial pathology. We are developing software with integrated, intuitive support for processing, analysis, and inclusion in the process of radiotherapy planning and evaluation of its results.
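The average diffusion coefficient and fractional anisotropy maps mentioned above are conventionally computed per voxel from the eigenvalues of the diffusion tensor; a minimal sketch of those standard formulas is given below (illustrative only, not code from «MRDiffusionImaging»).

```python
# Standard diffusion-tensor scalar maps: mean diffusivity (average diffusion
# coefficient) and fractional anisotropy from the eigenvalues of the 3x3 tensor.
import numpy as np

def md_fa(tensor: np.ndarray) -> tuple[float, float]:
    """Return (mean diffusivity, fractional anisotropy) of a symmetric 3x3 tensor."""
    lam = np.linalg.eigvalsh(tensor)      # eigenvalues l1 <= l2 <= l3
    md = lam.mean()                       # MD = (l1 + l2 + l3) / 3
    num = np.sqrt(((lam - md) ** 2).sum())
    den = np.sqrt((lam ** 2).sum())
    fa = np.sqrt(1.5) * num / den if den > 0 else 0.0
    return float(md), float(fa)
```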

Keywords: diffusion-weighted imaging, medical imaging, stereotactic radiosurgery, tractography

Procedia PDF Downloads 77
6134 Ultra-High Molecular Weight Polyethylene (UHMWPE) for Radiation Dosimetry Applications

Authors: Malik Sajjad Mehmood, Aisha Ali, Hamna Khan, Tariq Yasin, Masroor Ikram

Abstract:

Ultra-high molecular weight polyethylene (UHMWPE) is one of the polymers belonging to the polyethylene (PE) family, with the monomer –CH2– and an average molecular weight of approximately 3-6 million g/mol. Due to its chemical, mechanical, physical and biocompatible properties, it has been extensively used in the fields of electrical insulation, medicine, orthopedics, microelectronics, engineering, chemistry and the food industry. In order to alter or modify the properties of UHMWPE for a particular application of interest, various procedures are in practice, e.g., treating the material with high-energy irradiation such as gamma rays, e-beams, and ion bombardment. Radiation treatment of UHMWPE induces free radicals within its matrix, and these free radicals are the precursors of chain scission, chain accumulation, formation of double bonds, molecular emission, crosslinking, etc. All the aforementioned physical and chemical processes are mainly responsible for the modification of polymer properties for use in particular applications of interest, e.g., to fabricate LEDs, optical sensors, antireflective coatings, polymeric optical fibers, and, most importantly, for radiation dosimetry applications. Therefore, to check the feasibility of using UHMWPE for radiation dosimetry applications, compressed sheets of UHMWPE were irradiated at room temperature (~25 °C) to total dose values of 30 kGy and 100 kGy, respectively, while one sheet was kept un-irradiated as a reference. Transmittance data (from 400 nm to 800 nm) of the e-beam irradiated UHMWPE and its hybrids were measured using a Mueller matrix spectro-polarimeter. As a result, significant changes occur in the absorption behavior of the irradiated samples. To analyze these (radiation-induced) changes in the polymer matrix, the Urbach edge method and the modified Tauc's equation have been used. The results reveal that the optical activation energy decreases with irradiation. The values of the activation energy are 2.85 meV, 2.48 meV, and 2.40 meV for the control, 30 kGy, and 100 kGy samples, respectively. Direct and indirect energy band gaps were also found to decrease with irradiation due to the variation of C=C unsaturation in clusters. We believe that the reported results open new horizons for radiation dosimetry applications.
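For reference, this kind of analysis usually relies on the standard forms of Tauc's relation and the Urbach rule; the expressions below are those generic forms, and the exact fitting conventions used by the authors may differ.

```latex
% Tauc relation: m = 1/2 for direct allowed and m = 2 for indirect allowed transitions
\alpha h\nu = B\,(h\nu - E_g)^{m}
% Urbach rule: the Urbach energy E_U is the inverse slope of ln(alpha) vs photon energy
\alpha(h\nu) = \alpha_0 \exp\!\left(\frac{h\nu}{E_U}\right)
\quad\Rightarrow\quad
E_U = \left[\frac{\mathrm{d}\,\ln\alpha}{\mathrm{d}(h\nu)}\right]^{-1}
```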

Keywords: electron beam, radiation dosimetry, Tauc’s equation, UHMWPE, Urbach method

Procedia PDF Downloads 404
6133 Efficient Study of Substrate Integrated Waveguide Devices

Authors: J. Hajri, H. Hrizi, N. Sboui, H. Baudrand

Abstract:

This paper presents a study of SIW (Substrate Integrated Waveguide) circuits with a rigorous and fast original approach based on an iterative process (WCIP). The suggested theoretical study is validated by the simulation of two different examples of SIW circuits. The obtained results are in good agreement with measurements and with the HFSS software.

Keywords: convergence study, HFSS, modal decomposition, SIW circuits, WCIP method

Procedia PDF Downloads 493
6132 Temporal and Spatial Adaptation Strategies in Aerodynamic Simulation of Bluff Bodies Using Vortex Particle Methods

Authors: Dario Milani, Guido Morgenthal

Abstract:

Fluid dynamic computation of wind-induced forces on bluff bodies, e.g., light flexible civil structures or airplane wings approaching the ground at high incidence, is one of the major criteria governing their design. For such structures a significant dynamic response may result, requiring the use of small-scale devices such as guide-vanes in bridge design to control these effects. The focus of this paper is on the numerical simulation of the bluff body problem involving multiscale phenomena induced by small-scale devices. One of the solution methods for the CFD simulation that is relatively successful in this class of applications is the Vortex Particle Method (VPM). The method is based on a grid-free Lagrangian formulation of the Navier-Stokes equations, where the velocity field is modeled by particles representing local vorticity. These vortices are convected by the free-stream velocity as well as diffused. This representation yields the main advantages of low numerical diffusion, compact discretization since the vorticity is strongly localized, implicit handling of the free-space boundary conditions typical for this class of FSI problems, and a natural representation of the vortex creation process inherent in bluff body flows. When the particle resolution reaches the Kolmogorov dissipation length, the method becomes a Direct Numerical Simulation (DNS). However, it is crucial to note that any solution method aims at balancing the computational cost against the achievable accuracy. In the classical VPM, if the fluid domain is discretized by Np particles, the computational cost is O(Np²). For the coupled FSI problem of interest, for example large structures such as long-span bridges, the aerodynamic behavior may be influenced or even dominated by small structural details such as barriers, handrails or fairings. For such geometrically complex and dimensionally large structures, resolving the complete domain with the conventional VPM particle discretization might become prohibitively expensive to compute even for moderate numbers of particles. It is possible to reduce this cost either by reducing the number of particles or by controlling their local distribution. It is also possible to increase the accuracy of the solution without substantially increasing the global computational cost by computing a correction of the particle-particle interaction in some regions of interest. In this paper, different strategies are presented in order to extend the conventional VPM so as to reduce the computational cost whilst resolving the required details of the flow. The methods include temporal sub-stepping, to increase the accuracy of the particle convection in certain regions, as well as dynamically re-discretizing the particle map to control the global and the local number of particles. Finally, these methods are applied to a test case, and the improvements in the efficiency as well as the accuracy of the proposed extensions to the method are presented, along with their relevant applications.
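To make the O(Np²) cost and the sub-stepping idea concrete, the sketch below shows a bare-bones 2D vortex particle step: an all-pairs regularized Biot-Savart sum followed by convection with optional temporal sub-steps. The kernel regularization and all parameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal 2D vortex particle sketch: the O(Np^2) particle-particle induced
# velocity plus naive temporal sub-stepping of the convection step.
import numpy as np

def induced_velocity(pos, gamma, eps=1e-3):
    """O(Np^2) Biot-Savart sum: velocity induced at every particle by all others."""
    dx = pos[:, 0][:, None] - pos[:, 0][None, :]
    dy = pos[:, 1][:, None] - pos[:, 1][None, :]
    r2 = dx**2 + dy**2 + eps**2                 # regularized squared distance
    u = -gamma[None, :] * dy / (2 * np.pi * r2)
    v = gamma[None, :] * dx / (2 * np.pi * r2)
    return np.stack([u.sum(axis=1), v.sum(axis=1)], axis=1)

def convect(pos, gamma, u_inf, dt, substeps=1):
    """Advance particle positions; substeps > 1 refines convection locally in time."""
    h = dt / substeps
    for _ in range(substeps):
        pos = pos + h * (u_inf + induced_velocity(pos, gamma))
    return pos
```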

Keywords: adaptation, fluid dynamic, remeshing, substepping, vortex particle method

Procedia PDF Downloads 255
6131 Macroscopic Support Structure Design for the Tool-Free Support Removal of Laser Powder Bed Fusion-Manufactured Parts Made of AlSi10Mg

Authors: Tobias Schmithuesen, Johannes Henrich Schleifenbaum

Abstract:

The additive manufacturing process laser powder bed fusion (LPBF) offers many advantages over conventional manufacturing processes. For example, almost any complex part can be produced, such as topologically optimized lightweight parts, which would be inconceivable with conventional manufacturing processes. A major challenge posed by the LPBF process, however, is, in most cases, the need to use and remove support structures on critically inclined part surfaces (α < 45° with respect to the substrate plate). These are mainly used for dimensionally accurate mapping of part contours and to reduce distortion by absorbing process-related internal stresses. Furthermore, they serve to transfer the process heat to the substrate plate and are, therefore, indispensable for the LPBF process. A major challenge for the economical use of the LPBF process in industrial process chains is currently still the high manual effort involved in removing support structures. According to the state of the art (SoA), the parts are usually treated with simple hand tools (e.g., pliers, chisels) or by machining (e.g., milling, turning). New automatable approaches are the removal of support structures by means of wet chemical ablation and thermal deburring. According to the state of the art, the support structures are essentially adapted to the LPBF process and not to potential post-processing steps. The aim of this study is the determination of support structure designs that are adapted to the mentioned post-processing approaches. In the first step, the essential boundary conditions for complete removal by means of the respective approaches are identified. Afterward, a representative demonstrator part with various macroscopic support structure designs will be LPBF-manufactured and tested with regard to complete powder and support removability. Finally, based on the results, potentially suitable support structure designs for the respective approaches will be derived. The investigations are carried out using the example of the aluminum alloy AlSi10Mg.

Keywords: additive manufacturing, laser powder bed fusion, laser beam melting, selective laser melting, post processing, tool-free, wet chemical ablation, thermal deburring, aluminum alloy, AlSi10Mg

Procedia PDF Downloads 86
6130 Integration of EEG and Motion Tracking Sensors for Objective Measure of Attention-Deficit Hyperactivity Disorder in Pre-Schoolers

Authors: Neha Bhattacharyya, Soumendra Singh, Amrita Banerjee, Ria Ghosh, Oindrila Sinha, Nairit Das, Rajkumar Gayen, Somya Subhra Pal, Sahely Ganguly, Tanmoy Dasgupta, Tanusree Dasgupta, Pulak Mondal, Aniruddha Adhikari, Sharmila Sarkar, Debasish Bhattacharyya, Asim Kumar Mallick, Om Prakash Singh, Samir Kumar Pal

Abstract:

Background: We aim to develop an integrated device comprising a single-probe EEG and CCD-based motion sensors for a more objective measure of Attention-deficit Hyperactivity Disorder (ADHD). While the integrated device (MAHD) relies on the EEG signal (spectral density of the beta wave) for the assessment of attention during a given structured task (painting three segments of a circle using three different colors, namely red, green and blue), the CCD sensor depicts the movement pattern of the subjects engaged in a continuous performance task (CPT). A statistical analysis of the attention and movement patterns was performed, and the accuracy of the completed tasks was analysed using indigenously developed software. The device with the embedded software, called MAHD, is intended to improve certainty with criterion E (i.e., whether symptoms are better explained by another condition). Methods: We have used the EEG signal from a single-channel dry sensor placed on the frontal lobe of the head of the subjects (3-5 years old pre-schoolers). During the painting of three segments of a circle using three distinct colors (red, green, and blue), the absolute power of the delta and beta EEG waves from the subjects is found to be correlated with relaxation and attention/cognitive load conditions. While the relaxation condition of the subject hints at hyperactivity, a more direct CCD-based motion sensor is used to track the physical movement of the subject engaged in a continuous performance task (CPT), i.e., separation of variously colored balls from one table to another. We have used our indigenously developed software for the statistical analysis to derive a scale for the objective assessment of ADHD. We have also compared our scale with clinical ADHD evaluation. Results: In a limited clinical trial with preliminary statistical analysis, we have found a significant correlation between the objective assessment of the ADHD subjects and the clinician's conventional evaluation. Conclusion: MAHD, the integrated device, is intended to be an auxiliary tool to improve the accuracy of ADHD diagnosis by supporting greater criterion E certainty.
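As a rough sketch of the EEG side of the device, the code below computes absolute delta- and beta-band power from a single-channel trace via a Welch periodogram; the band edges and sampling rate are conventional assumptions rather than specifications of the MAHD hardware.

```python
# Band-power sketch: absolute delta and beta power from a single-channel EEG
# trace via a Welch power spectral density. Parameters are assumptions.
import numpy as np
from scipy.signal import welch

FS = 256  # Hz, assumed sampling rate of the dry EEG sensor

def band_power(eeg: np.ndarray, lo: float, hi: float) -> float:
    """Absolute power in [lo, hi] Hz, integrated from the Welch PSD."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(np.trapz(psd[mask], freqs[mask]))

def delta_beta(eeg: np.ndarray) -> tuple[float, float]:
    """Conventional band edges: delta 0.5-4 Hz, beta 13-30 Hz (assumed)."""
    return band_power(eeg, 0.5, 4.0), band_power(eeg, 13.0, 30.0)
```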

Keywords: ADHD, CPT, EEG signal, motion sensor, psychometric test

Procedia PDF Downloads 92
6129 Development of Positron Emission Tomography (PET) Tracers for the in-Vivo Imaging of α-Synuclein Aggregates in α-Synucleinopathies

Authors: Bright Chukwunwike Uzuegbunam, Wojciech Paslawski, Hans Agren, Christer Halldin, Wolfgang Weber, Markus Luster, Thomas Arzberger, Behrooz Hooshyar Yousefi

Abstract:

There is a need to develop a PET tracer that will enable the diagnosis and tracking of the progression of alpha-synucleinopathies (Parkinson's disease [PD], dementia with Lewy bodies [DLB], multiple system atrophy [MSA]) in living subjects over time. Alpha-synuclein aggregates (a-syn), which are present in all stages of disease progression, for instance in PD, are a suitable target for in vivo PET imaging. For this reason, we have developed some promising a-syn tracers based on a diarylbisthiazole (DABTA) scaffold. The precursors are synthesized via a modified Hantzsch thiazole synthesis and then radiolabeled via one- or two-step radiofluorination methods. The ligands were initially screened using a combination of molecular dynamics and quantum/molecular mechanics approaches in order to calculate the binding affinity to a-syn (in silico binding experiments). Experimental in vitro binding assays were also performed. The ligands were further screened in other experiments such as log D, in vitro plasma protein binding and plasma stability, and biodistribution and brain metabolite analyses in healthy mice. Radiochemical yields were in the range of 30%-72% in some cases. Molecular docking revealed possible binding sites in a-syn and also the free energy of binding to those sites (-28.9 to -66.9 kcal/mol), which correlated with the high binding affinity of the DABTAs to a-syn (Ki as low as 0.5 nM) and selectivity (> 100-fold) over Aβ and tau, which usually co-exist with a-syn in some pathologies. The log D values range from 2.34 to 2.88, which correlated with a free protein fraction of 0.28%-0.5%. Biodistribution experiments revealed that the tracers are taken up in the brain (5.6 %ID/g - 7.3 %ID/g) at 5 min post-injection (p.i.) and cleared out; values as low as 0.39 %ID/g were obtained at 120 min p.i. Analyses of the mice brains 20 min p.i. revealed almost no radiometabolites in the brain in most cases. It can be concluded that the in silico study presents a new avenue for the rational development of radioligands with suitable features. The results obtained so far are promising and encourage us to further validate the DABTAs in autoradiography, immunohistochemistry, and in vivo imaging in non-human primates and humans.

Keywords: alpha-synuclein aggregates, alpha-synucleinopathies, PET imaging, tracer development

Procedia PDF Downloads 228
6128 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom

Authors: Chih-Ping Chang

Abstract:

Unlike traditional objective evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without using other devices (such as a computer). The original identity of a simple screenshot taken from a social networking site must be questioned. When the police search and seize digital information, a common practice is to directly print out the digital data obtained and ask the parties present to sign it, without taking the original digital data back. In addition to the issue of original identity, this way of obtaining evidence may have two further consequences. First, it can easily be alleged that the evidence was tampered with because the police wanted to frame the suspect and falsified evidence. Second, it is not easy to discover hidden information. The core evidence associated with a crime may not appear in the contents of files. By examining the original file, data related to the file, such as the original producer, creation time, modification date, and even GPS location, can be revealed from hidden information. Therefore, how to present this kind of evidence in the courtroom is arguably the most important task for ruling on social media evidence. This article will first introduce forensic software, such as EnCase, TCT, and FTK, and analyze their function in proving identity with other digital data. Then, turning back to the court, the second part of this article will discuss the legal standard for authentication of social media evidence and the application of such forensic software in the courtroom. As the conclusion, this article will provide a rethinking: what kind of authenticity is this rule of evidence chasing? Does the legal system automatically operate the transcription of scientific knowledge? Or, furthermore, does it want to better render justice, not only under scientific fact, but through multivariate debate?

Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner

Procedia PDF Downloads 285
6127 Scanning Electron Microscopy for Analysis of the Effects of Surfactants on De-Wrinkling and Dispersion of Graphene

Authors: Kostandinos Katsamangas, Fawad Inam

Abstract:

Graphene was dispersed using a tip sonicator, and the effects of surfactants were analysed. Sodium dodecyl sulphate (SDS) and polyvinyl alcohol (PVA) were compared to observe, first, whether or not they had any effect on de-wrinkling and, second, whether they aided in achieving better dispersions. There is a huge demand for wrinkle-free graphene, as this would greatly increase its usefulness in various engineering applications. A comprehensive literature review on de-wrinkling graphene is discussed. Low-magnification Scanning Electron Microscopy (SEM) was conducted to assess the quality of graphene de-wrinkling. The utilization of PVA had a significant effect on de-wrinkling, whereas SDS had a minimal effect on the de-wrinkling of graphene.

Keywords: graphene, de-wrinkling, dispersion, surfactants, scanning electron microscopy

Procedia PDF Downloads 461
6126 Biomechanical Performance of the Synovial Capsule of the Glenohumeral Joint with a BANKART Lesion through Finite Element Analysis

Authors: Duvert A. Puentes T., Javier A. Maldonado E., Ivan Quintero., Diego F. Villegas

Abstract:

Mechanical computation is a great tool to study the performance of complex models. An example is the study of the structure of the human body. This paper took advantage of different types of software to make a 3D model of the glenohumeral joint and apply a finite element analysis. The main objective was to study the change in the biomechanical properties of the joint when it presents an injury, specifically a BANKART lesion, which consists of the detachment of the anteroinferior labrum from the glenoid. The stress and strain distribution of the soft tissues was the focus of this study. First, a 3D model was made of a joint without any pathology, as a control sample, using segmentation software for the bones with the support of medical imagery and a cadaveric model to represent the soft tissue. The joint was built to simulate a compression and external rotation test, using CAD to prepare the model in the adequate position. When the healthy model was finished, it was submitted to a finite element analysis, and the results were validated with experimental model data. A mesh sensitivity analysis was then performed on the validated model to obtain the best mesh size. Finally, the geometry of the 3D model was changed to imitate a BANKART lesion: the contact zone of the glenoid with the labrum was slightly separated, simulating a tissue detachment. With this new geometry, the finite element analysis was applied again, and the results were compared with the control sample created initially. The data gathered in this study can be used to improve the understanding of labrum tears. Nevertheless, it is important to remember that computational analyses are approximations and that the initial data were taken from an in vitro assay.

Keywords: biomechanics, computational model, finite elements, glenohumeral joint, bankart lesion, labrum

Procedia PDF Downloads 156
6125 Interface Designer as Cultural Producer: A Dialectic Materialist Approach to the Role of Visual Designer in the Present Digital Era

Authors: Cagri Baris Kasap

Abstract:

In this study, how interface designers can be viewed as producers of culture in the current era will be interrogated from a critical theory perspective. Walter Benjamin was a German Jewish literary critical theorist who, during the 1930s, was engaged in opposing and criticizing the Nazi use of art and media. ‘The Author as Producer’ is an essay that Benjamin read at the Communist Institute for the Study of Fascism in Paris. In this essay, Benjamin relates directly to the dialectics between base and superstructure and argues that authors, normally placed within the superstructure, should consider how writing and publishing are production and directly related to the base. Through it, he discusses what it could mean to see the author as a producer of his own text, as a producer of writing, understood as an ideological construct that rests on the apparatus of production and distribution. Benjamin concludes that the author must write in ways that relate to the conditions of production; he must do so in order to prepare his readers to become writers, and even make this possible for them by engineering an ‘improved apparatus’, and must work toward turning consumers into producers and collaborators. In today’s world, it has become a leading business model within the Web 2.0 services of multinational Internet technology and culture industries like Amazon, Apple and Google to transform readers, spectators, consumers or users into collaborators and co-producers through platforms such as Facebook, YouTube and Amazon’s CreateSpace Kindle Direct Publishing print-on-demand, e-book and publishing platforms. However, the way this transformation happens is tightly controlled and monitored by combinations of software and hardware. In these global-market monopolies, it has become increasingly difficult to get insight into how one’s writing and collaboration are used, captured, and capitalized as a user of Facebook or Google. Through the lens of this study, it could be argued that this criticism could very well be considered by digital producers or even by the mass of collaborators in contemporary social networking software. How do software and design incorporate users and their collaboration? Are they truly empowered; are they put in a position where they are able to understand the apparatus and how their collaboration is part of it? Or has the apparatus become a means against the producers? Thus, when using corporate systems like Google and Facebook, the iPhone and the Kindle without any control over the means of production, which is closed off by opaque interfaces and licenses that limit our rights of use and ownership, we are already the collaborators that Benjamin calls for. For example, the iPhone and the Kindle combine a specific use of technology to distribute the relations between the ‘authors’ and the ‘prodUsers’ in ways that secure their monopolistic business models by limiting the potential of the technology.

Keywords: interface designer, cultural producer, Walter Benjamin, materialist aesthetics, dialectical thinking

Procedia PDF Downloads 136
6124 Examining the Changes in Complexity, Accuracy, and Fluency in Japanese L2 Writing Over an Academic Semester

Authors: Robert Long

Abstract:

The results of a one-year study on the evolution of complexity, accuracy, and fluency (CAF) in the compositions of Japanese L2 university students throughout a semester are presented in this study. One goal was to determine whether any improvement in writing abilities had occurred over this academic term, while another was to examine methods of editing. Participants had 30 minutes to write each essay, with an additional 10 minutes allotted for editing. As for editing, participants were divided into two groups, one of which utilized an online grammar checker, while the other half self-edited their initial manuscripts. There was a total of 159 students from the three different institutions. The research questions focused on determining whether CAF had evolved over the previous year, identifying potential variations in editing techniques, and describing the connections between the CAF dimensions. According to the findings, there was some improvement in accuracy (fewer errors) in all three of the measures, whereas there was a marked decline in complexity and fluency. As for the second research aim, relating to the interaction among the three dimensions (CAF) and whether possible increases in fluency are offset by decreases in grammatical accuracy, the results showed a logically high correlation between clauses and word counts, between mean length of T-unit (MLT) and coordinate phrases per T-unit (CP/T), and between MLT and clauses per T-unit (C/T); furthermore, word counts and the errors/100 ratio correlated highly with error-free clause totals (EFCT). Syntactical complexity had a negative correlation with EFCT, indicating that greater syntactical complexity relates to decreased accuracy. Concerning the difference in error correction between those who self-edited and those who used an online grammar correction tool, the results indicated that the variable of error-free clause ratios (EFCR) showed the greatest difference regarding accuracy, with fewer errors noted for writers using an online grammar checker. As for possible differences between the first and second (edited) drafts regarding CAF, the results indicated positive changes in accuracy, with the most significant change seen in complexity (CP/T and MLT), while there were relatively insignificant changes in fluency. The results also indicated significant differences among the three institutions, with Fujian University of Technology having the highest fluency and accuracy. These findings suggest that to raise students' awareness of their overall writing development, teachers should support them in developing more complex syntactic structures, improving their fluency, and making more effective use of online grammar checkers.
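For readers unfamiliar with the CAF ratio measures named above, the sketch below computes them from manually obtained per-essay counts; it is a generic illustration, not the instrument used in the study.

```python
# Illustrative helper for the CAF ratio measures named in the abstract
# (MLT, C/T, CP/T, errors per 100 words, error-free clause ratio), computed
# from manually obtained counts; the counting itself is not automated here.
from dataclasses import dataclass

@dataclass
class EssayCounts:
    words: int
    t_units: int
    clauses: int
    coordinate_phrases: int
    errors: int
    error_free_clauses: int

def caf_measures(c: EssayCounts) -> dict:
    return {
        "MLT": c.words / c.t_units,                # mean length of T-unit
        "C/T": c.clauses / c.t_units,              # clauses per T-unit
        "CP/T": c.coordinate_phrases / c.t_units,  # coordinate phrases per T-unit
        "errors/100w": 100 * c.errors / c.words,
        "EFCR": c.error_free_clauses / c.clauses,  # error-free clause ratio
    }
```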

Keywords: complexity, accuracy, fluency, writing

Procedia PDF Downloads 26
6123 Lucilia Sericata Netrin-A: Secreted by Larval Salivary Glands as a Potential for Neuroregeneration

Authors: Hamzeh Alipour, Masoumeh Bagheri, Tahereh Karamzadeh, Abbasali Raz, Kourosh Azizi

Abstract:

Netrin-A, a protein identified for guiding commissural axons, has a similar role in angiogenesis. In addition, studies have shown that one of the Netrin-A receptors is expressed in the growing cells of small capillaries. It will be interesting to study this new group of molecules because their role in wound healing, via angiogenesis, will become clearer in the future. Larvae of the greenbottle blowfly Lucilia sericata (L. sericata) are increasingly used in maggot therapy of chronic wounds. The aim of this study was the identification of molecular features of Netrin-A in L. sericata larvae. Larvae were reared under standard maggotarium conditions. The nucleic acid sequence of L. sericata Netrin-A (LSN-A) was then identified using Rapid Amplification of cDNA Ends (RACE) and Rapid Amplification of Genomic Ends (RAGE). Parts of the Netrin-A gene, including the middle, 3′-, and 5′-ends, were identified, TA-cloned into the pTG19 plasmid, and transferred into DH5ɑ Escherichia coli. Each part was sequenced and assembled using SeqMan software. This gene structure was further subjected to in silico analysis. The DNA of LSN-A was identified to be 2407 bp, while its mRNA sequence was recognized as 2115 bp by Oligo 0.7 software. It translates into the Netrin-A protein with 704 amino acid residues. Its molecular weight is estimated to be 78.6 kDa. The 3-D structure of Netrin-A drawn by SWISS-MODEL revealed its similarity to human Netrin-1, with 66.8% identity. The LSN-A protein contributes to repairing the myelin membrane in neuronal cells. Ultimately, it can be an effective candidate for neural regeneration and wound healing. Furthermore, our next attempt is to deploy recombinant proteins for use in the medical sciences.

Keywords: maggot therapy, netrin-A, RACE, RAGE, lucilia sericata

Procedia PDF Downloads 102
6122 The Use of STIMULAN Resorbable Antibiotic Beads in Conjunction with Autologous Tissue Transfer to Treat Recalcitrant Infections and Osteomyelitis in Diabetic Foot Wounds

Authors: Hayden R Schott, John M Felder III

Abstract:

Introduction: Chronic lower extremity wounds in the diabetic and vasculopathic populations are associated with a high degree of morbidity. When wounds require more extensive treatment than can be offered by wound care centers, more aggressive solutions involve local tissue transfer and microsurgical free tissue transfer for achieving definitive soft tissue coverage. These procedures of autologous tissue transfer (ATT) offer resilient soft tissue coverage of limb-threatening wounds and confer promising limb salvage rates. However, chronic osteomyelitis and recalcitrant soft tissue infections are common in severe diabetic foot wounds and significantly complicate ATT procedures. Stimulan is a resorbable calcium sulfate antibiotic carrier. The use of Stimulan antibiotic beads to treat chronic osteomyelitis is well established in the orthopedic and plastic surgery literature. In these procedures, the beads are placed beneath the skin flap to deliver antibiotics directly to the infection site. The purpose of this study was to quantify the success of Stimulan antibiotic beads in treating recalcitrant infections in patients with diabetic foot wounds receiving ATT. Methods: A retrospective review of clinical and demographic information was performed on patients who underwent ATT with placement of Stimulan antibiotic beads for attempted limb salvage from 2018-21. Patients were analyzed for preoperative wound characteristics, demographics, infection recurrence, and adverse outcomes as a result of product use. The primary endpoint was 90-day infection recurrence, with secondary endpoints including 90-day complications. Outcomes were compared using basic statistics and Fisher's exact tests. Results: In this time span, 14 patients were identified. At the time of surgery, all patients exhibited clinical signs of active infection, including positive cultures and erythema. 57% of patients (n=8) exhibited chronic osteomyelitis prior to surgery, and 71% (n=10) had exposed bone at the wound base. Stimulan beads were placed beneath a free tissue flap in 57% of patients (n=8) and beneath a pedicled tissue flap in 42% of patients (n=6). In all patients, Stimulan beads were applied only once. Recurrent infections were observed in 28% of patients (n=4) at 90 days post-op, and flap nonadherence was observed in 7% (n=1). These were the only Stimulan-related complications observed. Ultimately, lower limb salvage was successful in 85% of patients (n=12). Notably, there was no significant association between the preoperative presence of osteomyelitis and recurrent infections. Conclusions: The use of Stimulan antibiotic beads to treat recalcitrant infections in patients receiving definitive skin coverage of diabetic foot wounds does not appear to demonstrate unnecessary risk. Furthermore, the lack of a significant association between the preoperative presence of osteomyelitis and recurrent infections indicates the successful use of Stimulan to dampen infection in patients with osteomyelitis, consistent with the literature. Further research is needed to confirm Stimulan as a significant contributor to infection treatment, using future cohort and case-control studies with more patients. Nonetheless, the use of Stimulan antibiotic beads in patients with diabetic foot wounds demonstrates successful infection suppression and maintenance of definitive soft tissue coverage.

Keywords: wound care, stimulan antibiotic beads, free tissue transfer, plastic surgery, wound, infection

Procedia PDF Downloads 82
6121 Life Cycle Datasets for the Ornamental Stone Sector

Authors: Isabella Bianco, Gian Andrea Blengini

Abstract:

The environmental impact related to ornamental stones (such as marbles and granites) is widely debated. Starting from the industrial revolution, continuous improvements of machinery led to a higher exploitation of this natural resource and to more international interaction between markets. As a consequence, the environmental impact of the extraction and processing of stones has increased. Nevertheless, compared with other building materials, ornamental stones are generally more durable, natural, and recyclable. From the scientific point of view, studies on stone life cycle sustainability have been carried out, but these are often partial or not very significant because of the high percentage of approximations and assumptions in the calculations. This is due to the lack, in life cycle databases (e.g., Ecoinvent, Thinkstep, and ELCD), of datasets about the specific technologies employed in the stone production chain. For example, the databases do not contain information about diamond wires, chains or explosives, materials commonly used in quarries and transformation plants. The project presented in this paper aims to populate the life cycle databases with data on specific stone processes. To this goal, the methodology follows the standardized approach of Life Cycle Assessment (LCA), according to the requirements of UNI 14040-14044 and to the International Reference Life Cycle Data System (ILCD) Handbook guidelines of the European Commission. The study analyses the processes of the entire production chain (from-cradle-to-gate system boundaries), including the extraction of benches, the cutting of blocks into slabs/tiles and the surface finishing. Primary data have been collected in Italian quarries and transformation plants which use technologies representative of the current state of the art. Since the technologies vary according to the hardness of the stone, the case studies comprise both soft stones (marbles) and hard stones (gneiss). In particular, data about energy, materials and emissions were collected in the marble basins of Carrara and in the Beola and Serizzo basins located in the province of Verbano Cusio Ossola. Data were then elaborated through appropriate software to build a life cycle model. The model was built with free parameters that allow an easy adaptation to specific productions. Through this model, the study aims to boost the direct participation of stone companies and encourage the use of the LCA tool to assess and improve the environmental sustainability of the stone sector. At the same time, the compilation of accurate Life Cycle Inventory data aims at making ILCD-compliant datasets of the most significant processes and technologies related to the ornamental stone sector available to researchers and stone experts.

Keywords: life cycle assessment, LCA datasets, ornamental stone, stone environmental impact

Procedia PDF Downloads 226
6120 Analysis on the Effectiveness of the "Three-Exemption" Policy Aimed at Promoting Unpaid Blood Donation in Zhejiang

Authors: Ni Tang, Jinping Zhang

Abstract:

An effective and sustainable volunteer team is needed to create a more available blood supply system. In order to promote the sustainable development of blood donation in Zhejiang Province, China, a “three-exemption” policy was introduced in 2014: blood donors who received the National Award for unpaid blood donation may visit government-invested and funded parks, scenic spots and other places for free, have outpatient fees waived at non-profit medical institutions, and be exempted from urban public transportation fees. As the policy has been in place for seven years, this study evaluated its effectiveness by comparing the increasing rate of blood donation in Hangzhou (the capital city of Zhejiang) before and after the policy using interrupted time series analysis. Blood donation in Anhui, a province near Zhejiang, was also compared as a negative control. Blood donation data from 2012 to 2018 were obtained from the donation centers' official websites. The increasing rate of blood donation volume since 2012 in Hangzhou is 34.37 units/month, and after 2014 the increasing rate additionally increases by 71.69 (p=0.1442), indicating a statistically non-significant change after the policy. As a negative control, in Anhui, the increasing rate of blood donation volume since 2012 is -163.3 units/month, and the increasing rate additionally increases by 167.2 (p=5.63e-07) after 2014. The result shows that the three-exemption policy had a certain level of impact on encouraging volunteers to donate blood, but the effect was not substantial. One possible reason for the ineffectiveness of the policy might be a lack of public awareness of it. On the other hand, this policy mainly waived non-essential living expenses, such as fares and scenic-spot entrance fees, and requires a certain number of blood donations, registration procedures, and blood donation certificates. Perhaps reducing essential living expenses, such as oil, water and electricity, could better attract people to participate in blood donation. This study of the three-exemption policy provides a new direction for promoting blood donation. Incentive policies may require greater publicity and stronger incentives. In order to better ensure the operation of the blood donation system, other policies, especially incentive policies, should be further explored.
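A standard interrupted time-series (segmented regression) model of the kind described above can be sketched as follows; the column names and data layout are hypothetical, and the study's actual model specification may differ.

```python
# Sketch of a segmented-regression interrupted time-series model: a pre-policy
# monthly trend plus a post-2014 change in level and slope.
import pandas as pd
import statsmodels.formula.api as smf

def fit_its(df: pd.DataFrame):
    """df needs columns: volume (monthly donation units), time (months since 2012),
    post (0/1 after the 2014 policy), time_post (months since the policy, else 0)."""
    model = smf.ols("volume ~ time + post + time_post", data=df).fit()
    # model.params['time']      -> pre-policy monthly trend (e.g., ~34.37 in Hangzhou)
    # model.params['time_post'] -> additional change in trend after the policy
    return model
```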

Keywords: blood donation, policy, Zhejiang, unpaid blood donation, three-exemption policy

Procedia PDF Downloads 201
6119 Virtual Reality and Avatars in Education

Authors: Michael Brazley

Abstract:

Virtual Reality (VR) and 3D videos are the most current generation of learning technology today. Virtual Reality and 3D videos are now being used in professional offices and schools for marketing and education. Technology in the field of design has progressed from two-dimensional drawings to 3D models, using computers and sophisticated software. Virtual Reality is being used as a collaborative means to allow designers and others to meet and communicate inside models or VR platforms using avatars. This research proposes to teach students from different backgrounds how to take a digital model into a 3D video, then into VR, and finally into VR with multiple avatars communicating with each other in real time. The next step would be to develop the model so that people from three or more different locations can meet as avatars in real time, in the same model, and talk to each other. This research is longitudinal, studying the use of 3D videos in graduate design and Virtual Reality in XR (Extended Reality) courses. The research methodology is a combination of quantitative and qualitative methods. The qualitative methods begin with the literature review and case studies. The quantitative methods come by way of students' 3D videos, a survey, and Extended Reality (XR) course work. The end product is a VR platform with multiple avatars able to communicate in real time. This research is important because it will allow multiple users to remotely enter a model or VR platform from any location in the world and effectively communicate in real time. This research will lead to improved learning and training using Virtual Reality and avatars, and it is generalizable because most colleges and universities, and many citizens, own VR equipment and computer labs. This research did produce a VR platform with multiple avatars having the ability to move and speak to each other in real time. Major implications of the research include, but are not limited to, improved learning, teaching, communication, marketing, designing, and planning. Both hardware and software played a major role in project success.

Keywords: virtual reality, avatars, education, XR

Procedia PDF Downloads 93
6118 Split Health System for Diabetes Care in Urban Area: Experience from an Action Research Project in an Urban Poor Neighborhood in Bengaluru

Authors: T. S. Beerenahally, S. Amruthavalli, C. M. Munegowda, Leelavathi, Nagarathna

Abstract:

Introduction: In the majority of urban India, the health system is split between different authorities responsible for the health care of the urban population. We believe that, apart from poor awareness and financial barriers to care, there are other health system barriers which affect quality of and access to care for people with diabetes. In this paper, we attempt to identify the health system complexity that determines access to the public health system for diabetes care in KG Halli, a poor urban neighborhood in Bengaluru. KG Halli was the locus of health systems research from 2009 to 2015. Methodology: The source of data is the observational field notes written by the research team as part of an urban health action research project (UHARP). The field notes included data from the community and the public primary care center. The data were generated by the community health assistants and the other research team members during regular home visits and interactions, over four years, with individuals who self-reported as diabetic, as part of UHARP. Results: It emerged during data analysis that the patients were not keen on utilizing the primary public health center for many reasons. Patients felt that the service provided at the center was not integrated. There was a lack of availability of medicines, with regular stock-outs of medicines during the year, and the laboratory service for investigations was limited. Many of them said that the time given by the providers was not sufficient, and there was also a feeling that providers did not listen to them attentively. Power dynamics played a huge role in communication. Only the consultation was available free of cost at the public primary care center. Patients had to pay for investigations, and the major portion of spending was on medicines. Conclusion: Diabetes is a chronic disease that poses an important emerging public health concern. Most of the financial burden is borne by the family, as the public facilities have failed to provide free care in India. Our study indicated various factors, including individual beliefs, stigma and financial constraints, affecting compliance with diabetes care.

Keywords: diabetes care, disintegrated health system, quality of care, urban health

Procedia PDF Downloads 154
6117 Statistical Pattern Recognition for Biotechnological Process Characterization Based on High Resolution Mass Spectrometry

Authors: S. Fröhlich, M. Herold, M. Allmer

Abstract:

Early-stage quantitative analysis of host cell protein (HCP) variations is challenging yet necessary for comprehensive bioprocess development. High resolution mass spectrometry (HRMS) provides a high-end technology for accurate identification alongside quantitative information. Here we describe a flexible HRMS assay platform to quantify HCPs relevant in microbial expression systems such as E. coli, in both upstream and downstream development, by means of MVDA tools. Cell pellets were lysed and proteins extracted; purified samples were not further treated before applying the SMART tryptic digest kit. Peptide separation was optimized using an RP-UHPLC separation platform. HRMS-MS/MS analysis was conducted on an Orbitrap Velos Elite applying CID. Quantification was performed label-free, taking into account ionization properties and physicochemical peptide similarities. Results were analyzed using SIEVE 2.0 (Thermo Fisher Scientific) and SIMCA (Umetrics AG). The developed HRMS platform was applied to an E. coli expression set with varying productivity and the corresponding downstream process. Selected HCPs were successfully quantified within the fmol range. Analysing HCP networks based on pattern analysis facilitated low-level quantification and enhanced validity. This approach is of high relevance for high-throughput screening experiments during upstream development, e.g., for titer determination, dynamic HCP network analysis or product characterization. Considering the downstream purification process, physicochemical clustering of identified HCPs is relevant for adjusting buffer conditions accordingly. Overall, the technology provides an innovative approach for label-free MS-based quantification relying on statistical pattern analysis and comparison. Absolute quantification based on physicochemical properties and a peptide similarity score provides a technological approach that does not require sophisticated sample preparation strategies and is therefore straightforward, sensitive and highly reproducible in terms of product characterization.

Keywords: process analytical technology, mass spectrometry, process characterization, MVDA, pattern recognition

Procedia PDF Downloads 242
6116 Study of the Motion of Impurity Ions in Poly(Vinylidene Fluoride) from the Viewpoint of the Microstructure of the Polymer Solid

Authors: Yuichi Anada

Abstract:

The electrical properties of polymer solids are characterized by dielectric relaxation phenomena. The complex permittivity shows a strong dependence on the frequency of the external stimulation over the broad frequency range from 0.1 mHz to 10 GHz. The complex-permittivity dispersion gives us a lot of useful information about the molecular motion of polymers and the structure of polymer aggregates. However, the large dispersion of permittivity at low frequencies, due to the DC conduction of impurity ions, often masks the dielectric relaxation in the polymer solid. In experimental investigations, many researchers have long tried to remove the DC conduction experimentally or analytically. Our laboratory, on the other hand, chose another approach to this problem through a reversal in thinking: to use the impurity ions responsible for the DC conduction as a probe to detect the motion of polymer molecules and to investigate the structure of polymer aggregates. In addition to the complex permittivity, the electric modulus and the conductivity relaxation time are strong tools for investigating the ionic motion in DC conduction. In the non-crystalline part of melt-crystallized polymers, free spaces of inhomogeneous size exist between crystallites. As the impurity ions exist in the non-crystalline part and move through these inhomogeneous free spaces, the motion of the ions reflects the microstructure of the non-crystalline part. The ionic motion of impurity ions in poly(vinylidene fluoride) (PVDF) is investigated in this study. The frequency dependence of the loss permittivity of PVDF shows the characteristic of direct current (DC) conduction below 1 kHz at 435 K. The electric modulus-frequency curve shows a dispersion with a single conductivity relaxation time, i.e., a Debye-type dispersion. The conductivity relaxation time obtained from this curve is 0.00003 s at 435 K. From the plot of the conductivity relaxation time of PVDF together with other polymers against permittivity, it was found that there are two groups of polymers: one group is characterized by a small conductivity relaxation time and a large permittivity, and the other by a large conductivity relaxation time and a small permittivity.
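For reference, the electric modulus referred to above is conventionally defined as the reciprocal of the complex permittivity, and the conductivity relaxation time is read from the peak of its imaginary part; the standard expressions (the authors' exact conventions are not given in the abstract) are:

```latex
M^{*}(\omega) = \frac{1}{\varepsilon^{*}(\omega)}
  = \frac{\varepsilon'}{\varepsilon'^{2} + \varepsilon''^{2}}
  + i\,\frac{\varepsilon''}{\varepsilon'^{2} + \varepsilon''^{2}},
\qquad
\tau_{\sigma} \simeq \frac{1}{2\pi f_{\mathrm{peak}}}
```

where f_peak is the frequency at which the imaginary part M''(ω) reaches its maximum; a Debye-type dispersion yields a single such peak, consistent with the single relaxation time reported.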

Keywords: conductivity relaxation time, electric modulus, ionic motion, permittivity, poly(vinylidene fluoride), DC conduction

Procedia PDF Downloads 165
6115 Grisotti Flap as Treatment for Central Tumors of the Breast

Authors: R. Pardo, P. Menendez, MA Gil-Olarte, S. Sanchez, E. García, R. Quintana, J. Martín

Abstract:

Introduction: Within oncoplastic breast surgery there is increasing interest in immediate partial breast reconstruction, in which the volume resected is greater than that of conventional conservative techniques. Central tumours of the breast have classically been treated with mastectomy because of concerns about oncological safety and the cosmetic consequences of wide central resection of the nipple and the breast tissue beneath. Yet oncological results for central quadrantectomy show a recurrence rate, disease-free interval and survival identical to mastectomy. The Grisotti flap is an oncoplastic surgical technique that allows the surgeon to perform a safe central quadrantectomy with excellent cosmetic results. Material and methods: The Grisotti flap is a glandular-cutaneous advancement-rotation flap that can fill the defect in the central portion of the excised breast. If the inferior border is affected by tumour and further surgery is decided upon at the multidisciplinary team meeting, it will be necessary to perform a mastectomy. All patients undergoing surgery with a Grisotti flap since 2009 were reviewed, obtaining the following data: age, histopathological diagnosis, size, operating time, volume of tissue resected, postoperative admission time, re-excisions due to margins affected by tumour, wound dehiscence, complications and recurrence. The analysis and results of sentinel node biopsy were also obtained. Results: Twelve patients underwent surgery between 2009 and 2015. The mean age was 54 years (range 34-67). All had a preoperative diagnosis of infiltrating ductal carcinoma of less than 2 cm. Diagnosis was made with ultrasound, mammography or both; magnetic resonance imaging was used in 5 cases. No patient had a positive axilla on preoperative ultrasound exploration. Mean operating time was 104 minutes (84-130). Postoperative stay was 24 hours. Mean volume resected was 159 cc (70-286). In one patient the surgical margin was affected by tumour, and a further resection of the affected margin was performed as ambulatory surgery. The sentinel node biopsy was positive for micrometastasis in only two cases: in one case, treated in 2009, lymphadenectomy was performed; in the other, treated in 2015, no lymphadenectomy was performed, as the patient had a favourable histopathological prognosis and the multidisciplinary team meeting agreed that it was not required. No recurrence has been diagnosed in any of the patients who underwent surgery, and all are disease-free at present. Conclusions: Conservative surgery for retroareolar central tumours of the breast results in good local control of the disease with free surgical margins, including resection of the nipple-areola complex and the pectoralis major muscle fascia. Reconstructive surgery with the inferior Grisotti flap adequately fills the defect after central quadrantectomy, creating a new cutaneous disc on which a new nipple-areola complex is reconstructed with a local flap or micropigmentation; this avoids the need for contralateral symmetrization. Sentinel node biopsy can be performed without added morbidity. When feasible, the Grisotti flap avoids a skin-sparing mastectomy for central breast tumours, which would require an expander, prosthesis or myocutaneous flap, with all the complications of a more complex operation.

Keywords: Grisotti flap, oncoplastic surgery, central tumours, breast

Procedia PDF Downloads 329
6114 A Study on Characteristics of Runoff Analysis Methods at the Time of Rainfall in Rural Area, Okinawa Prefecture Part 2: A Case of Kohatu River in South Central Part of Okinawa Pref

Authors: Kazuki Kohama, Hiroko Ono

Abstract:

According to the Japan Meteorological Agency and the Intergovernmental Panel on Climate Change Fifth Assessment Report, rainfall in Japan is gradually increasing every year, which means that the difference in rainfall between the rainy season and the dry season is growing. In addition, a clear increasing trend in short-duration heavy rain has appeared. In recent years, natural disasters have caused enormous numbers of human casualties in various parts of Japan. Regarding water disasters, local heavy rain and flooding of large rivers occur frequently, and a policy was adopted to promote both structural ('hard') and non-structural ('soft') emergency disaster prevention measures under a vision of rebuilding a society aware of water disaster prevention. Okinawa Prefecture, located in the subtropics, experiences torrential rain and water disasters such as river floods several times a year; these floods are caused by specific rivers among the prefecture's 97 rivers. Rivers in Okinawa are also characterized by limited capacity and narrow width, so heavy rain easily causes flooding. This study focuses on the Kohatu River, one of these specific rivers. In fact, its water level rises above the river levee almost once a year without damaging the surrounding buildings, but the water level has reached the ground-floor height of houses nine times to date. The purpose of this research is to clarify the relationship between precipitation, surface runoff and the total treated water quantity of the Kohatu River. For this purpose, we performed a hydrological analysis, which is complicated and requires detailed data, mainly using Geographic Information System (GIS) software and a runoff analysis system. First, we extracted the watershed and divided it into 23 catchment areas to understand how much surface runoff flows to the runoff point in each 10-minute interval. Second, we created a unit hydrograph indicating surface runoff as a function of flow area and time; this index shows that the maximum surface runoff occurs at 2400 to 3000 seconds. Lastly, we compared the value estimated from the unit hydrograph with the measured value and found that the measured value is usually lower than the estimated one because of evaporation and transpiration. In this study, hydrograph analysis was performed using GIS software and a runoff analysis system; based on these, we could clarify the flood timing and the amount of surface runoff.
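As an illustration of the unit-hydrograph step described above, the following minimal Python sketch convolves a 10-minute effective-rainfall series with a unit hydrograph to obtain an estimated runoff hydrograph; the rainfall values, unit-hydrograph ordinates and catchment area are invented placeholders, not data from the Kohatu River study.

```python
# Minimal sketch (illustrative values only): estimating a runoff hydrograph
# by convolving effective rainfall with a unit hydrograph (10-minute steps).
import numpy as np

dt_s = 600.0                                   # time step: 10 minutes in seconds
area_m2 = 2.5e6                                # hypothetical catchment area [m^2]

# Effective rainfall per step [mm] (placeholder storm).
rain_mm = np.array([0.0, 4.0, 12.0, 8.0, 3.0, 0.0])

# Unit hydrograph ordinates: fraction of 1 mm of effective rain that reaches
# the runoff point in each subsequent 10-minute step (ordinates sum to 1).
uh = np.array([0.05, 0.20, 0.35, 0.25, 0.10, 0.05])

# Discharge contribution of 1 mm of rain over the catchment, spread by the UH.
q_per_mm = uh * (1e-3 * area_m2) / dt_s        # [m^3/s per mm of rain]

# Convolving the rainfall series with the unit hydrograph gives the hydrograph.
q = np.convolve(rain_mm, q_per_mm)             # [m^3/s] at each time step

for i, qi in enumerate(q):
    print(f"t = {i*10:3d} min  Q = {qi:7.2f} m^3/s")
```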

Keywords: disaster prevention, water disaster, river flood, GIS software

Procedia PDF Downloads 131
6113 Stability Analysis of Rabies Model with Vaccination Effect and Culling in Dogs

Authors: Eti Dwi Wiraningsih, Folashade Agusto, Lina Aryati, Syamsuddin Toaha, Suzanne Lenhart, Widodo, Willy Govaerts

Abstract:

This paper considers a deterministic model for the transmission dynamics of the rabies virus in the wild dog-domestic dog-human zoonotic cycle. The effects of vaccination and culling in dogs are incorporated into the model, and the basic reproduction number is derived. We use the next-generation matrix method and the Routh-Hurwitz criterion to analyze the stability of the disease-free equilibrium and the endemic equilibrium of this model.
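To illustrate the next-generation matrix method mentioned above, the sketch below computes the basic reproduction number for a deliberately simplified SEI-type dog-rabies model with vaccination rate v and culling rate c; this toy model, its compartments and its parameters are assumptions for illustration only and are not the model analysed in the paper.

```python
# Minimal sketch of the next-generation matrix method on a simplified
# dog-rabies SEI toy model with vaccination (v) and culling (c).
import sympy as sp

S, E, I = sp.symbols("S E I", nonnegative=True)
beta, sigma, mu, delta, v, c = sp.symbols("beta sigma mu delta v c", positive=True)
Lam = sp.Symbol("Lambda", positive=True)

# Infected compartments: E (exposed), I (infectious).
# New-infection terms and transition terms for dE/dt and dI/dt.
F_vec = sp.Matrix([beta * S * I, 0])
V_vec = sp.Matrix([(sigma + mu + c) * E,
                   -sigma * E + (mu + delta + c) * I])

# Jacobians evaluated at the disease-free equilibrium S0 = Lambda / (mu + v).
S0 = Lam / (mu + v)
F = F_vec.jacobian([E, I]).subs(S, S0)
V = V_vec.jacobian([E, I])

# R0 is the spectral radius of the next-generation matrix F * V^{-1}.
NGM = sp.simplify(F * V.inv())
R0 = sp.simplify([ev for ev in NGM.eigenvals() if ev != 0][0])
print(R0)   # beta*Lambda*sigma / ((mu+v)*(sigma+mu+c)*(mu+delta+c))
```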

Keywords: stability analysis, rabies model, vaccination effect, culling in dogs

Procedia PDF Downloads 622
6112 Influence of Natural Rubber on the Frictional and Mechanical Behavior of the Composite Brake Pad Materials

Authors: H. Yanar, G. Purcek, H. H. Ayar

Abstract:

The ingredients of the composite materials used for the production of composite brake pads play an important role in the safe braking performance of automobiles and trains. Therefore, the ingredients must be selected carefully and used in appropriate ratios in the matrix structure of the brake pad material. In the present study, a non-asbestos organic composite brake pad material containing binder resin, space fillers, solid lubricants and a friction modifier was developed, and its filler content was optimized by adding natural rubber at different rates into the specified matrix structure in order to achieve the best combination of tribological performance and mechanical properties. For this purpose, four compositions with different rubber contents (2.5wt.%, 5.0wt.%, 7.5wt.% and 10wt.%) were prepared, and test samples with a diameter of 20 mm and a length of 15 mm were produced to evaluate the friction and mechanical behavior of the mixtures. The friction and wear tests were performed using a pin-on-disc type test rig designed according to the French standard NF-F-11-292. All test samples were subjected to two different types of friction tests, defined as periodic braking and continuous braking (also known as the fade test). In this way, the coefficient of friction (CoF) of the composite samples with different rubber contents was determined as a function of the number of braking cycles and the disc surface temperature. The results demonstrated that the addition of rubber into the matrix structure of the composite caused a significant change in the CoF. The average CoF of the composite samples increased linearly with increasing rubber content in the matrix: while the average CoF was 0.19 for the rubber-free composite, the sample with the maximum rubber content of 10wt.% had the highest CoF of about 0.24. Although the CoF of the composite samples increased, the specific wear rate decreased with increasing rubber content in the matrix. On the other hand, the CoF decreased with the increasing temperature generated between the sample and the disc, and this drop became more pronounced with increasing rubber content: while the CoF of the rubber-free composite decreased to a minimum of 0.15 at 400 °C, the sample with the maximum rubber content of 10wt.% exhibited the lowest value of 0.09 at the same temperature. The addition of rubber into the matrix structure decreased the hardness and strength of the samples. It was concluded from the results that the composite with 5wt.% rubber was the best composition with respect to the required frictional and mechanical performance, with an average CoF of 0.21, a specific wear rate of 0.024 cm³/MJ and a hardness of 63 HRX.
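For reference, the energy-based specific wear rate quoted in cm³/MJ is commonly defined as the wear volume divided by the friction work dissipated at the contact, where ΔV is the wear volume, μ the coefficient of friction, F_N the normal load and v the sliding speed; this standard definition is assumed here for clarity and is not stated in the abstract.

```latex
% Energy-based specific wear rate (standard definition, assumed for illustration)
k_{s} = \frac{\Delta V}{E_{f}}
\qquad\text{with}\qquad
E_{f} = \int \mu\, F_{N}\, v\,\mathrm{d}t ,
\qquad
[\,k_{s}\,] = \mathrm{cm^{3}/MJ}
```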

Keywords: brake pad composite, friction and wear, rubber, friction materials

Procedia PDF Downloads 132
6111 Combined Surface Tension and Natural Convection of Nanofluids in a Square Open Cavity

Authors: Habibis Saleh, Ishak Hashim

Abstract:

Combined surface tension (Marangoni) and natural convection heat transfer in an open cavity is studied numerically in this article. The cavity is filled with a water-Cu nanofluid. The left wall is kept at a low temperature, the right wall at a high temperature, and the bottom and top boundaries are adiabatic. The top free surface is assumed to be flat and non-deformable. The finite difference method is applied to solve the dimensionless governing equations. It is found that adding the nanoparticles has an insignificant effect at about Ma_bf = 250.
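For context, the Marangoni number characterizing the surface-tension-driven part of the flow is conventionally defined as below, with σ the surface tension, ΔT the imposed temperature difference, L the cavity length, μ the dynamic viscosity and α the thermal diffusivity (the subscript bf presumably denoting base-fluid properties); the exact non-dimensionalisation used by the authors is not given in the abstract, so this standard form is an assumption.

```latex
% Conventional definition of the Marangoni number (assumed form)
Ma = \frac{-\,(\partial\sigma/\partial T)\,\Delta T\, L}{\mu\,\alpha}
```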

Keywords: natural convection, Marangoni convection, nanofluids, square open cavity

Procedia PDF Downloads 543