Search results for: code blue drill
1529 Collapse Capacity Assessment of Inelastic Structures under Seismic Sequences
Authors: Shahrzad Mohammadi, Ghasem Boshrouei Sharq
Abstract:
All seismic design codes are based on the determination of the design earthquake without taking the effects of aftershocks into account in design practice. In regions with a high level of seismicity, the occurrence of several aftershocks of various magnitudes and different time lags is very likely. This research aims to estimate the collapse capacity of a 10-story steel bundled tube moment frame subjected to as-recorded seismic sequences. The studied structure is designed according to the seismic regulations of the fourth revision of the Iranian code of practice for the seismic-resistant design of buildings (Code No. 2800). A series of incremental dynamic analyses (IDA) is performed up to the collapse level of the intact structure. Then, in order to demonstrate the effects of aftershock events on the collapse vulnerability of the building, aftershock IDA analyses are carried out. To gain deeper insight, collapse fragility curves are developed and compared for both series. Also, a study on the influence of various ground motion characteristics on collapse capacity is carried out. The results highlight the importance of considering the decisive effects of aftershocks in seismic codes due to their contribution to the occurrence of collapse.
Keywords: IDA, aftershock, bundled tube frame, fragility assessment, GM characteristics, as-recorded seismic sequences
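As a rough illustration of the fragility-fitting step this abstract describes — not the authors' code, and with invented collapse intensities — a lognormal collapse fragility can be fit to the IDA collapse points of each series and compared:

```python
import numpy as np
from scipy import stats

# Hypothetical spectral accelerations (g) at which each record caused collapse,
# extracted from mainshock-only and mainshock-aftershock IDA series.
sa_collapse_intact = np.array([1.45, 1.62, 1.30, 1.78, 1.51, 1.40, 1.69, 1.55])
sa_collapse_sequence = np.array([1.12, 1.35, 1.05, 1.49, 1.20, 1.08, 1.41, 1.27])

def lognormal_fragility(sa_collapse):
    """Fit a lognormal collapse fragility (median theta, dispersion beta)."""
    theta = np.exp(np.mean(np.log(sa_collapse)))   # median collapse capacity
    beta = np.std(np.log(sa_collapse), ddof=1)     # record-to-record dispersion
    return theta, beta

theta_i, beta_i = lognormal_fragility(sa_collapse_intact)
theta_s, beta_s = lognormal_fragility(sa_collapse_sequence)

# Probability of collapse at a given intensity level for each series
sa = 1.3
p_intact = stats.norm.cdf(np.log(sa / theta_i) / beta_i)
p_sequence = stats.norm.cdf(np.log(sa / theta_s) / beta_s)
print(f"P(collapse | Sa={sa}g): intact={p_intact:.2f}, with aftershocks={p_sequence:.2f}")
```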
Procedia PDF Downloads 142
1528 Testing and Validation of Stochastic Models in Epidemiology
Authors: Snigdha Sahai, Devaki Chikkavenkatappa Yellappa
Abstract:
This study outlines approaches for testing and validating stochastic models used in epidemiology, focusing on the integration and functional testing of simulation code. It details methods for combining simple functions into comprehensive simulations, distinguishing between deterministic and stochastic components, and applying tests to ensure robustness. Techniques include isolating stochastic elements, utilizing large sample sizes for validation, and handling special cases. Practical examples are provided using R code to demonstrate integration testing, handling of incorrect inputs, and special cases. The study emphasizes the importance of both functional and defensive programming to enhance code reliability and user-friendliness.
Keywords: computational epidemiology, epidemiology, public health, infectious disease modeling, statistical analysis, health data analysis, disease transmission dynamics, predictive modeling in health, population health modeling, quantitative public health, random sampling simulations, randomized numerical analysis, simulation-based analysis, variance-based simulations, algorithmic disease simulation, computational public health strategies, epidemiological surveillance, disease pattern analysis, epidemic risk assessment, population-based health strategies, preventive healthcare models, infection dynamics in populations, contagion spread prediction models, survival analysis techniques, epidemiological data mining, host-pathogen interaction models, risk assessment algorithms for disease spread, decision-support systems in epidemiology, macro-level health impact simulations, socioeconomic determinants in disease spread, data-driven decision making in public health, quantitative impact assessment of health policies, biostatistical methods in population health, probability-driven health outcome predictions
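The abstract's examples are in R; as a hedged transliteration of the same testing ideas (isolating the stochastic element with a fixed seed, large-sample validation, and defensive handling of incorrect inputs), here is a minimal Python sketch with an invented simulation function:

```python
import numpy as np
import pytest  # test-runner choice for this sketch; the paper itself uses R

def draw_secondary_infections(r0, n, rng):
    """Stochastic component: number of secondary infections per case."""
    if r0 < 0:
        raise ValueError("r0 must be non-negative")  # defensive programming
    return rng.poisson(r0, size=n)

def test_stochastic_element_is_isolated():
    # Fixing the seed isolates the stochastic element: same seed, same output.
    a = draw_secondary_infections(2.5, 10, np.random.default_rng(42))
    b = draw_secondary_infections(2.5, 10, np.random.default_rng(42))
    assert (a == b).all()

def test_large_sample_mean_matches_theory():
    # Validate the stochastic element with a large sample: mean -> r0.
    draws = draw_secondary_infections(2.5, 1_000_000, np.random.default_rng(0))
    assert abs(draws.mean() - 2.5) < 0.01

def test_incorrect_input_is_rejected():
    # Functional test of the defensive path for an incorrect input.
    with pytest.raises(ValueError):
        draw_secondary_infections(-1.0, 10, np.random.default_rng(0))
```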
Procedia PDF Downloads 8
1527 Ethical Decision-Making by Healthcare Professionals during Disasters: Izmir Province Case
Authors: Gulhan Sen
Abstract:
Disasters could result in many deaths and injuries. In these difficult times, accessible resources are limited, the demand-supply balance is distorted, and there is a need to make urgent interventions. The disproportion between accessible resources and intervention capacity makes triage a necessity in every stage of disaster response. Healthcare professionals, who are in charge of triage, have to evaluate swiftly and make ethical decisions about which patients need priority and urgent intervention given the limited available resources. For such critical times in disaster triage, 'doing the greatest good for the greatest number of casualties' is adopted as a code of practice. But there is no guide for healthcare professionals about ethical decision-making during disasters, and this study is expected to be used as a source in the preparation of such a guide. This study aimed to examine whether the qualities of healthcare professionals in Izmir related to disaster triage were adequate and whether these qualities influence their capacity to make ethical decisions. The researcher used a survey developed for data collection. The survey included two parts. In part one, 14 questions solicited information about socio-demographic characteristics and knowledge levels of the respondents on the ethical principles of disaster triage and the allocation of scarce resources. Part two included four disaster scenarios adopted from the existing literature, and respondents were asked to make ethical decisions in triage based on the provided scenarios. The survey was completed by 215 healthcare professionals working in Emergency-Medical Stations, National Medical Rescue Teams and Search-Rescue-Health Teams in Izmir. The data were analyzed with SPSS software. The Chi-Square Test, Mann-Whitney U Test, Kruskal-Wallis Test and Linear Regression Analysis were utilized. According to the results, 51.2% of the participants had an inadequate knowledge level of the ethical principles of disaster triage and the allocation of scarce resources. It was also found that participants did not tend to make ethical decisions on the four disaster scenarios which included ethical dilemmas. They remained caught in ethical dilemmas involving performing cardio-pulmonary resuscitation, managing limited resources and making end-of-life decisions. Results also showed that participants who had more experience in disaster triage teams were more likely to make ethical decisions on disaster triage than those with little or no experience in disaster triage teams (p < 0.01). Moreover, as their knowledge level of the ethical principles of disaster triage and the allocation of scarce resources increased, their tendency to make ethical decisions also increased (p < 0.001). In conclusion, an inadequate knowledge level of ethical principles and a lack of experience affect ethical decision-making during disasters. The results of this study therefore suggest that more training on disaster triage should be provided in the pre-impact phase of disaster. In addition, the ethical dimension of disaster triage should be included in the syllabi of the ethics classes in the vocational training of healthcare professionals. Drills, simulations, and tabletop exercises can be used to improve the ethical decision-making abilities of healthcare professionals. Disaster scenarios in which ethical dilemmas are faced should be prepared for such applied training programs.
Keywords: disaster triage, medical ethics, ethical principles of disaster triage, ethical decision-making
Procedia PDF Downloads 245
1526 The Social Aspects of Code-Switching in Online Interaction: The Case of Saudi Bilinguals
Authors: Shirin Alabdulqader
Abstract:
This research aims to investigate the concept of code-switching (CS) between English and Arabic and the CS practices of Saudi online users through a Translanguaging (TL) lens, for a more inclusive view of the nature of the data from the study. It employs Digitally Mediated Communication (DMC), specifically the WhatsApp and Twitter platforms, in order to understand how users employ online resources to communicate with others on a daily basis. This project looks beyond language and considers the multimodal affordances (visual and audio means) that interlocutors utilise in their online communicative practices to shape their online social existence. This exploratory study is based on a data-driven interpretivist epistemology, as it aims to understand how meaning (reality) is created by individuals within different contexts. The project used a mixed-method approach, combining a qualitative and a quantitative approach. In the former, data were collected from online chats and interview responses, while in the latter a questionnaire was employed to understand the frequency of, and relations between, the participants' linguistic and non-linguistic practices and their social behaviours. The participants were eight bilingual Saudi nationals (both men and women, aged between 20 and 50 years old) who interacted with others online. These participants provided their online interactions, participated in an interview and responded to a questionnaire. The study data were gathered from 194 WhatsApp chats and 122 Tweets. These data were analysed and interpreted on three levels: conversational turn taking and CS; the linguistic description of the data; and CS and persona. This project contributes to the emerging field of analysing online Arabic data systematically, and to the fields of multimodality and bilingual sociolinguistics. The findings are reported for each of the three levels. For conversational turn taking, the CS analysis revealed that CS was used to accomplish negotiation and develop meaning in the conversation. With regard to the linguistic practices in the CS data, the majority of the code-switched words were content morphemes. The third level of data interpretation is CS and its relationship with identity; two types of identity were indexed: absolute identity and contextual identity. This study contributes to the DMC literature and bridges some of the existing gaps. The findings support the notion of TL that multiliteracy is one's ability to decode multimodal communication and that this multimodality contributes to meaning, whether the online affordances in question are used by monolinguals or multilinguals and whether they are perceived by specific generations or by any online multiliterates. The study also provides the linguistic features of CS utilised by Saudi bilinguals and determines the relationship between these features and the contexts in which they appear.
Keywords: social media, code-switching, translanguaging, online interaction, Saudi bilinguals
Procedia PDF Downloads 131
1525 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks
Authors: Naveed Ghani, Samreen Javed
Abstract:
In today’s heterogeneous network environment, there is a growing demand for distrusting clients to jointly execute secure networks to prevent malicious attacks, as the defining task of propagating malicious code is to locate new targets to attack. Residual risk is always there, no matter what solutions are implemented or whatsoever security methodology or standards are adopted. Security is a first and crucial concern in the field of Computer Science, and the main aim of computer security is the gathering of information within a secure network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, antimalware programs, security patches, log files, honeypots, and other measures used in banks for financial data protection. However, there is also a need to implement IPv6 tunneling with cryptographic data transformation, according to the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to the target. In this paper, the authors put forward the idea of implementing IPv6 tunneling sessions for the transmission of private data from financial organizations whose secrecy needs to be safeguarded.
Keywords: network worms, malware infection propagating malicious code, virus, security, VPN
Procedia PDF Downloads 358
1524 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches
Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez
Abstract:
Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. The simulation methods are computing-intensive and time-consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics could be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) could be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, fitting of the resulting moments of the LSF to a probability density function (PDF) is needed. In the present study, a very simple alternative, which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations, is employed. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of approximating the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first order reliability method, FORM). The results in the present study are in good agreement with those computed with the MCS. Therefore, the alternative of mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or if numerical schemes, the FEM or computational mechanics are employed.
Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, Monte Carlo simulation
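A minimal sketch of the mixed PEM + FORM idea on a toy explicit limit state function (the paper's LSF is implicit in a numerical model; all numbers here are illustrative):

```python
import numpy as np
from itertools import product
from scipy import stats

# Toy limit state function standing in for a numerical scheme / FEM model:
# g > 0 means safe (resistance R exceeds load effect S).
def g(R, S):
    return R - S

# Probabilistic moments of the random variables (only moments are needed for PEM).
means, stds = np.array([100.0, 60.0]), np.array([10.0, 12.0])

# Rosenblueth's two-point estimate: evaluate g at all 2^n sigma-point combinations.
points = np.array(list(product([-1.0, 1.0], repeat=2)))       # 4 combinations
vals = np.array([g(*(means + p * stds)) for p in points])
mu_g, sigma_g = vals.mean(), vals.std()                       # moments of the LSF

# Mix with FORM ideas: fit the moments to a well-known distribution (normal here)
# and read off the reliability index and failure probability.
beta = mu_g / sigma_g
pf_pem = stats.norm.cdf(-beta)

# Crude MCS check (possible here only because this toy g is explicit and cheap).
rng = np.random.default_rng(1)
mcs = g(rng.normal(means[0], stds[0], 10**6), rng.normal(means[1], stds[1], 10**6))
print(f"beta={beta:.2f}, Pf(PEM+fit)={pf_pem:.2e}, Pf(MCS)={(mcs < 0).mean():.2e}")
```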
Procedia PDF Downloads 346
1523 Effect of Graphene on the Structural and Optical Properties of Ceria:Graphene Nanocomposites
Authors: R. Udayabhaskar, R. V. Mangalaraja, V. T. Perarasu, Saeed Farhang Sahlevani, B. Karthikeyan, David Contreras
Abstract:
Bandgap engineering of CeO₂ nanocrystals is of high interest to many research groups seeking to meet the requirements of desired applications. The band gap of CeO₂ nanostructures can be modified by varying the particle size, morphology and dopants. Anchoring metal oxide nanostructures on graphene sheets results in composites with properties improved over those of the parent materials. The presence of graphene sheets acts as a support for the growth, influences the morphology and provides external paths for electronic transitions. Thus, the controllable synthesis of ceria:graphene composites with various morphologies and the understanding of their optical properties are highly important for the usage of these materials in various applications. The development of ceria and ceria:graphene composites combining low cost, rapid synthesis and tunable optical properties is still desirable. In this work, we discuss the synthesis of pure ceria (nanospheres) and ceria:graphene composites (nano-rice-like morphology) using a commercial microwave oven as a cost-effective and environmentally friendly approach. The influence of the graphene on the crystallinity, morphology, band gap and luminescence of the synthesized samples was analyzed. The average crystallite size of the CeO₂ nanostructures, obtained by using the Scherrer formula, showed a decreasing trend with increasing graphene loading. The composite with the higher graphene loading clearly depicted a nano-rice-like morphology with a diameter below 10 nm and a length over 50 nm. The presence of graphene- and ceria-related vibrational modes (100-4000 cm⁻¹) confirmed the successful formation of the composites. We observed an increase in band gap (blue shift) with increasing loading amount of graphene. Further, the luminescence related to various F-centers was quenched in the composites. The authors gratefully acknowledge the FONDECYT Project No. 3160142 and BECA Conicyt National Doctorado 2017 No. 21170851, Government of Chile, Santiago, for the financial assistance.
Keywords: ceria, graphene, luminescence, blue shift, band gap widening
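For reference, the Scherrer estimate mentioned here is straightforward to compute from an XRD peak; the sketch below assumes Cu K-alpha radiation and invented peak parameters, not the paper's measurements:

```python
import numpy as np

def scherrer_size(beta_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Average crystallite size D = K * lambda / (beta * cos(theta)).

    beta_deg: FWHM of the diffraction peak in degrees (instrument-corrected);
    two_theta_deg: peak position in 2-theta; wavelength defaults to Cu K-alpha.
    """
    beta = np.radians(beta_deg)            # peak broadening in radians
    theta = np.radians(two_theta_deg / 2)  # Bragg angle
    return K * wavelength_nm / (beta * np.cos(theta))

# Hypothetical CeO2 (111) peak: a broader peak gives a smaller crystallite,
# consistent with the decreasing size trend at higher graphene loading.
print(f"D = {scherrer_size(beta_deg=1.2, two_theta_deg=28.5):.1f} nm")
```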
Procedia PDF Downloads 192
1522 Micro-Filtration with an Inorganic Membrane
Authors: Benyamina, Ouldabess, Bensalah
Abstract:
The aim of this study is to use a membrane technique for the filtration of a coloring solution. The preparation of the micro-filtration membranes is based on a low-cost natural clay powder deposited on macro-porous ceramic supports. The micro-filtration membrane provided a very large permeation flux. Indeed, the filtration effectiveness of the membrane was proved by the total discoloration of a bromothymol blue solution with an initial concentration of 10⁻³ mg/L within the first minutes.
Keywords: inorganic membrane, micro-filtration, coloring solution, natural clay powder
Procedia PDF Downloads 513
1521 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement
Authors: Hadi Ardiny, Amir Mohammad Beigzadeh
Abstract:
Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, with one of them equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY-plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding counts of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove to be valuable in designing different scenarios and intelligent systems before initiating any experiments.
Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems
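One plausible reading of the correlation step — not necessarily the authors' algorithm — is to score each tracked robot by how well an inverse-square count-rate model along its trajectory correlates with the measured gamma counts; a synthetic-data sketch:

```python
import numpy as np

# Hypothetical tracked XY positions (m) of three robots over time (from the
# computer-vision step) and the synchronized NaI gamma count series.
rng = np.random.default_rng(7)
t = 200
paths = {f"robot{i}": rng.uniform(0.5, 5.0, size=(t, 2)) for i in range(3)}
detector_xy = np.array([0.0, 0.0])

# For the synthetic counts only: assume the source rides on robot1.
r_true = np.linalg.norm(paths["robot1"] - detector_xy, axis=1)
counts = rng.poisson(5000.0 / r_true**2)  # inverse-square count-rate model

def score(path):
    """Correlate measured counts with the 1/r^2 rate this robot would imply."""
    r = np.linalg.norm(path - detector_xy, axis=1)
    return np.corrcoef(1.0 / r**2, counts)[0, 1]

scores = {name: score(p) for name, p in paths.items()}
print(max(scores, key=scores.get), scores)  # highest score -> contaminated robot
```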
Procedia PDF Downloads 124
1520 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET
Authors: Tyler T. Procko, Steve Collins
Abstract:
New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access comparatively involve unnecessary steps which compromise system performance. This work posits that the established ORM (Object-Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core, combined with a framework of modern SQL Server coding conventions, have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers and cybersecurity professionals: simplicity, speed and security. Simplicity is engendered by cutting out the “middleman” steps, effectively making API data access a whitebox, whereas traditional methods are blackbox. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code. Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.
Keywords: API data access, database, JSON, .NET Core, SQL Server
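CODA itself is a C#/.NET library; purely to illustrate the underlying idea — letting SQL Server (2016+) emit and consume JSON via FOR JSON/OPENJSON so that no ORM mapping layer is needed — here is a hedged Python/pyodbc sketch with hypothetical table, column and DSN names:

```python
import json
import pyodbc  # assumes an ODBC driver for SQL Server is installed

conn = pyodbc.connect("DSN=MySqlServer;Trusted_Connection=yes")  # hypothetical DSN
cur = conn.cursor()

# Read path: SQL Server serializes rows straight to JSON with FOR JSON, so the
# client just deserializes text -- no per-column, per-type mapping code.
cur.execute("SELECT CustomerId, Name, Email FROM dbo.Customer FOR JSON PATH")
customers = json.loads("".join(row[0] for row in cur.fetchall()))

# Write path: the client sends a JSON document and OPENJSON shreds it server-side,
# again avoiding an object-relational mapping layer in application code.
payload = json.dumps({"CustomerId": 42, "Name": "Ada", "Email": "ada@example.com"})
cur.execute(
    """
    INSERT INTO dbo.Customer (CustomerId, Name, Email)
    SELECT CustomerId, Name, Email
    FROM OPENJSON(?) WITH (CustomerId int, Name nvarchar(100), Email nvarchar(100))
    """,
    payload,
)
conn.commit()
```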
Procedia PDF Downloads 66
1519 Learning Mandarin Chinese as a Foreign Language in a Bilingual Context: Adult Learners’ Perceptions of the Use of L1 Maltese and L2 English in Mandarin Chinese Lessons in Malta
Authors: Christiana Gauci-Sciberras
Abstract:
The first language (L1) could be used in foreign language teaching and learning as a pedagogical tool to scaffold new knowledge in the target language (TL) upon linguistic knowledge that the learner already has. In a bilingual context, code-switching between the two languages usually occurs in classrooms. One of the reasons for code-switching is that both languages are used for scaffolding new knowledge. This research paper aims to find out why both the L1 (Maltese) and the L2 (English) are used in the classroom of Mandarin Chinese as a foreign language (CFL) in the bilingual context of Malta. It also aims to find out the learners’ perceptions of the use of a bilingual medium of instruction. Two research methods were used to collect qualitative data: semi-structured interviews with adult learners of Mandarin Chinese and lesson observations. These two research methods were used so that the data collected in the interviews could be triangulated with the data collected in the lesson observations. The L1 (Maltese) was the language of instruction mostly used. The teacher and the learners switched to the L2 (English) or to any other foreign language according to the need at a particular instance during the lesson.
Keywords: Chinese, bilingual, pedagogical purpose of L1 and L2, CFL acquisition
Procedia PDF Downloads 204
1518 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU
Authors: Ali Abdul Kadhim, Fue Lien
Abstract:
Solid particle distribution on an impingement surface has been simulated utilizing a graphical processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) model. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, considering all the external forces. The previous models distributed particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes the deficiencies of the previous LBM-CA models and can therefore better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the cuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate the particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4 and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, another in-house serial CPU code was also developed, coupling the LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model
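A minimal 2D analogue of the improved CA particle step (the paper works in 3D on a D3Q27 lattice; the velocity-biased direction weighting below is an illustrative choice, not the authors' exact rule):

```python
import numpy as np

rng = np.random.default_rng(0)

# D2Q9-style neighborhood: rest direction plus 8 neighbors. Each node
# redistributes its particle count to neighbors with probabilities built from
# the local fluid velocity, instead of ignoring velocity as older CA models did.
e = np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1],
              [1, 1], [-1, -1], [1, -1], [-1, 1]])

def redistribute(n_particles, u, drift=2.0):
    """Move n_particles from one node along directions weighted by velocity u."""
    w = 1.0 + drift * (e @ u)          # bias the 9 directions toward the flow
    w = np.clip(w, 0.0, None)          # no negative probabilities
    p = w / w.sum()
    return rng.multinomial(n_particles, p)  # stochastic CA update

moved = redistribute(1000, u=np.array([0.08, 0.02]))
for direction, m in zip(e, moved):
    print(f"direction {direction}: {m} particles")
```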
Procedia PDF Downloads 207
1517 Two-Protein Modified Gold Nanoparticles for Serological Diagnosis of Borreliosis
Authors: Mohammed Alasel, Michael Keusgen
Abstract:
Gold is a noble metal; at the nanoscale (e.g., in spherical nanoparticles), its conduction electrons are triggered to oscillate collectively with a resonant frequency when certain wavelengths of electromagnetic radiation interact with its surface; this phenomenon is known as surface plasmon resonance (SPR). SPR is responsible for giving gold nanoparticles their intense red color, which depends mainly on their size, shape and the distance between nanoparticles. A decreased distance between gold nanoparticles results in their aggregation, causing a change in color from red to blue. This aggregation enables gold nanoparticles to serve as a sensitive biosensoric indicator. In the proposed work, gold nanoparticles were modified with two proteins: i) a Borrelia antigen, the variable lipoprotein surface-exposed protein (VlsE), and ii) protein A. The VlsE antigen induces a strong antibody response against Lyme disease and can be detected from the early to the late phase of the disease in humans infected with Borrelia. In addition, it shows low cross-reaction with the other, non-pathogenic Borrelia strains. The high specificity of the VlsE antigen for anti-Borrelia antibodies, combined simultaneously with the high specificity of protein A for the Fc region of all human IgG antibodies, was utilized to develop a rapid test for serological point-of-care diagnosis of borreliosis in human serum. Only in the presence of anti-Borrelia antibodies in the serum sample does an aggregation of the gold nanoparticles occur, which is visible as a concentration-dependent colour shift from red (low IgG) to blue (high IgG). Experiments showed that it is clearly possible to distinguish between positive and negative serum samples using a simple suspension of the two-protein modified gold nanoparticles in a very short time (30 minutes). The proposed work showed the potential of using such modified gold nanoparticles generally for serological diagnosis. Improved specificity and reduced assay time can be achieved by applying increased salt concentrations combined with decreased pH values (pH 5).
Keywords: gold nanoparticles, gold aggregation, serological diagnosis, protein A, Lyme borreliosis
Procedia PDF Downloads 398
1516 Quality Assessment of the Essential Oil from Eucalyptus globulus Labill of Blida (Algeria) Origin
Authors: M. A. Ferhat, M. N. Boukhatem, F. Chemat
Abstract:
Eucalyptus essential oil is extracted from Eucalyptus globulus of the Myrtaceae family, also known as Tasmanian blue gum or blue gum. Despite the reputation earned by the aromatic and medicinal plants of Algeria, the chemical constituents of Eucalyptus globulus essential oil (EGEO) of Blida origin have not previously been investigated. The objectives of this study were: (i) the extraction of the essential oil from the leaves of Eucalyptus globulus Labill., Myrtaceae, grown in Algeria, and the quantification of its yield; (ii) the identification and quantification of the compounds in the essential oil obtained; and (iii) the determination of the physical and chemical properties of the EGEO. Thus, the present study was conducted for the determination of the chemical constituents and different physico-chemical properties of the EGEO. The chemical composition of the EGEO, grown in Algeria, was analysed by Gas Chromatography-Mass Spectrometry. The chemical components were identified on the basis of retention time and by comparison with a mass spectral database of standard compounds. Relative amounts of detected compounds were calculated on the basis of GC peak areas. Fresh leaves of E. globulus on steam distillation yielded 0.96% (v/w) of essential oil, and the analysis resulted in the identification of a total of 11 constituents, with 1,8-cineole (85.8%), α-pinene (7.2%) and β-myrcene (1.5%) being the main components. Other notable compounds identified in the oil were β-pinene, limonene, α-phellandrene, γ-terpinene, linalool, pinocarveol, terpinen-4-ol, and α-terpineol. Physical properties such as specific gravity, refractive index and optical rotation, and chemical properties such as saponification value, acid number and iodine number of the EGEO were examined. The oil extracted was determined to have a refractive index of 1.4602-1.4623, a specific gravity of 0.918-0.919 and an optical rotation of +9 to +10, satisfying the standards stipulated by the European Pharmacopoeia. All the physical and chemical parameters were in the range indicated by the ISO standards. Our findings will help to assess the quality of Eucalyptus oil, which is important in the production of high-value essential oils that will help to improve the economic condition of the community as well as the nation.
Keywords: chemical composition, essential oil, eucalyptol, gas chromatography
Procedia PDF Downloads 328
1515 New Coating Materials Based on Mixtures of Shellac and Pectin for Pharmaceutical Products
Authors: M. Kumpugdee-Vollrath, M. Tabatabaeifar, M. Helmis
Abstract:
Shellac is a natural polyester resin secreted by insects. Pectins are natural, non-toxic and water-soluble polysaccharides extracted from the peels of citrus fruits or the leftovers of apples. Both polymers are permitted for use in the pharmaceutical industry and as food additives. SSB Aquagold® is an aqueous solution of shellac and can be used in a coating process as an enteric or controlled drug release polymer. In this study, tablets containing 10 mg of methylene blue as a model drug were prepared with a rotary press. The tablets were coated with mixtures of shellac and one of several pectin types (i.e., CU 201, CU 501, CU 701 and CU 020), mostly in a 2:1 ratio, or with pure shellac, in a small-scale fluidized bed apparatus. A stable, simple and reproducible three-stage coating process was successfully developed. The drug contents of the coated tablets were determined using a UV-VIS spectrophotometer. The characterization of the surface and the film thickness was performed with scanning electron microscopy (SEM) and light microscopy. Release studies were performed in a dissolution apparatus with a basket. Most of the formulations were enteric coated. The dissolution profiles showed a delayed or sustained release with a lag time of at least 4 h. Dissolution profiles of tablets coated with pure shellac had a very long lag time, ranging from 13 to 17.5 h, and their slopes were quite high. The duration of the lag time and the slope of the dissolution profiles could be adjusted by adding the proper type of pectin to the shellac formulation and by varying the coating amount. In order to apply a coating formulation as a colon delivery system, the prepared film should be resistant to gastric fluid for at least 2 h and to intestinal fluid for 4-6 h. The required delay time was achieved with most of the shellac-pectin polymer mixtures. The release profiles were fitted with the modified form of the Korsmeyer-Peppas equation and with the Hixson-Crowell model. A correlation coefficient (R²) > 0.99 was obtained with the Korsmeyer-Peppas equation.
Keywords: shellac, pectin, coating, fluidized bed, release, colon delivery system, kinetic, SEM, methylene blue
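A sketch of the kinetic fitting step, assuming one common modified form of the Korsmeyer-Peppas equation with an explicit lag time (the paper does not state which modification it used) and invented release data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative-release data after the lag time (fraction of dose).
t = np.array([5.0, 6.0, 7.0, 8.0, 10.0, 12.0])           # hours
released = np.array([0.08, 0.18, 0.30, 0.43, 0.68, 0.88])

def korsmeyer_peppas(t, k, n, t_lag):
    """Mt/Minf = k * (t - t_lag)^n, a modified form with an explicit lag time."""
    return k * np.clip(t - t_lag, 0.0, None) ** n

popt, _ = curve_fit(korsmeyer_peppas, t, released, p0=[0.1, 1.0, 4.0])
pred = korsmeyer_peppas(t, *popt)
r2 = 1.0 - np.sum((released - pred) ** 2) / np.sum((released - released.mean()) ** 2)
print(f"k={popt[0]:.3f}, n={popt[1]:.2f}, t_lag={popt[2]:.1f} h, R^2={r2:.4f}")
```

The release exponent n recovered from such a fit is what distinguishes Fickian diffusion from anomalous or erosion-controlled transport in this family of models.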
Procedia PDF Downloads 407
1514 Inviscid Steady Flow Simulation Around a Wing Configuration Using MB_CNS
Authors: Muhammad Umar Kiani, Muhammad Shahbaz, Hassan Akbar
Abstract:
Simulation of a high-speed inviscid steady ideal air flow around a 2D/axisymmetric body was carried out using the mb_cns code. mb_cns is a program for the time-integration of the Navier-Stokes equations for two-dimensional compressible flows on a multiple-block structured mesh. The flow geometry may be either planar or axisymmetric, and multiply-connected domains can be modeled by patching together several blocks. The main simulation code is accompanied by a set of pre- and post-processing programs. The pre-processing programs scriptit and mb_prep start with a short script describing the geometry, initial flow state and boundary conditions, and produce a discretized version of the initial flow state. The main flow simulation program (or solver, as it is sometimes called) is mb_cns. It takes the files prepared by scriptit and mb_prep, integrates the discrete form of the gas flow equations in time and writes the evolved flow data to a set of output files. This output data may consist of the flow state (over the whole domain) at a number of instants in time. After integration in time, the post-processing programs mb_post and mb_cont can be used to reformat the flow state data and produce GIF or PostScript plots of flow quantities such as pressure, temperature and Mach number. The current problem is an example of supersonic inviscid flow. The flow domain for the current problem (a strake-configuration wing) is discretized by a structured grid, and a finite-volume approach is used to discretize the conservation equations. The flow field is recorded as cell-average values at cell centers, and explicit time stepping is used to update conserved quantities. MUSCL-type interpolation and one of three flux calculation methods (Riemann solver, AUSMDV flux splitting and the Equilibrium Flux Method, EFM) are used to calculate inviscid fluxes across cell faces.
Keywords: steady flow simulation, processing programs, simulation code, inviscid flux
Procedia PDF Downloads 429
1513 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System
Authors: Dong Seop Lee, Byung Sik Kim
Abstract:
In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields, and these artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management, and it draws on historical disaster information using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle from occurrence through progress, response, and planning. However, information about status control, response, and recovery from natural and social disaster events is mainly managed in structured and unstructured reports, which exist as handouts or hard copies. Such unstructured data is often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data as disaster information. In this paper, an Optical Character Recognition approach is used to convert handouts, hard copies, images and reports, whether printed or generated by scanners, into electronic documents. The converted disaster data is then organized into the disaster code system as disaster information, and the data are stored in the disaster database system. Gathering and creating disaster information from unstructured data based on Optical Character Recognition is an important element of smart disaster management. In this paper, the character recognition rate for Korean characters was improved to over 90% by using upgraded OCR. In character recognition, the recognition rate depends on the fonts, size, and special symbols of the characters; we improved it through a machine learning algorithm. The converted structured data is managed in a standardized disaster information form connected with the disaster code system. The disaster code system ensures that the structured information is stored and retrieved across the entire disaster cycle, covering historical disaster progress, damages, response, and recovery. The expected effect of this research is that it can be applied to smart disaster management and decision-making by combining artificial intelligence technologies with historical big data.
Keywords: disaster information management, unstructured data, optical character recognition, machine learning
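A hedged sketch of the OCR-to-structured-record pipeline described here, using Tesseract via pytesseract (which supports Korean); the field extraction rules and the disaster code table below are hypothetical stand-ins:

```python
import re
from datetime import date

import pytesseract           # Tesseract OCR bindings; 'kor' enables Korean
from PIL import Image

# Step 1: OCR a scanned hard-copy report into raw, unstructured text.
raw = pytesseract.image_to_string(Image.open("report_scan.png"), lang="kor+eng")

# Step 2: pull structured fields out of the text and attach a disaster code.
DISASTER_CODES = {"flood": "N01", "typhoon": "N02", "landslide": "N03"}  # invented

def to_record(text):
    kind = next((k for k in DISASTER_CODES if k in text.lower()), "unknown")
    m = re.search(r"(\d{4})-(\d{2})-(\d{2})", text)       # ISO-style event date
    return {
        "disaster_code": DISASTER_CODES.get(kind, "N99"),
        "event_date": date(*map(int, m.groups())) if m else None,
        "raw_text": text,   # keep the original text for audit and retrieval
    }

record = to_record(raw)
print(record["disaster_code"], record["event_date"])
# Step 3 (not shown): insert `record` into the disaster database keyed by code.
```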
Procedia PDF Downloads 129
1512 Review and Comparison of Iran's Sixteenth Topic of the Building Code with the Ranking Systems of the Water Sector to Improve the Criteria of the Sixteenth Topic
Authors: O. Fatemi
Abstract:
Considering the growing building construction industry in developing countries, the concept of sustainable development, and the importance of taking care of future generations, codifying building scoring systems based on environmental criteria has always been a subject for discussion. The existing systems cannot be used for all regions due to several reasons, including but not limited to variety in regional variables. In this article, the most important common environmental scoring systems, LEED (Leadership in Energy and Environmental Design) and BREEAM (Building Research Establishment Environmental Assessment Method), used in the USA and UK respectively, are discussed and compared, with a special focus on CASBEE (Comprehensive Assessment System for Built Environment Efficiency) used in Japan, regarding the credit-assigning field (weighting and scoring systems) as well as the sustainable development criteria in each system. Then, converging and distinct fields of the foregoing systems are examined in light of the National Iranian Building Code. Furthermore, the common credits in the said systems not mentioned in the National Iranian Building Code have been identified. These credits, which are generally included in well-known fundamental principles of sustainable development, may be considered as proposed options for the Iranian building environmental scoring system. It is suggested that one of the globally and commonly accepted systems be chosen, considering national priorities, in order to offer an effective method for building environmental scoring; then a part of the credits is added and/or removed, or a certain credit score is changed, and eventually a new scoring system with a new title is developed for the country. Evidently, the building construction industry highly affects the environment, economy, efficiency, and health of the relevant occupants.
Keywords: scoring system, sustainability assessment, water efficiency, national Iranian building code
Procedia PDF Downloads 181
1511 Family Homicide: A Comparison of Rural and Urban Communities in California
Authors: Bohsiu Wu
Abstract:
This study compares the differences in social dynamics between rural and urban areas in California to explain homicides involving family members. It is hypothesized that rural homicides are better explained by social isolation and a lack of intervention resources, whereas urban homicides are attributed to social disadvantage factors. Several critical social dynamics, including social isolation, social disadvantages, acculturation, and intervention resources, were entered into a hierarchical linear model (HLM) to examine whether county-level factors affect how each specific dynamic performs at the ZIP-code level, a proxy measure for communities. Homicide data are from the Supplementary Homicide Report for all 58 counties in California from 1997 to 1999. Predictors at both the county and ZIP-code levels are derived from the 2000 US census. Preliminary results from the HLM analysis show that social isolation is a significant but moderate predictor of rural family homicide, and various social disadvantage factors are significant factors accounting for urban family homicide. Acculturation has little impact. Rurality and urbanity appear to interact with various social dynamics in explaining family homicide. The implications for prevention at both the county and community levels, as well as directions for future study of the differences between rural and urban locales, are explored in the paper.
Keywords: communities, family, HLM, homicide, rural, urban
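The HLM setup described here — ZIP-code observations nested in counties, with a rural/urban contrast — can be sketched as a random-intercept mixed model; the file and variable names below are illustrative, not the authors' own:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical ZIP-code-level data nested in counties, mirroring the design:
# columns: county, family_homicide_rate, social_isolation, disadvantage,
#          acculturation, intervention_resources, rural (0/1)
df = pd.read_csv("zip_level_homicide.csv")

# A random intercept for county lets county-level context shift each ZIP code's
# baseline; the isolation-by-rural interaction tests whether social isolation
# matters more in rural communities, as hypothesized.
model = smf.mixedlm(
    "family_homicide_rate ~ social_isolation * rural + disadvantage "
    "+ acculturation + intervention_resources",
    data=df,
    groups=df["county"],
)
result = model.fit()
print(result.summary())
```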
Procedia PDF Downloads 326
1510 A Case Study on the Collapse Assessment of the Steel Moment-Frame Setback High-Rise Tower
Authors: Marzie Shahini, Rasoul Mirghaderi
Abstract:
This paper describes the collapse assessment of a steel moment-frame high-rise tower with setback irregularity, designed per the 2010 ASCE 7 code, under spectral-matched ground motion records. To estimate the safety margin against life-threatening collapse, an analytical model of the tower is subjected to a suite of ground motions with incremental intensities, from the maximum considered earthquake hazard level to the incipient collapse level. The capability of the structural system to prevent collapse is evaluated based on a methodology similar to that reported in FEMA P695. Structural performance parameters in terms of maximum/mean inter-story drift ratios, residual drift ratios, and maximum plastic hinge rotations are also compared to the acceptance criteria recommended by the TBI Guidelines. The results demonstrate that the structural system satisfactorily safeguards the building against collapse. Moreover, for this tower, the code-specified requirements in ASCE 7-10 are reasonably adequate to satisfy the seismic performance criteria developed in the TBI Guidelines for the maximum considered earthquake hazard level.
Keywords: high-rise buildings, setback, residual drift, seismic performance
Procedia PDF Downloads 260
1509 Green and Facile Fabrication and Characterization of Fe/ZnO Hollow Spheres and Photodegradation of Azo Dyes
Authors: Seyed Mohsen Mousavi, Ali Reza Mahjoub, Bahjat Afshari Razani
Abstract:
In this work, Fe/ZnO hollow spherical structures with high surface area were prepared by the hydrothermal method with a glucose template, using an ultrasonic bath at room temperature, and were characterized by FT-IR, XRD, FE-SEM and BET. The photocatalytic activity of the synthesized spherical Fe/ZnO hollow spheres was studied in the degradation of Congo Red and Methylene Blue as azo dyes. The results showed that the photocatalytic activity of the Fe/ZnO hollow spherical structures is improved compared with the ZnO hollow sphere and other morphologies.
Keywords: azo dyes, Fe/ZnO hollow sphere, hollow sphere nanostructures, photocatalyst
Procedia PDF Downloads 370
1508 A Study of Resin-Dye Fixation on Dyeing Properties of Cotton Fabrics Using Melamine Based Resins and a Reactive Dye
Authors: Nurudeen Ayeni, Kasali Bello, Ovi Abayeh
Abstract:
A study of the effect of dye-resin complexation on the degree of dye absorption was carried out using Procion Blue MX-R to dye cotton fabric in the presence of hexamethylol melamine (MR 6) and its phosphate derivative (MPR 4) for resination. The highest degree of dye exhaustion was obtained at 40 °C for 1 hour, with the resinated fabric showing more affinity for the dye than the ordinary fiber. Improved fastness properties were recorded, showing the relatively higher stability of the dye-resin-cellulose network formed.
Keywords: cotton fabric, reactive dye, dyeing, resination
Procedia PDF Downloads 408
1507 Method for Identification of Through Defects of Polymer Films Applied onto Metal Parts
Authors: Yu A. Pluttsova, O. V. Vakhnina, K. B. Zhogova
Abstract:
Nowadays, many devices operate under conditions of enhanced humidity, temperature drops, fog, and vibration. To ensure long-term and uninterrupted equipment operation under adverse conditions, moisture-proof films are applied to products and electronic components, which helps to prevent corrosion and short circuits, allowing a significant increase in device lifecycle. The reliability of such moisture-proof films is mainly determined by their coating uniformity, without gaps and cracks. Unprotected product edges, as well as pores in the films, can cause device failure during operation. The objective of this work was to develop an effective, affordable, and economically justified method for determining the presence of through defects in protective polymer films on the surface of parts made of iron and its alloys. As a diagnostic reagent, an aqueous solution of potassium ferricyanide(III) in hydrochloric acid is proposed, which changes color from yellow to blue according to the reactions Fe⁰ → Fe²⁺ and 4Fe²⁺ + 3[Fe³⁺(CN)₆]³⁻ → Fe³⁺₄[Fe²⁺(CN)₆]₃. A basic scheme of the technological process for determining the presence of through defects of polymer films on the surface of parts made of iron and its alloys was developed. Solutions with different diagnostic reagent compositions in water were studied: from 0.1 to 25 mass fractions, %, of potassium ferricyanide(III), and from 5 to 25 mass fractions, %, of hydrochloric acid. The optimal component ratio was chosen. The developed method consists in submerging a part covered with a film into a vessel with the diagnostic reagent. In the zone of a through defect in the polymer film, the part material (iron) interacts with the potassium ferricyanide(III), and the color changes to blue. Pilot samples were tested by the developed method for the presence of through defects in the moisture-proof coating. It was revealed that all the studied parts had through defects of the polymer film coating. Thus, the claimed method efficiently reveals through defects of polymer film coatings on parts made of iron or its alloys, while being affordable and economical.
Keywords: diagnostic reagent, metal parts, polymer films, through defects
Procedia PDF Downloads 150
1506 Influence of Physical Properties on Estimation of Mechanical Strength of Limestone
Authors: Khaled Benyounes
Abstract:
Determination of rock mechanical properties such as the unconfined compressive strength (UCS), Young's modulus (E), and the tensile strength from the Brazilian test (Rtb) is considered to be a most important component in drilling and mining engineering projects. Research related to establishing correlations between strength and physical parameters of rocks has always been of interest to mining and reservoir engineering. For this purpose, many rock blocks of limestone were collected from the quarry located in Meftah (Algeria), and cores were prepared in the laboratory using a core drill. This work examines the relationships between mechanical properties and some physical properties of limestone. Many empirical equations are established between the UCS and physical properties of the limestone (such as dry bulk density, P-wave velocity, dynamic Young's modulus, alteration index, and total porosity). Other correlations, UCS-tensile strength and dynamic Young's modulus-static Young's modulus, have also been found. Based on the Mohr-Coulomb failure criterion, we were able to establish mathematical relationships for estimating the cohesion and internal friction angle from the UCS and the indirect tensile strength. Results from this study can be useful to the mining industry in resolving a range of geomechanical problems such as slope stability.
Keywords: limestone, mechanical strength, Young's modulus, porosity
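One common textbook route from the UCS and the indirect tensile strength to Mohr-Coulomb parameters — whether the paper used exactly this route is an assumption — treats both strengths as lying on the same linear envelope:

```python
import numpy as np

def mohr_coulomb_from_ucs_and_tensile(ucs, sigma_t):
    """Estimate cohesion c and friction angle phi from UCS and tensile strength,
    assuming both lie on the same linear Mohr-Coulomb envelope:
        UCS / sigma_t = (1 + sin(phi)) / (1 - sin(phi))
        UCS = 2 c cos(phi) / (1 - sin(phi))
    Extrapolating the envelope into tension tends to overestimate phi when the
    tensile cutoff is ignored, so treat the result as a first approximation.
    """
    sin_phi = (ucs - sigma_t) / (ucs + sigma_t)
    phi = np.arcsin(sin_phi)
    c = ucs * (1.0 - sin_phi) / (2.0 * np.cos(phi))
    return c, np.degrees(phi)

# Illustrative limestone values (MPa), not the paper's measurements.
c, phi = mohr_coulomb_from_ucs_and_tensile(ucs=80.0, sigma_t=8.0)
print(f"cohesion = {c:.1f} MPa, friction angle = {phi:.1f} deg")
```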
Procedia PDF Downloads 454
1505 Modeling and Analysis of DFIG Based Wind Power System Using Instantaneous Power Components
Authors: Jaimala Ghambir, Tilak Thakur, Puneet Chawla
Abstract:
As per the statistical data, the doubly-fed induction generator (DFIG) based wind turbine with variable speed and variable pitch control is the most common wind turbine in the growing wind market. This machine is usually used in grid-connected wind energy conversion systems to satisfy grid code requirements such as grid stability, fault ride-through (FRT), power quality improvement, grid synchronization and power control. Though the requirements are not fulfilled directly by the machine, control strategies are used on both the stator and rotor sides, along with power electronic converters, to fulfil the requirements stated above. To satisfy the grid code requirements of the wind turbine, the grid-side converter usually plays the major role. So, in order to improve the operating capacity of the wind turbine under critical situations, an intensive study of both machine-side converter control and grid-side converter control is necessary. In this paper, the DFIG is modeled using instantaneous power components as variables, and the performance of the DFIG system is analysed under grid voltage fluctuations. The voltage fluctuations are created by intentionally lowering and raising the voltage values in the utility grid for the purpose of simulation, keeping in view different grid disturbances.
Keywords: DFIG, dynamic modeling, DPC, sag, swell, voltage fluctuations, FRT
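For reference, the instantaneous power components referred to in the title follow directly from dq-frame voltage and current components (sign conventions vary between texts); in a hedged sketch, a sag then shows up directly in the deliverable active power:

```python
def instantaneous_powers(v_d, v_q, i_d, i_q):
    """Instantaneous active and reactive power from dq-frame components
    (the 'power components as variables' modelling idea, one common convention)."""
    p = 1.5 * (v_d * i_d + v_q * i_q)   # active power
    q = 1.5 * (v_q * i_d - v_d * i_q)   # reactive power
    return p, q

# Hypothetical per-unit stator quantities during a 20% voltage sag: the drop
# in voltage appears directly as a drop in deliverable active power.
p_n, q_n = instantaneous_powers(v_d=1.00, v_q=0.0, i_d=0.9, i_q=-0.3)
p_s, q_s = instantaneous_powers(v_d=0.80, v_q=0.0, i_d=0.9, i_q=-0.3)
print(f"normal: p={p_n:.2f}, q={q_n:.2f} pu;  sag: p={p_s:.2f}, q={q_s:.2f} pu")
```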
Procedia PDF Downloads 462
1504 Fusion Models for Cyber Threat Defense: Integrating Clustering, Random Forests, and Support Vector Machines Against Windows Malware
Authors: Azita Ramezani, Atousa Ramezani
Abstract:
In the ever-escalating landscape of Windows malware, the necessity for pioneering defense strategies becomes undeniable. This study introduces an avant-garde approach fusing the capabilities of clustering, random forests, and support vector machines (SVM) to combat the intricate web of cyber threats. Our fusion model triumphs with a staggering accuracy of 98.67% and an equally formidable F1-score of 98.68%, a testament to its effectiveness in the realm of Windows malware defense. By deciphering the intricate patterns within malicious code, our model not only raises the bar for detection precision but also redefines the paradigm of cybersecurity preparedness. This breakthrough underscores the potential embedded in the fusion of diverse analytical methodologies and signals a paradigm shift in fortifying against the relentless evolution of Windows malicious threats. As we traverse the dynamic cybersecurity terrain, this research serves as a beacon illuminating the path toward a resilient future where innovative fusion models stand at the forefront of cyber threat defense.
Keywords: fusion models, cyber threat defense, Windows malware, clustering, random forests, support vector machines (SVM), accuracy, F1-score, cybersecurity, malicious code detection
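The abstract does not specify how the three learners are fused; one plausible reading — cluster distances appended as features, then soft voting between a random forest and an SVM — can be sketched with scikit-learn on synthetic stand-in data (the reported 98.67%/98.68% figures are the paper's, not reproduced here):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for tabular features extracted from Windows PE files
# (API-call counts, section entropy, etc.); label 1 = malicious.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 40))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=2000) > 0).astype(int)

# "Fusion" step 1: append unsupervised cluster distances as extra features.
km = KMeans(n_clusters=8, n_init=10, random_state=0)
X_aug = np.hstack([X, km.fit_transform(X)])
X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y, test_size=0.3, random_state=0)

# "Fusion" step 2: soft voting between a random forest and a scaled SVM.
fusion = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=300, random_state=0)),
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
    ],
    voting="soft",
)
fusion.fit(X_tr, y_tr)
pred = fusion.predict(X_te)
print(f"accuracy={accuracy_score(y_te, pred):.4f}, f1={f1_score(y_te, pred):.4f}")
```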
Procedia PDF Downloads 71
1503 Medical Image Watermark and Tamper Detection Using Constant Correlation Spread Spectrum Watermarking
Authors: Peter U. Eze, P. Udaya, Robin J. Evans
Abstract:
Data hiding can be achieved by steganography or invisible digital watermarking. For digital watermarking, both accurate retrieval of the embedded watermark and the integrity of the cover image are important. Medical image security in teleradiology is one of the applications where the embedded patient record needs to be extracted with accuracy and the integrity of the medical image verified. In this research paper, Constant Correlation Spread Spectrum digital watermarking for medical image tamper detection and accurate embedded watermark retrieval is introduced. In the proposed method, a watermark bit from a patient record is spread in a medical image sub-block such that the correlation of all watermarked sub-blocks with a spreading code, W, has a constant value, p. The constant correlation p, the spreading code W and the size of the sub-blocks constitute the secret key. Tamper detection is achieved by flagging any sub-block whose correlation value deviates by more than a small value, ε, from p. The major features of our new scheme include: (1) improving watermark detection accuracy for high-pixel-depth medical images by reducing the Bit Error Rate (BER) to zero, and (2) block-level tamper detection in a single computational process with simultaneous watermark detection, thereby increasing utility at the same computational cost.
Keywords: constant correlation, medical image, spread spectrum, tamper detection, watermarking
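A minimal numerical sketch of the scheme as described: each sub-block is shifted along the secret spreading code W so its correlation equals the constant p, and any block whose correlation later deviates by more than ε is flagged. Sign-coding of the watermark bit is an assumption for this sketch, not the paper's stated construction:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 64                                   # 8x8 sub-block, flattened
W = rng.choice([-1.0, 1.0], size=N)      # secret spreading code

def corr(block):
    return float(block @ W) / N          # correlation with the spreading code

def embed(block, bit, p=4.0):
    """Shift the block along W so its correlation is exactly +p (bit 1) or -p (bit 0)."""
    target = p if bit else -p
    return block + (target - corr(block)) * W

def detect(block, p=4.0, eps=0.5):
    c = corr(block)
    tampered = min(abs(c - p), abs(c + p)) > eps   # deviation from the constant
    return int(c > 0), tampered                    # (recovered bit, tamper flag)

block = rng.uniform(0, 4095, size=N)     # high-pixel-depth (12-bit) block
wm = embed(block, bit=1)
print(detect(wm))                        # (1, False): bit recovered, block intact

tampered = wm - 2.0 * W                  # any edit disturbing the correlation
print(detect(tampered))                  # (1, True): block flagged as tampered
```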
Procedia PDF Downloads 194
1502 Managing the Blue Economy and Responding to the Environmental Dimensions of a Transnational Governance Challenge
Authors: Ivy Chen XQ
Abstract:
This research places a much-needed focus on the conservation of the Blue Economy (BE) by focusing on the design and development of monitoring systems to track critical indicators of the status of the BE. In this process, local experiences provide an insight into important community issues, as well as the necessity to cooperate and collaborate in order to achieve sustainable options. Research worldwide and industry initiatives over the last decade show that the exploitation of marine resources has resulted in a significant decrease in the share of the total allowable catch (TAC). The result has been strengthened law enforcement, yet problems have remained related to poor policies, a lack of understanding of over-exploitation, biological uncertainty and political pressures. This reality, and other statistics that show a significant negative impact on the attainment of the Sustainable Development Goals (SDGs), warrant an emphasis on the development of national M&E systems in order to provide evidence-based information on the nature and scale of especially transnational fisheries crime and undersea marine resources in the BE. In particular, a need exists to establish a compendium of relevant BE indicators to assess such impact against the SDGs, using selected SDG indicators for this purpose. The research methodology consists of a qualitative approach using ATLAS.ti, and a case study will be developed of illegal, unreported and unregulated (IUU) poaching and illegal wildlife trade (IWT) as components of the BE, as they relate to the case of abalone in southern Africa and the Far East. This research project will make an original contribution through the analysis and comparative assessment of available indicators in the design process of M&E systems, and by developing indicators and monitoring frameworks to track critical trends and tendencies in the status of the BE, ensuring that specific objectives are aligned with the indicators of the SDG framework. The research will provide a set of recommendations to governments and stakeholders involved in such projects on lessons learned, as well as priorities for future research. The research findings will enable scholars, civil society institutions, donors and public servants to understand the capability of M&E systems and the importance of multi-level governance in the coordination of information management, together with knowledge management (KM) and M&E, at the international, regional, national and local levels. This coordination should focus on a sustainable development management approach based on addressing socio-economic challenges to the potential and sustainability of the BE, with an emphasis on ecosystem resilience, social equity and resource efficiency. This research and its study focus are timely, as the opportunities of the post-COVID-19 crisis recovery package can be grasped to set the economy on a path to sustainable development in line with the UN 2030 Agenda. The pandemic raises more awareness of the need worldwide to eliminate IUU poaching and the illegal wildlife trade (IWT).
Keywords: Blue Economy (BE), transnational governance, Monitoring and Evaluation (M&E), Sustainable Development Goals (SDGs)
Procedia PDF Downloads 173
1501 Penalization of Transnational Crimes in the Domestic Legal Order: The Case of Poland
Authors: Magda Olesiuk-Okomska
Abstract:
The degree of international interdependence has grown significantly. Poland is a party to nearly 1000 binding multilateral treaties, including international legal instruments devoted to criminal matters and obliging the state to penalize certain crimes. The paper presents the results of theoretical research conducted as part of doctoral research. The main hypothesis assumed that there is a separate category of crimes whose penalization Poland is obliged to ensure under international legal instruments; that a catalogue of such crimes and a catalogue of the international legal instruments providing for Poland's international obligations had never been compiled in the domestic doctrine; and thus that there was no mechanism for monitoring the implementation of such obligations. In the course of the research, a definition of transnational crimes was discussed and confronted with the notions of international crimes, treaty crimes, and cross-border crimes. A list of transnational crimes penalized in the Polish Penal Code as well as in non-code criminal law regulations was compiled; international legal instruments obliging Poland to criminalize and penalize specific conduct were enumerated and catalogued. This made it possible to determine whether Poland's international obligations have been implemented in domestic legislation, and to formulate de lege lata and de lege ferenda postulates. The research methods included, inter alia, a dogmatic and legal method, an analytical method and desk research.
Keywords: international criminal law, transnational crimes, transnational criminal law, treaty crimes
Procedia PDF Downloads 223
1500 Automation of AAA Game Development Using AI
Authors: Branden Heng, Harsheni Siddharthan, Allison Tseng, Paul Toprac, Sarah Abraham, Etienne Vouga
Abstract:
The goal of this project was to evaluate and document the capabilities and limitations of AI tools for empowering small teams to create high-budget, high-profile (AAA) 3D games typically developed by large studios. Two teams of novice game developers attempted to create two different games using AI and Unreal Engine 5.3. First, the teams evaluated 60 AI art, design, sound, and programming tools by considering their capability, ease of use, cost, and license restrictions. Then, the teams used a shortlist of 12 AI tools for game development. During this process, the following tools were found to be the most productive: (i) ChatGPT 4.0 for both game and narrative concepts and documentation; (ii) DALL-E 3 and OpenArt for concept art; (iii) Beatoven for music drafting; and (iv) ChatGPT 4.0 and GitHub Copilot for generating simple code and complementing human-made tutorials as an additional learning resource. While current generative AI may appear impressive at first glance, the assets it produces fall short of AAA industry standards. Generative AI tools are helpful when brainstorming ideas such as concept art and basic storylines, but they still cannot replace human input or creativity at this time. Regarding programming, AI can only effectively generate simple code and act as an additional learning resource. Thus, generative AI tools are, at best, tools to enhance developer productivity rather than a system to replace developers.
Keywords: AAA games, AI, automation tools, game development
Procedia PDF Downloads 26