Search results for: language function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8351

2201 Distributed Generation Connection to the Network: Obtaining Stability Using Transient Behavior

Authors: A. Hadadi, M. Abdollahi, A. Dustmohammadi

Abstract:

The growing use of distributed generators (DGs) in distribution networks provides many advantages but also causes new problems, which should be anticipated and solved with appropriate solutions. One such problem is the transient voltage drop and short circuit in the electrical network in the presence of distributed generation, which can lead to instability. A short circuit causes loss of generator synchronism; the system remains stable only if it can recover synchronism after the faulty section is removed. To increase system reliability and generator lifetime, strategies should be planned that apply even in situations where a fault prevents generators from separating. In this paper, a fault current limiter is installed to prevent DGs from separating from the grid when a fault occurs. Furthermore, an innovative objective function is applied to determine the optimal impedance of the fault current limiter in order to improve the transient stability of distributed generation. The fault current limiter prevents sudden acceleration of the generator rotor after a fault occurs and thereby improves network transient stability by reducing the current flow in a fast and effective manner. By inserting the impedance created by the fault current limiter in the path of the current injected by the DG toward the fault location when a short circuit happens, the critical fault clearing time improves remarkably. The protective relay therefore has more time to clear the fault and isolate the fault zone without any instability. Finally, different transient scenarios for the sustainability of connecting small-scale synchronous generators to the distribution network are presented.
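A minimal toy sketch, not the paper's model, of the basic mechanism the abstract describes: a series fault-current-limiter impedance reduces the fault current seen from the DG. All per-unit values below are invented for illustration.

```python
# Toy illustration: a series FCL impedance added to the line impedance
# reduces the steady-state fault current magnitude (all values are
# hypothetical per-unit numbers, not from the paper).

def fault_current(v_source: float, z_line: float, z_fcl: float) -> float:
    """Fault current magnitude with an FCL impedance in series."""
    return v_source / (z_line + z_fcl)

# Without the FCL (z_fcl = 0) versus with a 0.3 p.u. limiter inserted:
i_no_fcl = fault_current(1.0, 0.1, 0.0)    # 10.0 p.u.
i_with_fcl = fault_current(1.0, 0.1, 0.3)  # 2.5 p.u.
```

The lower current slows rotor acceleration during the fault, which is what lengthens the critical clearing time in the abstract's argument.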

Keywords: critical clearing time, fault current limiter, synchronous generator, transient stability, transient states

Procedia PDF Downloads 180
2200 An Estimating Equation for Survival Data with Possibly Time-Varying Covariates under a Semiparametric Transformation Model

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

An estimating equation technique is an alternative to the widely used maximum likelihood methods, and it eases some of the complexity arising from time-varying covariates. When both time-varying covariates and left-truncation are considered in the model, maximum likelihood estimation procedures become much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received considerable attention from many researchers, under a semiparametric transformation model. The purpose of this article was to develop the modified estimating equation under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators for the proposed model was illustrated via simulation studies and the Stanford heart transplant data. To sum up, the bias for covariates was adjusted by estimating the density function of the truncation time variable, and the effect of possibly time-varying covariates was then evaluated in some special semiparametric transformation models.
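A far simpler, hypothetical sketch of the estimating-equation idea for right-censored data: an exponential model with rate lam, whose score equation U(lam) = sum(delta)/lam - sum(t) = 0 has a closed-form root. The data below are made up; the paper's semiparametric transformation model is much more general.

```python
# Toy estimating equation for right-censored exponential survival:
# solving U(lam) = sum(events)/lam - sum(times) = 0 for the rate lam.

def solve_estimating_equation(times, events):
    """Root of the score equation; closed form for this toy model."""
    return sum(events) / sum(times)

times = [2.0, 3.5, 1.2, 4.0, 0.8]  # observed (possibly censored) times
events = [1, 0, 1, 1, 0]           # 1 = event observed, 0 = right-censored
lam_hat = solve_estimating_equation(times, events)  # 3 / 11.5 ≈ 0.26
```

The semiparametric case replaces this closed form with iterative solving (here, via the EM algorithm), but the principle of setting an unbiased estimating function to zero is the same.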

Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariate

Procedia PDF Downloads 143
2199 Development of Electrospun Porous Carbon Fibers from Cellulose/Polyacrylonitrile Blend

Authors: Zubair Khaliq, M. Bilal Qadir, Amir Shahzad, Zulfiqar Ali, Ahsan Nazir, Ali Afzal, Abdul Jabbar

Abstract:

Carbon fibers are among the most sought-after materials due to their potential applications in energy, high-strength materials, and conductive materials. The nanostructure of carbon fibers offers enhanced conductivity due to the larger surface area. Next-generation carbon nanofibers demand a porous structure, as it offers even more surface area. Multiple techniques are used to produce carbon fibers; however, electrospinning followed by carbonization of polymeric materials is an easy process to carry out on a laboratory scale. It also offers a wide range of adjustable parameters for acquiring the desired properties of carbon fibers. Polyacrylonitrile (PAN) is the most widely used precursor for carbon fibers due to its favorable processing parameters. Cellulose is also one of the highest-yield precursors of carbon fibers; however, electrospinning of cellulosic materials is difficult due to cellulose's rigid chain structure. The combination of PAN and cellulose can offer a suitable solution for the production of carbon fibers. Both materials are miscible in a mixed solvent of N,N-dimethylacetamide and lithium chloride (LiCl). This study focuses on the production of porous carbon fibers as a function of the PAN/cellulose blend ratio, solution properties, and electrospinning parameters. The single polymers and their blends at different ratios were electrospun to give fine fibers. Higher cellulose content made electrospinning of the nanofibers more difficult. After carbonization, the carbon fibers were studied in terms of blend ratio, surface area, and texture. The cellulose content yielded a porous structure in the carbon fibers, and the presence of LiCl also contributed to this porosity.

Keywords: cellulose, polyacrylonitrile, carbon nanofibers, electrospinning, blend

Procedia PDF Downloads 189
2198 Development of Electrospun Membranes with Defined Collagen and Polyethylene Oxide Architectures Reinforced with Medium- and High-Intensity Statins

Authors: S. Jaramillo, Y. Montoya, W. Agudelo, J. Bustamante

Abstract:

Cardiovascular diseases (CVD) are conditions affecting the heart and blood vessels. They include pathologies such as coronary and peripheral arterial disease, caused by narrowing of the vessel wall (atherosclerosis), which is related to the accumulation of low-density lipoproteins (LDL) in the arterial walls and leads to a progressive reduction of the vessel lumen and alterations in blood perfusion. Currently, the main therapeutic strategy for this type of alteration is drug treatment with statins, which inhibit the enzyme 3-hydroxy-3-methyl-glutaryl-CoA reductase (HMG-CoA reductase), responsible for modulating the rate of production of cholesterol and other isoprenoids in the mevalonate pathway. Inhibiting this enzyme induces the expression of LDL receptors in the liver, increasing their number on the surface of liver cells and reducing the plasma concentration of cholesterol. On the other hand, when a blood vessel presents stenosis, a surgical procedure with vascular implants is indicated to restore circulation in the arterial or venous bed. Among the materials used for vascular implants are Dacron® and Teflon®, which restore a watertight circulatory circuit but, due to their low biocompatibility, cannot promote remodeling and tissue regeneration processes. On this basis, the present research proposes the development of an electrospun membrane of hydrolyzed collagen and polyethylene oxide reinforced with medium- and high-intensity statins, so that future research can exploit its microarchitecture to favor tissue remodeling processes.

Keywords: atherosclerosis, medium and high-intensity statins, microarchitecture, electrospun membrane

Procedia PDF Downloads 119
2197 The Formation of Mutual Understanding in Conversation: An Embodied Approach

Authors: Haruo Okabayashi

Abstract:

Mutual understanding in conversation is very important for human relations. This study investigates the mental function underlying the formation of mutual understanding between two people in conversation, using an embodied approach. Forty people participated in this study and were divided into pairs randomly. Four conversation situations between the two partners (making/listening to fun or pleasant talk, making/listening to regrettable talk) were set for four minutes each, and the finger plethysmogram (200 Hz) of each participant was measured. As a result, the attractors of participants who reported "I did not understand my partner" show a collapsed shape, which means the fluctuation of their rhythm is too small to match their partner's rhythm, and their cross-correlation is low. The autonomic balance of both persons tends to resonate during conversation, and both largest Lyapunov exponents (LLEs) tend to resonate, too. Over human history, in order for human beings, as physically weak mammals, to survive, they may have needed to stay with others; that is, they have developed resonant characteristics, a process called self-organization. However, this resonant feature sometimes collapses, depending on the lifestyle a person has formed since birth. It is difficult for people who do not have a lifestyle of mutual gaze to resonate their biological signal waves with others'. Such people show features such as anxiety, fatigue, and a tendency toward confusion. Mutual understanding is thought to be formed through cooperation between the self-organizing features of the persons who are talking and the lifestyle indicated by mutual gaze. Such an entanglement phenomenon is called a nonlinear relation. This research finds that the formation of mutual understanding is expressed by the rhythm of a biological signal showing a nonlinear relationship.

Keywords: embodied approach, finger plethysmogram, mutual understanding, nonlinear phenomenon

Procedia PDF Downloads 251
2196 Developing English L2 Critical Reading and Thinking Skills through the PISA Reading Literacy Assessment Framework: A Case Study of EFL Learners in a Thai University

Authors: Surasak Khamkhong

Abstract:

This study aimed to investigate the use of the PISA reading literacy assessment framework (PRF) to improve EFL learners' critical reading and thinking skills. The sample group, selected by the purposive sampling technique, included 36 EFL learners from a university in Northeastern Thailand. The instruments consisted of 8 PRF-based reading lessons, a 27-item PRF-based reading test used as a pre-test and a post-test, and a questionnaire on attitudes toward the designed lessons. The statistics used for data analysis were percentage, mean, standard deviation, and the Wilcoxon signed-rank test. The results revealed that before the intervention, the students' English reading proficiency was low, as is evident from their low pre-test scores (M=14.00). They did fairly well on the access-and-retrieve questions (M=6.11), but poorly on the integrate-and-interpret questions (M=4.89) and the reflect-and-evaluate questions (M=3.00), respectively. This means that the students could comprehend the texts but could hardly interpret or evaluate them. However, after the intervention, they did better, as their post-test scores were higher (M=18.01): they could comprehend (M=6.78), interpret (M=6.00), and evaluate (M=5.25) well. This means that after the intervention, their critical reading skills had improved. As for their attitude toward the designed lessons and instruction, most students were satisfied with both. It may thus be concluded that the designed lessons can help improve students' English critical reading proficiency and may be used as a teaching model for improving EFL learners' critical reading skills.
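A hedged sketch of the study's paired test: computing the Wilcoxon signed-rank statistic W for pre-/post-test scores. The five score pairs below are invented for illustration (the study had 36 learners); the implementation assumes no ties among the absolute differences.

```python
# Wilcoxon signed-rank statistic for paired scores: rank the nonzero
# differences by absolute value, then W = min(sum of positive-diff ranks,
# sum of negative-diff ranks). No tie correction (toy data have no ties).

def wilcoxon_w(pre, post):
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero diffs
    order = sorted(range(len(diffs)), key=lambda k: abs(diffs[k]))
    w_pos = sum(r + 1 for r, k in enumerate(order) if diffs[k] > 0)
    w_neg = sum(r + 1 for r, k in enumerate(order) if diffs[k] < 0)
    return min(w_pos, w_neg)

pre = [14, 12, 16, 10, 15]   # hypothetical pre-test scores
post = [18, 15, 17, 16, 13]  # hypothetical post-test scores
w = wilcoxon_w(pre, post)    # W = 2
```

A small W relative to the total rank sum indicates the differences lean strongly in one direction, which is how the study could conclude the post-test gain was significant.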

Keywords: second language reading, critical reading and thinking skills, PISA reading literacy framework, English L2 reading development

Procedia PDF Downloads 180
2195 Modeling and Numerical Simulation of Heat Transfer and Internal Loads at Insulating Glass Units

Authors: Nina Penkova, Kalin Krumov, Liliana Zashcova, Ivan Kassabov

Abstract:

Insulating glass units (IGU) are widely used in new and renovated buildings in order to reduce the energy needed for heating and cooling. Rules for choosing an IGU to ensure energy efficiency and thermal comfort in the indoor space are well known. The existence of internal loads (gauge or vacuum pressure in the hermetically sealed gas space) requires additional attention in the design of facades. Internal loads appear with variations of altitude, meteorological pressure, and gas temperature relative to their values at the time of sealing. The gas temperature depends on the presence of coatings, the coating position in the transparent multi-layer system, the IGU geometry and orientation, and its fixing on the facade, and it varies with climate conditions. An algorithm for modeling and numerical simulation of the thermal fields and internal pressure in the gas cavity of insulating glass units as a function of the meteorological conditions is developed. It includes models of radiation heat transfer at solar and infrared wavelengths, indoor and outdoor convection heat transfer, and free convection in the sealed gas space, assuming the gas to be compressible. The algorithm allows prediction of temperature and pressure stratification in the gas domain of the IGU for different fixing systems. The models are validated by comparing the numerical results with experimental data obtained by hot-box testing. Numerical calculation and estimation of the 3D temperature and fluid flow fields, thermal performance, and internal loads of an IGU in a window system are implemented.
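A highly simplified sketch of the internal-load mechanism (not the paper's CFD model): treating the sealed cavity as a fixed-volume ideal gas, the cavity pressure scales with absolute temperature relative to sealing conditions, and the load on the panes is the difference against the current atmospheric pressure. All numbers are invented.

```python
# Isochoric ideal-gas approximation of the IGU cavity: the internal load
# is the gauge pressure (cavity pressure minus current atmospheric
# pressure), with the cavity pressure scaled from sealing conditions.

def igu_internal_load(p_seal, t_seal_k, t_now_k, p_atm_now):
    """Gauge pressure (Pa) on the panes for a fixed-volume gas cavity."""
    p_cavity = p_seal * (t_now_k / t_seal_k)
    return p_cavity - p_atm_now

# Sealed at 101325 Pa and 293.15 K; gas warmed to 323.15 K by solar
# gain while ambient pressure dropped to 99000 Pa (invented conditions):
load = igu_internal_load(101325.0, 293.15, 323.15, 99000.0)  # ≈ 12694 Pa
```

The full model in the abstract resolves the temperature (and hence pressure) field spatially, including radiation and free convection; this scalar balance only shows why altitude, weather, and solar heating generate a load at all.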

Keywords: insulating glass units, thermal loads, internal pressure, CFD analysis

Procedia PDF Downloads 259
2194 Modelling and Simulation Efforts in Scale-Up and Characterization of Semi-Solid Dosage Forms

Authors: Saurav S. Rath, Birendra K. David

Abstract:

The generic pharmaceutical industry has to operate within strict timelines of product development and scale-up from lab to plant. Hence, detailed product and process understanding and implementation of appropriate mechanistic modelling and quality-by-design (QbD) approaches are imperative across the product life cycle. This work provides example cases of such efforts for topical dosage products. Topical products are typically emulsions, gels, thick suspensions, or even simple solutions. The efficacy of such products is determined by characteristics like rheology and morphology. Defining, and scaling up, the right manufacturing process with a given set of ingredients to achieve the right product characteristics presents a challenge to the process engineer. For example, the non-Newtonian rheology varies not only with critical process parameters (CPPs) and critical material attributes (CMAs) but is also an implicit function of globule size, a critical quality attribute (CQA). This calls for various mechanistic models to help predict product behaviour. This paper focuses on such models obtained from computational fluid dynamics (CFD) coupled with population balance modelling (PBM) and constitutive models (such as shear and energy density). For the particular case of high-shear homogenisers (HSHs) used in the manufacture of thick emulsions and gels, this work presents findings on (i) a scale-up algorithm for HSHs using shear strain, a novel scale-up parameter for estimating mixing parameters, (ii) the non-linear relationship between viscosity and the shear imparted into the system, and (iii) the effect of hold time on product rheology. Specific examples of how this approach enabled scale-up across the 1 L, 10 L, 200 L, 500 L, and 1000 L scales will be discussed.
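One simple constitutive model of the kind the abstract alludes to is the power-law (shear-thinning) fluid, eta = K * gamma_dot**(n - 1). This is a hedged sketch, not the paper's fitted model; the consistency K and flow index n below are invented.

```python
# Power-law (shear-thinning) fluid: apparent viscosity falls as the
# homogeniser imparts more shear (n < 1 means shear-thinning).

def power_law_viscosity(gamma_dot, k=50.0, n=0.4):
    """Apparent viscosity (Pa·s) at shear rate gamma_dot (1/s)."""
    return k * gamma_dot ** (n - 1.0)

low_shear = power_law_viscosity(1.0)     # 50.0 Pa·s
high_shear = power_law_viscosity(100.0)  # 50 * 100**(-0.6) ≈ 3.15 Pa·s
```

In a CFD+PBM framework, a relation like this closes the momentum equations while the population balance tracks the globule-size distribution that the viscosity implicitly depends on.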

Keywords: computational fluid dynamics, morphology, quality-by-design, rheology

Procedia PDF Downloads 258
2193 A Corpus-Based Study of Subtitling Religious Words into Arabic

Authors: Yousef Sahari, Eisa Asiri

Abstract:

Hollywood films are produced in an open and liberal context, and when subtitling for a more conservative and closed society such as an Arabic society, religious words can pose a thorny challenge for subtitlers. Using a corpus of 90 Hollywood films released between 2000 and 2018 and applying insights from Descriptive Translation Studies (Toury, 1995, 2012) and the dichotomy of domestication and foreignization, this paper investigates three main research questions: (1) What are the dominant religious terms and functions in the English subtitles? (2) What are the dominant translation strategies used in the translation of religious words? (3) Do these strategies tend to be SL-oriented or TL-oriented (domesticating or foreignizing)? To answer these research questions, a quantitative and qualitative analysis of the corpus is conducted, in which the researcher adopts a self-designed, parallel, aligned corpus of the ninety films and their Arabic subtitles. A quantitative analysis is performed to compare the frequencies and distribution of religious words, their functions, and the translation strategies employed by the subtitlers of the ninety films, with the aim of identifying similarities or differences and the impact of the functions of religious terms on the choice of subtitling strategies. Based on the quantitative analysis, a qualitative analysis is performed to identify any translational patterns in the Arabic translations of religious words and the possible reasons for subtitlers' choices. The results show that the function of religious words has a strong influence on the choice of subtitling strategies. Also, it is found that foreignization strategies are applied in about two-thirds of the total occurrences of religious words.
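A minimal sketch of the quantitative step: tallying the subtitling strategy annotated for each religious-term occurrence and computing the share of foreignizing choices. The six annotated records below are fabricated stand-ins for the corpus data.

```python
# Count strategy frequencies per occurrence and compute the proportion
# of foreignization choices (records are invented examples).
from collections import Counter

records = [
    ("God", "foreignization"), ("hell", "foreignization"),
    ("amen", "foreignization"), ("bless", "domestication"),
    ("pray", "foreignization"), ("holy", "domestication"),
]

counts = Counter(strategy for _, strategy in records)
foreign_share = counts["foreignization"] / len(records)  # 4/6 ≈ 0.67
```

Grouping the same tally by term function (oath, exclamation, literal reference, etc.) would reproduce the study's comparison of strategy choice against function.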

Keywords: religious terms, subtitling, audiovisual translation, Modern Standard Arabic, subtitling strategies, English-Arabic subtitling

Procedia PDF Downloads 139
2192 A Multi-Objective Reliable Location-Inventory Capacitated Disruption Facility Problem with Penalty Cost Solved with Efficient Metaheuristic Algorithms

Authors: Elham Taghizadeh, Mostafa Abedzadeh, Mostafa Setak

Abstract:

In a logistics network, opened facilities are expected to work continuously over a long time horizon without any failure, but in real-world problems, facilities may face disruptions. This paper studies a reliable joint inventory-location problem that optimizes the costs of facility location, customer assignment, and inventory management decisions when facilities face the risk of failing and ceasing to work. In our model, we assume that when a facility is out of work, its customers may be reassigned to other operational facilities; otherwise, they must endure high penalty costs associated with losing service. To bring the model closer to real-world problems, it is formulated based on the p-median problem, and the facilities are considered to have limited capacities. We define a new binary variable (Z_is) to indicate that customers are not assigned to any facility. Our problem is a bi-objective model: the first objective minimizes the sum of facility construction costs and expected inventory holding costs, and the second minimizes the maximum expected customer costs under normal and failure scenarios. To solve this model, the NSGA-II and MOSS algorithms are applied to find the Pareto-archive solutions. Response surface methodology (RSM) is also applied to optimize the NSGA-II algorithm parameters. We compare the performance of the two algorithms with three metrics, and the results show that NSGA-II is more suitable for our model.
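A hedged sketch of the Pareto-archive step shared by NSGA-II and MOSS: keeping only the solutions not dominated on the two minimization objectives (here labeled cost and maximum expected customer cost). The objective vectors are invented.

```python
# Pareto archive for two minimization objectives: a solution is kept
# unless some other solution is no worse in both objectives and
# strictly better in at least one.

def dominates(a, b):
    """True if a dominates b (minimization in every objective)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_archive(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

solutions = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]
archive = pareto_archive(solutions)  # (8, 7) and (9, 9) are dominated
```

NSGA-II builds on exactly this dominance relation, adding non-dominated sorting and crowding distance to keep the archive well spread.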

Keywords: joint inventory-location problem, facility location, NSGAII, MOSS

Procedia PDF Downloads 510
2191 Aging-Related Changes in Calf Muscle Function: Implications for Venous Hemodynamic and the Role of External Mechanical Activation

Authors: Bhavatharani S., Boopathy V., Kavin S., Naveethkumar R.

Abstract:

Context: Resistance training with blood flow restriction (BFR) has become increasingly common in clinical rehabilitation due to the substantial benefits observed in augmenting muscle mass and strength using low loads. However, there is great variability in training pressures for clinical populations, as well as in the methods used to estimate them. The aim of this study was to estimate the percentage of maximal BFR that could result from applying different methodologies based on arbitrary or individual occlusion levels using a cuff width between 9 and 13 cm. Design: A secondary analysis was performed on the combined databases of 2 previous larger studies using BFR training. Methods: To estimate these percentages, the occlusion values needed to reach complete BFR (100% limb occlusion pressure [LOP]) were estimated by Doppler ultrasound. Seventy-five participants (age: 24.32 [4.86] y; weight: 78.51 [14.74] kg; height: 1.77 [0.09] m) were enrolled in the laboratory study for measuring LOP in the thigh, arm, or calf. Results: When arbitrary values of restriction are applied, a supra-occlusive pressure between 120% and 190% LOP may result. Furthermore, applying 130% of the resting brachial systolic blood pressure creates a similar occlusive stimulus as 100% LOP. Conclusions: Methods using 100 mm Hg and the resting brachial systolic blood pressure could represent the safest application prescriptions, as they resulted in applied pressures between 60% and 80% LOP. One hundred thirty percent of the resting brachial systolic blood pressure could be used to indirectly estimate 100% LOP at cuff widths between 9 and 13 cm. Finally, methodologies that use standard values of 200 and 300 mm Hg far exceed LOP and may carry additional risk during BFR exercise.
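The pressure arithmetic in the abstract can be sketched directly; the 130% factor comes from the abstract, while the example systolic blood pressure of 120 mm Hg is an invented illustration.

```python
# Indirect LOP estimate from the abstract (130% of resting brachial
# systolic blood pressure) and the %LOP implied by an arbitrary
# cuff pressure.

def estimated_lop(systolic_bp_mmhg: float) -> float:
    """Estimate of 100% limb occlusion pressure, per the abstract."""
    return 1.30 * systolic_bp_mmhg

def percent_lop(applied_mmhg: float, lop_mmhg: float) -> float:
    return 100.0 * applied_mmhg / lop_mmhg

lop = estimated_lop(120.0)     # 156.0 mm Hg for an SBP of 120 mm Hg
pct = percent_lop(200.0, lop)  # an arbitrary 200 mm Hg ≈ 128% LOP
```

This illustrates the abstract's point: fixed cuff pressures of 200 mm Hg or more easily exceed 100% LOP for typical blood pressures.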

Keywords: lower limb rehabilitation, ESP32, pneumatics for medical, programmed rehabilitation

Procedia PDF Downloads 64
2190 Intensity-Enhanced Super-Resolution Amplitude Apodization Effect on the Non-Spherical Near-Field Particle-Lenses

Authors: Liyang Yue, Bing Yan, James N. Monks, Rakesh Dhama, Zengbo Wang, Oleg V. Minin, Igor V. Minin

Abstract:

A particle can function as a refractive lens to focus a plane wave, generating a narrow, highly intense, weakly diverging beam within a sub-wavelength volume, known as the 'photonic jet'. The refractive index contrast (particle to background medium) and the scaling of the dielectric particle (size relative to wavelength) play the key roles in photonic jet formation, rather than the shape of the particle-lens. The waist (full width at half maximum, FWHM) of a photonic jet can be beyond the diffraction limit and smaller than the Airy disk, which defines the minimum distance at which two objects can be imaged as two instead of one. Many important applications in imaging and sensing are based on the super-resolution characteristic of the photonic jet. It is known that the apodization method, in the form of an amplitude pupil mask centrally situated on a particle-lens, can further reduce the waist of a photonic nanojet; however, it usually lowers the intensity at the focus by blocking the incident light. In this paper, an anomalous intensity-enhanced apodization effect in the near field was discovered via numerical simulation. It was also verified experimentally with a scale model using a copper-masked Teflon cuboid solid immersion lens (SIL) with a 22 mm side length under irradiation by a plane wave with an 8 mm wavelength. The peak intensity and lateral resolution of the produced photonic jet increased by about 36.0% and 36.4% with this approach, respectively. This phenomenon may exhibit a scaling effect and would then be valid in multiple frequency bands.

Keywords: apodization, particle-lens, scattering, near-field optics

Procedia PDF Downloads 173
2189 Cytotoxicity and Androgenic Potential of Antifungal Drug Substances on MDA-KB2 Cells

Authors: Benchouala Amira, Bojic Clement, Poupin Pascal, Cossu Leguille-carole

Abstract:

The objective of this study is to evaluate in vitro the cytotoxic and androgenic potential of several antifungal molecules (amphotericin B, econazole, ketoconazole, and miconazole) on the MDA-Kb2 cell line. This biological model is an effective tool for the detection of endocrine disruptors because it responds well to the main agonist of the androgen receptor (testosterone) and also to an antagonist, flutamide. The cytotoxicity of each chemical compound tested was measured using an MTT assay (tetrazolium salt, 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide), which measures the reductase activity of the mitochondrial succinate dehydrogenase enzymes of cultured cells. This complementary cytotoxicity test is essential to ensure that the reductions in luminescence intensity observed during the androgenic tests are attributable only to the anti-androgenic action of the compounds tested and not to possible cytotoxic properties. Tests of the androgenic activity of the antifungals show that these compounds cannot induce transcription of the luciferase gene: they do not exert an androgenic effect on cultured MDA-Kb2 cells at the environmental concentrations tested. The addition of flutamide at the same tested concentrations of the antifungal molecules reduces the luminescence induced by amphotericin B, econazole, and miconazole, which is explained by a strong interaction of these molecules with flutamide, an interaction that may have a greater toxic effect than either compound tested alone. The cytotoxicity test shows that econazole and ketoconazole can cause cell death at certain tested concentrations. This cell mortality is perhaps induced by a direct or indirect action on deoxyribonucleic acid (DNA), ribonucleic acid (RNA), or proteins necessary for cell division.

Keywords: cytotoxicity, androgenic potential, antifungals, MDA-Kb2

Procedia PDF Downloads 26
2188 Subclinical Renal Damage Induced by High-Fat Diet in Young Rats

Authors: Larissa M. Vargas, Julia M. Sacchi, Renata O. Pereira, Lucas S. Asano, Iara C. Araújo, Patricia Fiorino, Vera Farah

Abstract:

The aim of this study was to evaluate the occurrence of subclinical organ injuries induced by a high-fat diet. Male Wistar rats (n=5/group) were divided into a control diet group (CD), receiving commercial rat chow, and a hyperlipidic diet (30% lipids) group (HD), with the diets administered for 8 weeks starting after weaning. All procedures followed the rules of the Committee of Research and Ethics of the Mackenzie University (CEUA Nº 077/03/2011). At the end of the protocol, the animals were euthanized by anesthesia overload, and the left kidney was removed. Intrarenal lipid deposition was evaluated by histological analysis with oil red staining. Kidney slices were stained with picrosirius red to evaluate the area of Bowman's capsule (AB) and space (SB) and the glomerular tuft area (GT). The renal expression of sterol regulatory element-binding protein (SREBP-2) was assessed by Western blotting. Creatinine concentration (serum and urine) and the lipid profile were determined by colorimetric kits (Labtest). At the end of the protocol, there was no difference in body weight between the groups; however, the HD group showed a marked increase in lipid deposits in glomeruli and tubules and in the biochemical values for cholesterol and triglycerides. Moreover, in the kidney, the high-fat diet induced a reduction in the AB (13%), GT (18%), and SB (17%), associated with a reduction in the glomerular filtration rate (creatinine clearance). Renal SREBP-2 expression was increased in the HD group. These data suggest that consumption of a high-fat diet starting in childhood is associated with subclinical renal damage and impaired function.

Keywords: high-fat diet, kidney, intrarenal lipid deposition, SREBP-2

Procedia PDF Downloads 282
2187 A Collective Intelligence Approach to Safe Artificial General Intelligence

Authors: Craig A. Kaplan

Abstract:

If AGI proves to be a “winner-take-all” scenario where the first company or country to develop AGI dominates, then the first AGI must also be the safest. The safest, and fastest, path to Artificial General Intelligence (AGI) may be to harness the collective intelligence of multiple AI and human agents in an AGI network. This approach has roots in seminal ideas from four of the scientists who founded the field of Artificial Intelligence: Allen Newell, Marvin Minsky, Claude Shannon, and Herbert Simon. Extrapolating key insights from these founders of AI, and combining them with the work of modern researchers, results in a fast and safe path to AGI. The seminal ideas discussed are: 1) Society of Mind (Minsky), 2) Information Theory (Shannon), 3) Problem Solving Theory (Newell & Simon), and 4) Bounded Rationality (Simon). Society of Mind describes a collective intelligence approach that can be used with AI and human agents to create an AGI network. Information theory helps address the critical issue of how an AGI system will increase its intelligence over time. Problem Solving Theory provides a universal framework that AI and human agents can use to communicate efficiently, effectively, and safely. Bounded Rationality helps us better understand not only the capabilities of SuperIntelligent AGI but also how humans can remain relevant in a world where the intelligence of AGI vastly exceeds that of its human creators. Each key idea can be combined with recent work in the fields of Artificial Intelligence, Machine Learning, and Large Language Models to accelerate the development of a working, safe, AGI system.

Keywords: AI Agents, Collective Intelligence, Minsky, Newell, Shannon, Simon, AGI, AGI Safety

Procedia PDF Downloads 69
2186 Development of Self-Reliant Satellite-Level Propulsion System by Using Hydrogen Peroxide Propellant

Authors: H. J. Liu, Y. A. Chan, C. K. Pai, K. C. Tseng, Y. H. Chen, Y. L. Chan, T. C. Kuo

Abstract:

To satisfy the mission requirements of the FORMOSAT-7 project, NSPO has initiated self-reliant development of satellite propulsion technology. A trade-off study of different types of on-board propulsion systems has been done. A green propellant, high-concentration hydrogen peroxide (hereafter H2O2), is chosen in this research because it is ITAR-free, nontoxic, and easy to produce. As components designed for either cold-gas or hydrazine propulsion systems are not suitable for an H2O2 propulsion system, the primary objective of the research is to develop components compatible with H2O2. In cooperation with domestic research institutes and manufacturing vendors, several prototype components, including a diaphragm-type tank, a pressure transducer, a ball latching valve, and a one-Newton thruster with a catalyst bed, were manufactured, and the functional tests were performed successfully according to the mission requirements. The requisite environmental tests, including a hot-firing test, thermal vacuum test, vibration test, and compatibility test, are being prepared and will be completed in the near future. To demonstrate the subsystem function, an air-bearing thrust stand (ABTS) and a real-time data acquisition and control system (DACS) were implemented to assess the performance of the proposed H2O2 propulsion system. By measuring the distance that the thrust stand travels in a given time, the thrust force can be derived from the kinematic equations. To validate the feasibility of the approach, it is scheduled to first assess the performance of a cold-gas (N2) propulsion system prior to the H2O2 propulsion system.
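The thrust-stand reduction described in the abstract can be sketched with elementary kinematics: starting from rest on the air bearing, d = 0.5 * (F / m) * t**2, so F = 2 * m * d / t**2. The stand mass, distance, and time below are invented numbers for illustration, not measured values.

```python
# Thrust from distance traveled in a given time, assuming the stand
# starts from rest and friction on the air bearing is negligible.

def thrust_from_travel(mass_kg: float, distance_m: float, time_s: float) -> float:
    """Thrust (N) implied by uniform acceleration from rest."""
    return 2.0 * mass_kg * distance_m / time_s ** 2

# A hypothetical 10 kg stand drifting 0.05 m in 1.0 s implies 1 N,
# consistent with the one-Newton thruster class mentioned above:
f = thrust_from_travel(10.0, 0.05, 1.0)  # 1.0 N
```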

Keywords: FORMOSAT-7, green propellant, Hydrogen peroxide, thruster

Procedia PDF Downloads 415
2185 Possible Management of Acute Liver Failure Caused Experimentally by Thioacetamide Through a Wide Range of Nano Natural Anti-Inflammatory and Antioxidant Compounds [Herbal Approach]

Authors: Sohair Hassan, Olfat Hammam, Sahar Hussein, Wessam Magdi

Abstract:

Objective: Acute liver failure (ALF) is a clinical condition with an unclear pathophysiology, making it a challenging task for scientists to reverse the disease in its initial phase and to help the liver regain normal function. This study aimed to estimate the hepatoprotective effects of Punica granatum L. peel and Pistacia atlantica leaves, as antioxidant-rich ingredients, in either their normal and/or their nano forms, against thioacetamide-induced acute liver failure in a rodent model. Method: Male Wistar rats (n=60) were divided into six equal groups. The first group served as a control; the second group was administered a dose of 350 mg/kg b.w. of thioacetamide (TAA) i.p.; the third to sixth groups received TAA + 2 ml/100 g b.w./day of aqueous extracts of Punica granatum L. and Pistacia atlantica, in either their normal and/or nano forms, consecutively for 14 days. Results: The TAA-challenged group showed a significant elevation in liver enzymes, lipid profiles, LPO (p=0.05) and NO, with a marked significant decrease in GSH and SOD, accompanied by an elevation in inflammatory cytokines (IL-6, TNF-α, and AFP), in addition to a noticeable increase in HSP70 level and degradation of DNA. However, significant and subsequent amelioration of most of the impaired markers was observed with i.p. nano treatment with both extracts. Conclusion: The current results highlight the high performance of both plant nano extracts, their hepatoprotective impact, and their possible therapeutic role in the amelioration of TAA-induced acute liver failure in experimental animals.

Keywords: acute liver failure, HPLC, IL-6, nano extracts, thioacetamide, TNF-α

Procedia PDF Downloads 189
2184 Comparison of Anterolateral Thigh Flap with or without Acellular Dermal Matrix in Repair of Hypopharyngeal Squamous Cell Carcinoma Defect: A Retrospective Study

Authors: Yaya Gao, Bing Zhong, Yafeng Liu, Fei Chen

Abstract:

Aim: The purpose of this study was to explore the difference between acellular dermal matrix (ADM) combined with the anterolateral thigh (ALT) flap and the ALT flap alone. Methods: HSCC patients treated between January 2014 and December 2018 were divided into group A (ALT) and group B (ALT+ADM). We compared and analyzed the intraoperative information and postoperative outcomes of the patients. Results: There were 21 and 17 patients in group A and group B, respectively. The operation time, blood loss, defect size and anastomotic vessel selection showed no significant difference between the two groups. The postoperative complications, including wound bleeding (n=0 vs. 1, p=0.459), wound dehiscence (n=0 vs. 1, p=0.459), wound infection (n=5 vs. 3, p=0.709), pharyngeal fistula (n=5 vs. 4, p=1.000) and hypoproteinemia (n=11 vs. 12, p=0.326), were comparable between the groups. Dysphagia at 6 months (number of liquid diets = 0 vs. 0; number of partial tube feedings = 1 vs. 1; number of total tube feedings = 1 vs. 0, p=0.655) also showed no significant differences. However, significant differences were observed in dysphagia at 12 months (number of liquid diets = 0 vs. 0; number of partial tube feedings = 3 vs. 1; number of total tube feedings = 10 vs. 1, p=0.006). Conclusion: For HSCC patients, the ALT flap combined with ADM, compared to ALT treatment alone, showed better swallowing function at 12 months. The ALT flap combined with ADM may serve as a safe and feasible alternative for selected HSCC patients.

Keywords: hypopharyngeal squamous cell carcinoma, anterolateral thigh free flap, acellular dermal matrix, reconstruction, dysphagia

Procedia PDF Downloads 64
2183 Breaking Barriers: Utilizing Innovation to Improve Educational Outcomes for Students with Disabilities

Authors: Emily Purdom, Rachel Robinson

Abstract:

As the number of students worldwide requiring speech-language therapy, occupational therapy and mental health services during their school day increases, innovation is becoming progressively more important to meet the demand. Telepractice can be used to reach a greater number of students requiring specialized therapy while maintaining the highest quality of care. It can be provided in a way that is not only effective but ultimately more convenient for the student, teacher and therapist, without the added burden of travel. Teletherapy eradicates many hurdles to traditional on-site service delivery and helps to solve the pervasive shortage of certified professionals. Because location is no longer a barrier to specialized education plans for students with disabilities when teletherapy is conducted, there are many advantages that can be realized. Increased frequency of engagement is possible, along with students receiving specialized care from a clinician who may not be in their direct area. Educational teams, including parents, can work together more easily and engage in face-to-face, student-centered collaboration through videoconference. Practical strategies will be provided for connecting students with qualified therapists without the typical in-person dynamic. In most cases, better therapy outcomes are achieved when treatment is most convenient for the student and educator. This workshop will promote discussion in the field of education to increase advocacy for remote service delivery. It will serve as a resource for those wanting to expand their knowledge of options for students with special needs afforded through innovation.

Keywords: education technology, innovation, student support services, telepractice

Procedia PDF Downloads 231
2182 Availability of Safety Measures and Knowledge Towards Hazardous Waste Management among Workers in Scientific Laboratories of Two Universities in Lebanon

Authors: Inaam Nasrallah, Pascale Salameh, Abbas El-Outa, Assem Alkak, Rihab Nasr, Wafa Toufic Bawab

Abstract:

Background: Hazardous Waste Management (HWM) is critical to human health outcomes and environmental protection. This study evaluated the knowledge regarding safety measures to be applied when collecting and storing waste in scientific laboratories of two universities in Lebanon. Method: A survey-based observational study was conducted in scientific laboratories of a public university and a private university, where a total of 309 participants were recruited. Results: The mean total knowledge score on safety measures of HWM was 9.02±4.34 (maximum attainable score: 13). A significant association (p<0.05) was found between knowledge score and job function, years of experience, educational level, professional status, work schedule, and training on proper HWM. Participants had adequate perceptions regarding the impact of HWM on health and the environment. Linear regression modeling revealed that the knowledge score was significantly higher among bachelor-level lab workers compared to those with doctoral degrees (p=0.043), full-time schedule workers versus part-timers (p=0.03), and among public university participants as compared to those of the private university (p<0.001). Conclusion: This study showed good knowledge concerning HWM in the scientific laboratories of the studied universities in Lebanon and good awareness of the impact of HWM on health and the environment. It highlights the importance of culture, attitude, and practice on proper HWM in the academic scientific laboratory.

Keywords: hazardous waste, safety measures, waste management, knowledge score, scientific laboratory workers

Procedia PDF Downloads 186
2181 The Synthesis, Structure and Catalytic Activity of Iron(II) Complex with New N2O2 Donor Schiff Base Ligand

Authors: Neslihan Beyazit, Sahin Bayraktar, Cahit Demetgul

Abstract:

Transition metal ions have an important role in biochemistry and biomimetic systems and may provide the basis of models for active sites of biological targets. The presence of copper(II), iron(II) and zinc(II) is crucial in many biological processes. Tetradentate N2O2 donor Schiff base ligands are well known to form stable transition metal complexes, and these complexes also have applications in clinical and analytical fields. In this study, we present salient structural features and the details of the catecholase activity of the Fe(II) complex of a new Schiff base ligand. A new asymmetrical N2O2 donor Schiff base ligand and its Fe(II) complex were synthesized by condensation of 4-nitro-1,2-phenylenediamine with 6-formyl-7-hydroxy-5-methoxy-2-methylbenzopyran-4-one and by using an appropriate Fe(II) salt, respectively. The Schiff base ligand and its metal complex were characterized by using FT-IR, 1H NMR, 13C NMR, UV-Vis, elemental analysis and magnetic susceptibility. In order to determine the kinetic parameters of the catechol oxidase-like activity of the Schiff base Fe(II) complex, the oxidation of 3,5-di-tert-butylcatechol (3,5-DTBC) was measured at 25°C by monitoring the increase of the absorption band at 390-400 nm of the product 3,5-di-tert-butylquinone (3,5-DTBQ). The compatibility of the catalytic reaction with Michaelis-Menten kinetics was also investigated by the method of initial rates, by monitoring the growth of the 390-400 nm band of 3,5-DTBQ as a function of time. Kinetic studies showed that the Fe(II) complex of the new N2O2 donor Schiff base ligand was capable of acting as a model compound for simulating the catecholase properties of type-3 copper proteins.
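The initial-rates treatment described above can be sketched numerically. A minimal illustration, using noiseless synthetic data rather than the paper's measurements: Michaelis-Menten parameters Vmax and Km are recovered from initial rates v0 = Vmax·[S]/(Km + [S]) via the Lineweaver-Burk linearization 1/v0 = (Km/Vmax)·(1/[S]) + 1/Vmax.

```python
# Hedged sketch (not the authors' code): fit Vmax and Km from initial
# rates by linear regression on the Lineweaver-Burk double reciprocal.
def fit_michaelis_menten(substrate, rates):
    xs = [1.0 / s for s in substrate]   # 1/[S]
    ys = [1.0 / v for v in rates]       # 1/v0
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept              # 1/intercept = Vmax
    km = slope * vmax                   # slope = Km/Vmax
    return vmax, km

# Synthetic illustration: data generated with Vmax = 2.0, Km = 0.5.
S = [0.1, 0.25, 0.5, 1.0, 2.0]
v0 = [2.0 * s / (0.5 + s) for s in S]
print(fit_michaelis_menten(S, v0))  # recovers (2.0, 0.5) up to rounding
```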

Keywords: catecholase activity, Michaelis-Menten kinetics, Schiff base, transition metals

Procedia PDF Downloads 376
2180 Creation of Ultrafast Ultra-Broadband High Energy Laser Pulses

Authors: Walid Tawfik

Abstract:

The interaction of high-intensity ultrashort laser pulses with plasma generates many significant applications, including soft X-ray lasers, time-resolved laser-induced plasma spectroscopy (LIPS), and laser-driven accelerators. The development of femtosecond down to ten-femtosecond optical pulses has provided scientists with a vital tool for a variety of ultrashort phenomena, such as high-field physics, femtochemistry and high harmonic generation (HHG). In this research, we generate two-octave-wide ultrashort supercontinuum pulses with an optical spectrum extending from 3.5 eV (ultraviolet) to 1.3 eV (near-infrared) using a capillary fiber filled with neon gas. These pulses are formed through nonlinear self-phase modulation in the neon gas as the nonlinear medium. The created pulses were investigated using spectral phase interferometry for direct electric-field reconstruction (SPIDER). A complete description of the output pulses was obtained. The observed characterization of the produced pulses includes the beam profile, the pulse width, and the spectral bandwidth. After reaching optimized conditions, the intensity autocorrelation function of the reconstructed pulse was applied to the shortest pulse durations to achieve transform-limited ultrashort pulses with durations below 6 fs and energies up to 600 μJ. Moreover, the effect of neon pressure variation on the pulse width was examined. The nonlinear self-phase modulation was found to increase with the pressure of the neon gas. The observed results may lead to an advanced method to control and monitor ultrashort transient interactions in femtochemistry.
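As a back-of-the-envelope aside (not part of the paper's analysis): for a transform-limited Gaussian pulse, the time-bandwidth product is Δt·Δν ≈ 0.441, so the quoted spectral span of roughly 1.3-3.5 eV sets a lower bound on achievable pulse duration.

```python
# Hedged illustration: minimum FWHM duration of a transform-limited
# Gaussian pulse from its spectral bandwidth (dt * dnu ≈ 0.441).
H_EV_S = 4.135667696e-15  # Planck constant in eV·s

def transform_limit_fs(bandwidth_ev):
    dnu = bandwidth_ev / H_EV_S     # bandwidth in Hz
    return 0.441 / dnu * 1e15       # minimum FWHM duration in fs

# A 2.2 eV span (3.5 eV - 1.3 eV) supports sub-femtosecond durations,
# so the full two-octave spectrum easily accommodates sub-6-fs pulses.
print(round(transform_limit_fs(3.5 - 1.3), 2))  # → 0.83
```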

Keywords: supercontinuum, ultrafast, SPIDER, ultra-broadband

Procedia PDF Downloads 211
2179 The Factors Constitute the Interaction between Teachers and Students: An Empirical Study at the Notion of Framing

Authors: Tien-Hui Chiang

Abstract:

The code theory, proposed by Basil Bernstein, indicates that framing can be viewed as the core element in constituting the phenomenon of cultural reproduction because it is able to regulate the transmission of pedagogical information. Strong framing increases the social-relation boundary between a teacher and pupils, which obstructs information transmission, so that in order to improve underachieving students' academic performances, teachers need to reduce the strength of framing. Weak framing enables them to transform academic knowledge into commonsense knowledge in daily-life language. This study posits that most teachers deliver strong framing because their beliefs are mainly confined within instrumental rationality, which blunts their critical minds. This situation could make them view the normal-distribution bell curve of students' academic performances as a natural outcome. In order to examine the interplay between framing, instrumental rationality and pedagogical action, questionnaires were completed by over 5,000 primary school teachers in Henan province, China, selected through stratified sampling. The statistical results show that most teachers employed psychological concepts to measure students' academic performances and, in turn, educational inequity was legitimized as a natural outcome in the efficiency-led approach. Such efficiency-led minds made them perform as agents practicing the mechanism of social control and, in turn, sustaining the phenomenon of cultural reproduction.

Keywords: code, cultural reproduction, framing, instrumental rationality, social relation and interaction

Procedia PDF Downloads 138
2178 A Systematic Snapshot of Software Outsourcing Challenges

Authors: Issam Jebreen, Eman Al-Qbelat

Abstract:

Outsourcing software development projects can be challenging, and there are several common challenges that organizations face. A study was conducted with a sample of 46 papers on outsourcing challenges, and the results show that there are several common challenges faced by organizations when outsourcing software development projects. A poor outsourcing relationship was identified as the most significant challenge, with 35% of the papers referencing it. Lack of quality was the second most significant challenge, with 33% of the papers referencing it. Language and cultural differences were the third most significant challenge, with 24% of the papers referencing them. A non-competitive price was another challenge faced by organizations, with 21% of the papers referencing it. Poor coordination and communication were also identified as challenges, with 21% of the papers referencing them. Opportunistic behavior, lack of contract negotiation, inadequate user involvement, and constraints due to time zones were also challenges faced by organizations. Other challenges faced by organizations included poor project management, lack of technical capabilities, high vendor employee turnover, poor requirement specification, IPR issues, poor management of budget, schedule, and delay, geopolitical and country instability, differences in development methodologies, failure to manage end-user expectations, and poor monitoring and control. In conclusion, outsourcing software development projects can be challenging, but organizations can mitigate these challenges by selecting the right outsourcing partner, having a well-defined contract and clear communication, having a clear understanding of the requirements, and implementing effective project management practices.

Keywords: software outsourcing, vendor, outsourcing challenges, quality model, continent, country, global outsourcing, IT workforce outsourcing

Procedia PDF Downloads 73
2177 Ideology and Lexicogrammar: Discourse Against the Power in Lyrical Texts (XIII, XVII and XX Centuries)

Authors: Ulisses Tadeu Vaz de Oliveira

Abstract:

The development of multifunctional studies in the theoretical-methodological perspective of Systemic-Functional Grammar (SFG) and the increasing number of critical literary studies have introduced new opportunities for the study of ideologies and societies, but have also brought up new challenges across and within many areas. In this regard, Critical Linguistics research allows a textual linguistic analysis method (micro level) to be paired with a social theory of language in political and ideological processes (macro level), as presented in the literature. This presentation will report on strategies used to criticize power holders in literary productions from three distinct eras, namely: (a) satirical Galego-Portuguese chants of Gil Pérez Conde (thirteenth century), (b) poems of Gregório de Matos Guerra (seventeenth century), and (c) songs of Chico Buarque de Holanda (twentieth century). The analysis of these productions is based on the SFG proposals, which consider the clause as a social event. Therefore, the structure serves to realize three concurrent meanings (metafunctions): ideational, interpersonal and textual. The presenter aims to shed light on the core issues relevant to the authors' success in criticizing authorities in repressive times while attending to face-threat and politeness. This effective and meaningful critical discourse was a way of moving society's chains towards new ideologies, reflected in the lexicogrammatical choices made and the rhetorical functions of the persuasive structures used by the authors.

Keywords: ideology, literature, persuasion, systemic-functional grammar

Procedia PDF Downloads 399
2176 Analysis and Control of Camera Type Weft Straightener

Authors: Jae-Yong Lee, Gyu-Hyun Bae, Yun-Soo Chung, Dae-Sub Kim, Jae-Sung Bae

Abstract:

In general, fabric is heat-treated using a stenter machine in order to dry and fix its shape. It is important to shape the fabric before the heat treatment because it is difficult to revert it back once it is formed. To produce a product of the right shape, the camera type weft straightener has been applied recently to capture and process fabric images quickly. It is more powerful than a photo-sensor in determining the final textile quality. Positioned in front of a stenter machine, the weft straightener helps to spread fabric evenly and keep the angle between warp and weft constantly at a right angle by handling skew and bow rollers. To manage this tricky procedure, a structural analysis should be carried out in advance, based on which its control technology can be drawn. A structural analysis is needed to figure out the specific contact/slippage characteristics between fabric and roller. We have already examined the applicability of the camera type weft straightener to plain weave fabric and found its possibility and the specific working conditions of the machine and rollers. In this research, we aimed to explore another applicability of the camera type weft straightener. Namely, we tried to figure out whether the camera type weft straightener can be used for special fabrics. To find the optimum condition, we increased the number of rollers. The analysis is done with ANSYS software using the Finite Element Analysis method. The control function is demonstrated by experiment. In conclusion, the structural analysis of the weft straightener is done to identify the specific characteristics between roller and fabrics. The control of the skew and bow rollers is done to decrease the error in the angle between warp and weft. Finally, it is shown that the camera type straightener can also be used for the special fabrics.

Keywords: camera type weft straightener, structure analysis, control, skew and bow roller

Procedia PDF Downloads 281
2175 Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory

Authors: Xu Jiaqiao

Abstract:

Text sentiment analysis is an important branch of natural language processing. This technology is widely used in public opinion analysis and web surfing recommendations. At present, the mainstream sentiment analysis methods fall into three categories: sentiment analysis based on a sentiment dictionary, based on traditional machine learning, and based on deep learning. This paper mainly analyzes and compares the advantages and disadvantages of the SVM method of traditional machine learning and the Long Short-Term Memory (LSTM) method of deep learning in the field of Chinese sentiment analysis, using Chinese comments on Sina Microblog as the data set. Firstly, this paper classifies and adds labels to the original comment dataset obtained by the web crawler, and then uses Jieba word segmentation to segment the original dataset and remove stop words. After that, this paper extracts text feature vectors and builds document word vectors to facilitate the training of the models. Finally, the SVM and LSTM models are trained respectively. After calculating accuracy, the LSTM model achieves 85.80%, while the SVM achieves 91.07%. At the same time, the LSTM model needs only 2.57 seconds, while the SVM model needs 6.06 seconds. Therefore, this paper concludes that, compared with the SVM model, the LSTM model is worse in accuracy but faster in processing speed.
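The preprocessing pipeline described above (segmentation, stop-word removal, feature vectors) can be sketched. A minimal self-contained illustration, with whitespace splitting standing in for Jieba's Chinese segmentation and a placeholder stop-word list; the actual feature extraction used in the paper is not specified:

```python
# Hedged sketch of the described preprocessing: tokenize comments,
# drop stop words, and build bag-of-words feature vectors that could
# then feed an SVM or LSTM classifier.
from collections import Counter

STOP_WORDS = {"the", "a", "is"}  # placeholder stop-word list

def tokenize(text):
    # Whitespace split stands in for Jieba segmentation here.
    return [w for w in text.lower().split() if w not in STOP_WORDS]

def build_vocab(corpus):
    vocab = sorted({w for doc in corpus for w in tokenize(doc)})
    return {w: i for i, w in enumerate(vocab)}

def vectorize(text, vocab):
    counts = Counter(tokenize(text))
    return [counts.get(w, 0) for w in vocab]

corpus = ["the service is great", "a terrible experience"]
vocab = build_vocab(corpus)
print([vectorize(doc, vocab) for doc in corpus])
# → [[0, 1, 1, 0], [1, 0, 0, 1]]
```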

Keywords: sentiment analysis, support vector machine, long short-term memory, Chinese microblog comments

Procedia PDF Downloads 75
2174 Simulation-Based Control Module for Offshore Single Point Mooring System

Authors: Daehyun Baek, Seungmin Lee, Minju Kim, Jangik Park, Hyeong-Soon Moon

Abstract:

SPM (Single Point Mooring) is one of the mooring buoy facilities installed on a coast near an oil and gas terminal where FPSOs or large oil tankers cannot berth under high-draft conditions due to geometrical limitations. Loading and unloading of crude oil and gas through a subsea pipeline can be carried out between the mooring buoy, ships and onshore facilities. SPM is a standalone offshore system which has to withstand harsh marine conditions such as high wind, current and so on. Therefore, SPM is required to have high stability, reliability and durability. SPM is also an integrated system comprising power management, high-pressure valve control, sophisticated hardware/software and a long-distance communication system. In order to secure the required functions of the SPM system, a simulation model for the integrated SPM system was developed using the MATLAB Simulink and Stateflow tools. The developed model consists of the configuration of the hydraulic system for opening and closing the PLEM (Pipeline End Manifold) valves and the control system logic. To verify the functions of the model, an integrated simulation model for the overall SPM system was also developed by considering handshaking variables between individual systems. In addition to the dynamic model, a self-diagnostic function to determine failure of the system was configured, which enables the SPM system itself to alert users once a failure signal arises. Controlling and monitoring of the SPM system can be done through an HMI system capable of managing the SPM system remotely, which was achieved by building a communication environment between the SPM system and the HMI system.
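The self-diagnostic behavior described above can be illustrated outside Simulink/Stateflow. A minimal Python analogue of a PLEM valve controller (all names and the state logic here are hypothetical stand-ins for the MATLAB model): a failure signal triggers an alert and holds the current valve state instead of executing the command.

```python
# Hedged sketch: a toy state machine mimicking the described
# self-diagnostic function (not the actual Simulink/Stateflow model).
class ValveController:
    def __init__(self):
        self.state = "CLOSED"
        self.alerts = []

    def command(self, target, failure_signal=False):
        # Self-diagnostic step: alert the operator and hold the
        # current state when a failure signal arises.
        if failure_signal:
            self.alerts.append(f"failure while commanding {target}")
            return self.state
        if target in ("OPEN", "CLOSED"):
            self.state = target
        return self.state

ctrl = ValveController()
print(ctrl.command("OPEN"))                         # → OPEN
print(ctrl.command("CLOSED", failure_signal=True))  # → OPEN (held)
print(ctrl.alerts)
```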

Keywords: HMI system, mooring buoy, simulink simulation model, single point mooring, stateflow

Procedia PDF Downloads 406
2173 Normalized Enterprises Architectures: Portugal's Public Procurement System Application

Authors: Tiago Sampaio, André Vasconcelos, Bruno Fragoso

Abstract:

The Normalized Systems Theory, which is designed to be applied to software architectures, provides a set of theorems, elements and rules with the purpose of enabling evolution in information systems, as well as ensuring that they are ready for change. In order to make that possible, this work's solution is to apply the Normalized Systems Theory to the domain of enterprise architectures, using ArchiMate. This application is achieved through the adaptation of the elements of this theory, making them artifacts of the modeling language. The theorems are applied through the identification of the viewpoints to be used in the architectures, as well as the transformation of the theory's encapsulation rules into architectural rules. This way, it is possible to create normalized enterprise architectures, thus fulfilling the needs and requirements of the business. This solution was demonstrated using the Portuguese Public Procurement System. The Portuguese government aims to make this system as fair as possible, allowing every organization to have the same business opportunities. The aim is for every economic operator to have access to all public tenders, which are published in any of the 6 existing platforms, independently of where they are registered. In order to make this possible, we applied our solution to the construction of two different architectures, which are capable of fulfilling the requirements of the Portuguese government. One of those architectures, TO-BE A, has a Message Broker that performs the communication between the platforms. The other, TO-BE B, represents the scenario in which the platforms communicate with each other directly. Apart from these 2 architectures, we also represent the AS-IS architecture that demonstrates the current behavior of the Public Procurement System.
Our evaluation is based on a comparison between the AS-IS and the TO-BE architectures, regarding the fulfillment of the rules and theorems of the Normalized Systems Theory and some quality metrics.

Keywords: archimate, architecture, broker, enterprise, evolvable systems, interoperability, normalized architectures, normalized systems, normalized systems theory, platforms

Procedia PDF Downloads 340
2172 The Role of Long-Chain Ionic Surfactants on Extending Drug Delivery from Contact Lenses

Authors: Cesar Torres, Robert Briber, Nam Sun Wang

Abstract:

Eye drops are the most commonly used treatment for short-term and long-term ophthalmic diseases. However, eye drops could deliver only about 5% of the functional ingredients contained in a burst dosage. To address the limitations of eye drops, the use of therapeutic contact lenses has been introduced. Drug-loaded contact lenses provide drugs a longer residence time in the tear film and hence, decrease the potential risk of side effects. Nevertheless, a major limitation of contact lenses as drug delivery devices is that most of the drug absorbed is released within the first few hours. This fact limits their use for extended release. The present study demonstrates the application of long-alkyl chain ionic surfactants on extending drug release kinetics from commercially available silicone hydrogel contact lenses. In vitro release experiments were carried by immersing drug-containing contact lenses in phosphate buffer saline at physiological pH. The drug concentration as a function of time was monitored using ultraviolet-visible spectroscopy. The results of the study demonstrate that release kinetics is dependent on the ionic surfactant weight percent in the contact lenses, and on the length of the hydrophobic alkyl chain of the ionic surfactants. The use of ionic surfactants in contact lenses can extend the delivery of drugs from a few hours to a few weeks, depending on the physicochemical properties of the drugs. Contact lenses embedded with ionic surfactants could be potential biomaterials to be used for extended drug delivery and in the treatment of ophthalmic diseases. However, ocular irritation and toxicity studies would be needed to evaluate the safety of the approach.
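The release kinetics monitored above can be illustrated with a simple model. A hedged sketch assuming first-order release, M_t/M_inf = 1 - exp(-kt); both the model choice and the rate constants are illustrative, not the study's data, which would be fit from the UV-visible absorbance time series:

```python
import math

# Hedged illustration: first-order drug release, where a smaller rate
# constant k mimics the slowing effect attributed to ionic surfactant.
def fraction_released(k_per_hour, t_hours):
    """Cumulative fraction of drug released after t hours."""
    return 1.0 - math.exp(-k_per_hour * t_hours)

# Smaller k stretches release from hours toward weeks: after one day,
# a fast lens is nearly exhausted while a slow one has barely started.
for k in (0.5, 0.005):  # fast (plain lens) vs slow (surfactant-laden)
    print(round(fraction_released(k, 24.0), 3))
```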

Keywords: contact lenses, drug delivery, controlled release, ionic surfactant

Procedia PDF Downloads 131