Search results for: market size
7512 Two Component Source Apportionment Based on Absorption and Size Distribution Measurement
Authors: Tibor Ajtai, Noémi Utry, Máté Pintér, Gábor Szabó, Zoltán Bozóki
Abstract:
Beyond its climate- and health-related issues, ambient light-absorbing carbonaceous particulate matter (LAC) has recently also attracted great scientific interest in terms of its regulation. It has been experimentally demonstrated in recent studies that LAC is dominantly composed of traffic and wood-burning aerosol, particularly under wintertime urban conditions, when photochemical and biological activities are negligible. Several methods have been introduced to quantitatively apportion the aerosol fractions emitted by wood burning and traffic, but most of them require costly and time-consuming off-line chemical analysis. As opposed to chemical features, the microphysical properties of airborne particles, such as optical absorption and size distribution, can be easily measured on-line, with high accuracy and sensitivity, especially under highly polluted urban conditions. Recently, a new method has been proposed for the apportionment of wood-burning and traffic aerosols based on the spectral dependence of their absorption, quantified by the Aerosol Angström Exponent (AAE). In this approach, the absorption coefficient is deduced from transmission measurements on a filter-accumulated aerosol sample, and the conversion factor between the measured optical absorption and the corresponding mass concentration (the specific absorption cross section) is determined by on-site chemical analysis. Recently developed multi-wavelength photoacoustic instruments provide a novel, in-situ approach to the reliable and quantitative characterization of carbonaceous particulate matter. Therefore, they also open up novel possibilities for source apportionment through the measurement of light absorption.
In this study, we demonstrate an in-situ spectral characterization method of the ambient carbon fraction based on light absorption and size distribution measurements using our state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS) and a Scanning Mobility Particle Sizer (SMPS). The carbonaceous-particulate-selective source apportionment study was performed on ambient particulate matter in the city center of Szeged, Hungary, where the dominance of traffic and wood-burning aerosol had been experimentally demonstrated earlier. The proposed model is based on the parallel, in-situ measurement of optical absorption and size distribution. AAEff and AAEwb were deduced from the measured data using the defined correlation between the AOC(1064nm)/AOC(266nm) and N100/N20 ratios. σff(λ) and σwb(λ) were determined with the help of the independently measured temporal mass concentrations in the PM1 mode. Furthermore, the proposed optical source apportionment is based on the assumption that the light-absorbing fraction of PM is exclusively related to traffic and wood burning. This assumption is indirectly confirmed here by the fact that the measured size distribution is composed of two unimodal size distributions identified as corresponding to traffic and wood-burning aerosols. The method offers the possibility of replacing laborious chemical analysis with simple in-situ measurement of aerosol size distribution data. The results of the proposed novel optical-absorption-based source apportionment method prove its applicability whenever measurements are performed at an urban site where traffic and wood burning are the dominant carbonaceous emission sources.
Keywords: absorption, size distribution, source apportionment, wood burning, traffic aerosol
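The two-component Angström-exponent apportionment described in this abstract reduces to solving a 2x2 linear system for the fossil-fuel and wood-burning absorption contributions. The function below is a minimal sketch of that idea only; the wavelengths, AAE values, and absorption inputs are illustrative assumptions, not the study's data or model.

```python
# Two-component (fossil fuel vs. wood burning) apportionment of absorption
# measured at two wavelengths, following the Angstrom-exponent model:
#   b_abs(l) = b_ff(l) + b_wb(l),   b_x(l2) / b_x(l1) = (l2 / l1) ** (-AAE_x)
# All numeric inputs in the usage example are made-up illustrations.

def apportion(b_l1, b_l2, l1, l2, aae_ff, aae_wb):
    """Return (b_ff, b_wb) at wavelength l1, given absorption at l1 and l2."""
    r_ff = (l2 / l1) ** (-aae_ff)   # fossil-fuel absorption scaling l1 -> l2
    r_wb = (l2 / l1) ** (-aae_wb)   # wood-burning absorption scaling l1 -> l2
    # Solve:  b_l1 = b_ff + b_wb ;  b_l2 = r_ff * b_ff + r_wb * b_wb
    b_wb = (b_l2 - r_ff * b_l1) / (r_wb - r_ff)
    b_ff = b_l1 - b_wb
    return b_ff, b_wb
```

With a hypothetical AAE of 1.0 for traffic and 2.0 for wood burning, measuring total absorption at 470 nm and 950 nm fully determines the two components.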
Procedia PDF Downloads 228
7511 Granule Morphology of Zirconia Powder with Solid Content on Two-Fluid Spray Drying
Authors: Hyeongdo Jeong, Jong Kook Lee
Abstract:
Granule morphology and microstructure are affected by slurry viscosity, chemical composition, particle size, and the spray drying process. In this study, we investigated the granule morphology of zirconia powder as a function of solid content in two-fluid spray drying. Zirconia granules after spray drying showed sphere-like shapes with a diameter of 40-70 μm at low solid contents (30 or 40 wt%) and a specific surface area of 5.1-5.6 m²/g. However, donut-like shapes with a few cracks were observed on zirconia granules prepared from the slurry of high solid content (50 wt%). Green compacts after cold isostatic pressing under a pressure of 200 MPa had a density of 2.1-2.2 g/cm³ and a homogeneous fracture surface due to complete destruction of the granules. After sintering at 1500 °C for 2 h, all specimens had a relative density of 96.2-98.3%. With increasing solid content from 30 to 50 wt%, grain size increased from 0.3 to 0.6 μm, while relative density inversely decreased from 98.3 to 96.2%.
Keywords: zirconia, solid content, granulation, spray drying
Procedia PDF Downloads 216
7510 Evaluating Thailand’s Cosmetic Surgery Tourism by Taiwanese Female Tourists
Authors: Wen-Yu Chen, Chia-Yuan Hsu, Sasinee Vongsrikul
Abstract:
The present study explores the perception of Taiwanese females towards medical tourism in Thailand for the development of an applicable marketing strategy, integrating travel motivation and cosmetic surgery trends to attract potential medical tourists from Taiwan. Since previous studies relevant to this research issue are limited, a qualitative study was first employed, using one focus group interview and in-depth interviews with Taiwanese females. Moreover, the present research collected questionnaires from 290 Taiwanese females to provide a greater understanding of the research results. The top three factors that affect Taiwanese females’ decision not to go to Thailand for medical tourism are “physicians and nurses cannot speak Chinese”, “low quality of the cosmetic surgery product that I want to do”, and “the country does not have laws to protect medical tourists’ rights”. The findings of the empirical part suggest the areas in the medical tourism industry which Thailand should promote and emphasize in order to increase its presence as a hub for cosmetic surgery and attract the Taiwanese female market. Therefore, the study contributes to the potential development of a marketing strategy for medical tourism, specifically in the area of cosmetic surgery in Thailand, while targeting the Taiwan market.
Keywords: Thailand, Taiwanese female tourists, medical tourism, cosmetic surgery
Procedia PDF Downloads 424
7509 Listening to Circles, Playing Lights: A Study of Cross-Modal Perception in Music
Authors: Roni Granot, Erica Polini
Abstract:
Music is often described in terms of non-auditory adjectives, such as a rising melody, a bright sound, or a zigzagged contour. Such cross-modal associations have been studied with simple isolated musical parameters, but only rarely in rich musical contexts. The current study probes cross-sensory associations with polarity-based dimensions by means of pairings of 10 adjectives: blunt-sharp, relaxed-tense, heavy-light, low (in space)-high, low (pitch)-high, big-small, hard-soft, active-passive, bright-dark, sad-happy. 30 participants (randomly assigned to one of two groups) were asked to rate one of 27 short saxophone improvisations on a 1 to 6 scale, where 1 and 6 correspond to the opposite poles of each dimension. The 27 improvisations included three exemplars for each of three dimensions (size, brightness, sharpness), played by three different players. Here we focus on the question of whether scales corresponding with the musical dimension were consistently rated as such (e.g. music improvised to represent a white circle rated as bright, in contrast with music improvised to represent a dark circle rated as dark). Overall, the average scores by dimension showed an upward trend on the equivalent verbal scale, with low ratings for small, bright, and sharp musical improvisations and higher scores for large, dark, and blunt improvisations. Friedman tests indicate a statistically significant difference for the brightness (χ2 (2) = 19.704, p < .001) and sharpness dimensions (χ2 (2) = 15.750, p < .001), but not for size (χ2 (2) = 1.444, p = .486).
Post hoc analysis with Wilcoxon signed-rank tests within the brightness dimension showed significant differences among all possible pairings: between the rankings of 'bright' and 'dark' (Z = -3.310, p = .001), of 'bright' and 'medium' (Z = -2.438, p = .015), and of 'dark' and 'medium' music (Z = -2.714, p = .007). Within the sharpness dimension, only the extreme contrasts differed significantly: between 'sharp' and 'blunt' music (Z = -3.147, p = .002) and between 'sharp' and 'medium' music rated on the sharpness scale (Z = -3.054, p = .002), but not between 'medium' and 'blunt' music (Z = -.982, p = .326). In summary, our study suggests a privileged link between music and the perceptual and semantic domain of brightness. In contrast, size seems to be very difficult to convey in music, whereas sharpness seems to be mapped onto the two extremes (sharp vs. blunt) rather than continuously. This is nicely reflected in the musical literature, in titles and texts which stress the association between music and concepts of light or darkness rather than sharpness or size.
Keywords: audiovisual, brightness, cross-modal perception, cross-sensory correspondences, size, visual angularity
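The Friedman test applied above compares repeated-measures ratings by ranking each participant's ratings across conditions. The function below is a minimal pure-Python sketch of the statistic only; it assumes no tied ratings within a participant's row (real analyses should use a statistics package that handles ties and reports p-values).

```python
# Minimal Friedman chi-square statistic for repeated-measures ratings,
# e.g. comparing ratings of 'bright', 'medium', and 'dark' improvisations.
# Sketch only: ties within a row are not handled.

def friedman_chi2(rows):
    """rows: one list per participant, one rating per condition."""
    k = len(rows[0])              # number of conditions
    n = len(rows)                 # number of participants
    rank_sums = [0.0] * k
    for row in rows:
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank  # rank 1 = lowest rating in this row
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)
```

For three participants who all rank the three conditions identically, the statistic reaches its maximum of n(k-1) = 6 for n = 3, k = 3.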
Procedia PDF Downloads 221
7508 Investigation of the Effect of Grid Size on External Store Separation Trajectory Using CFD
Authors: Alaa A. Osman, Amgad M. Bayoumy Aly, Ismail El baialy, Osama E. Abdellatif, Essam E. Khallil
Abstract:
In this paper, a numerical simulation of a finned store separating from a wing-pylon configuration has been studied and validated. A dynamic unstructured tetrahedral mesh approach is applied, using three grid sizes to numerically solve the discretized three-dimensional, inviscid, and compressible Navier-Stokes equations. The method is used to compute the separation of an external store under the assumption of quasi-steady flow. The quasi-steady flow computations have been directly coupled to a six degree-of-freedom (6DOF) rigid-body motion code to generate store trajectories. The pressure coefficients at four different angular cuts and the time histories of various trajectory parameters during the store separation are compared for every grid size with published experimental data.
Keywords: CFD modelling, transonic store separation, quasi-steady flow, moving-body trajectories
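The quasi-steady coupling described above alternates between freezing the flow, reading off aerodynamic loads, and advancing the rigid body. A toy translational integrator illustrates the pattern; the force model, mass, and step sizes below are placeholders, not the study's 6DOF code, and rotational degrees of freedom are omitted.

```python
# Toy quasi-steady trajectory loop: at each step, a force function (standing in
# for the frozen CFD solution) returns the load on the store, and the state is
# advanced with a semi-implicit Euler step. Illustrative sketch only.

def integrate_trajectory(force_fn, mass, x0, v0, dt, steps):
    """Integrate translational motion; force_fn(x, v) returns a force vector."""
    x, v = list(x0), list(v0)
    traj = [tuple(x)]
    for _ in range(steps):
        f = force_fn(x, v)
        v = [vi + fi / mass * dt for vi, fi in zip(v, f)]   # update velocity
        x = [xi + vi * dt for xi, vi in zip(x, v)]          # then position
        traj.append(tuple(x))
    return traj
```

In the real method, `force_fn` would be replaced by loads extracted from the quasi-steady flow solution at the store's current position and attitude.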
Procedia PDF Downloads 390
7507 A Comparative Study on the Influencing Factors of Urban Residential Land Prices Among Regions
Authors: Guo Bingkun
Abstract:
With the rapid development of China's social economy and the continuous improvement of urbanization, people's living standards have undergone tremendous changes, and more and more people are gathering in cities. The demand for urban residents' housing has been greatly released in the past decade. The demand for housing and the construction land required for urban development have brought huge pressure on urban operations, and land prices have also risen rapidly in the short term. On the other hand, a comparison of the eastern and western parts of China reveals great differences in urban socioeconomics and land prices among the eastern, central, and western regions. Judging from the current overall market development, after more than ten years of housing market reform and development, the quality of housing and the land use efficiency in Chinese cities have been greatly improved. However, the contradiction between the land demand of urban socio-economic development and land supply, especially for urban residential land, has not been effectively alleviated. Since land is closely linked to all aspects of society, changes in land prices are affected by many complex factors. Therefore, this paper studies the factors that may affect urban residential land prices, compares them among eastern, central, and western cities, and identifies the main factors that determine the level of urban residential land prices. The paper provides guidance for urban managers in formulating land policies and alleviating land supply-demand contradictions, offers distinct ideas for improving urban planning, and promotes the improvement of urban management. The research in this paper focuses on residential land prices. Generally, the indicators for measuring land prices mainly include benchmark land prices, land price level values, parcel land prices, etc.
However, considering the requirements of data continuity and representativeness, this paper chooses residential land price level values, which reflect the status of urban residential land prices. First, based on the existing research at home and abroad, the paper considers both land supply and demand and, based on basic theoretical analysis, determines factors that may affect urban residential land prices, such as urban expansion, taxation, land reserves, population, and land benefits, and correspondingly selects representative indicators. Second, using conventional econometric analysis methods, we established a model of the factors affecting urban residential land prices, quantitatively analyzed the relationship between the influencing factors and residential land prices and its intensity, and compared the differences and similarities in the impacts on urban residential land prices between the eastern, central, and western regions. The research results show that the main factors affecting China's urban residential land prices are urban expansion, land use efficiency, taxation, population size, and residents' consumption. The main reasons for the difference in residential land prices between the eastern, central, and western regions are the differences in urban expansion patterns, industrial structures, urban carrying capacity, and real estate development investment.
Keywords: urban housing, urban planning, housing prices, comparative study
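The "conventional econometric analysis" step amounts to fitting a linear model of land price against the selected indicators. Below is a minimal ordinary-least-squares sketch via the normal equations; the data layout and variables are illustrative assumptions, not the paper's indicator set or estimation code.

```python
# Minimal OLS fit of y (e.g. land price level) on k regressors (e.g. urban
# expansion, taxation, ...) by solving the normal equations (X'X) b = X'y
# with Gaussian elimination. Sketch only: no intercept handling, no pivoting.

def ols(X, y):
    """X: list of rows of regressors; y: list of responses. Returns coefficients."""
    k = len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(len(X))) for q in range(k)]
         for p in range(k)]                         # X'X
    b = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(k)]  # X'y
    for p in range(k):                              # forward elimination
        for q in range(p + 1, k):
            f = A[q][p] / A[p][p]
            A[q] = [aq - f * ap for aq, ap in zip(A[q], A[p])]
            b[q] -= f * b[p]
    coef = [0.0] * k
    for p in range(k - 1, -1, -1):                  # back substitution
        coef[p] = (b[p] - sum(A[p][q] * coef[q] for q in range(p + 1, k))) / A[p][p]
    return coef
```

In practice one would add an intercept column, standard errors, and diagnostics; this only shows the core computation behind such a model.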
Procedia PDF Downloads 50
7506 Effect of Electromagnetic Field on Capacitive Deionization Performance
Authors: Alibi Kilybay, Emad Alhseinat, Ibrahim Mustafa, Abdulfahim Arangadi, Pei Shui, Faisal Almarzooqi
Abstract:
In this work, an electromagnetic field has been used to improve the performance of the capacitive deionization (CDI) process. The effect of electromagnetic fields on the efficiency of the CDI process was investigated experimentally. The results showed that treating the feed stream of the CDI process with an electromagnetic field can enhance the electrosorption capacity by 20% up to 70%. The effects of exposure time, concentration, and type of ions were examined. The electromagnetic field enhanced the salt adsorption capacity (SAC) for Ca²⁺ ions by 70%, while the SAC for Na⁺ ions was enhanced by 20%. It is hypothesized that the electromagnetic field affects the hydration shell around the ions and thus reduces their effective size and enhances mass transfer. This reduction in ion effective size and increase in mass transfer enhanced the electrosorption capacity and kinetics of the CDI process.
Keywords: capacitive deionization, desalination, electromagnetic treatment, water treatment
Procedia PDF Downloads 264
7505 Physicochemical and Sensory Properties of Gluten-Free Semolina Produced from Blends of Cassava, Maize and Rice
Authors: Babatunde Stephen Oladeji, Gloria Asuquo Edet
Abstract:
The proximate, functional, pasting, and sensory properties of semolina from blends of cassava, maize, and rice were investigated. Cassava, maize, and rice were milled and sieved to pass through a 1000 µm sieve, then blended in the following ratios to produce five samples: FS₁ (40:30:30), FS₂ (20:50:30), FS₃ (25:25:50), FS₄ (34:33:33), and FS₅ (60:20:20) for cassava, maize, and rice, respectively. A market sample of wheat semolina labeled FSc served as the control. The proximate composition, functional properties, pasting profile, and sensory characteristics of the blends were determined using standard analytical methods. The protein content of the samples ranged from 5.66% to 6.15%, with sample FS₂ having the highest value and being significantly different (p ≤ 0.05). The bulk density of the formulated samples ranged from 0.60 to 0.62 g/ml; the control (FSc) had a higher bulk density of 0.71 g/ml. The water absorption capacity of the formulated and control samples ranged from 0.67% to 2.02%, with FS₃ having the highest value and FSc the lowest (0.67%). The peak viscosity of the samples ranged from 60.83 to 169.42 RVU, and the final viscosity of the semolina samples ranged from 131.17 to 235.42 RVU. FS₅ had the highest overall acceptability score (7.46), with no significant difference (p ≤ 0.05) from the other samples except FS₂ (6.54) and FS₃ (6.29). This study establishes that high-quality, consumer-acceptable semolina comparable to the market sample can be produced from blends of cassava, maize, and rice.
Keywords: semolina, gluten, celiac disease, wheat allergies
Procedia PDF Downloads 103
7504 Continuous Improvement Programme as a Strategy for Technological Innovation in Developing Nations. Nigeria as a Case Study
Authors: Sefiu Adebowale Adewumi
Abstract:
A continuous improvement programme (CIP) adopts an approach that improves organizational performance in small incremental steps over time. In this approach, it is not the size of each step that is important, but the likelihood that the improvements will be ongoing. Many companies in developing nations are now complementing continuous improvement with innovation, which is the successful exploitation of new ideas. The focus areas of CIP in the organizations studied were related to the size of the organizations and to their generic classification. Product quality was prevalent in the manufacturing industry, while manpower training and retraining and marketing strategy were emphasized for improvement in the service, transport, and supply industries. However, a focus on innovation in raw materials, processes, and methods is needed because these are the critical factors that influence product quality in the manufacturing industries.
Keywords: continuous improvement programme, developing countries, generic classifications, technological innovation
Procedia PDF Downloads 189
7503 Critical Success Factors Quality Requirement Change Management
Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan
Abstract:
Managing software quality requirement change is a difficult task in the field of software engineering. Avoiding incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is widely considered a primary cause of software failure, and it becomes more challenging in global software outsourcing. Addressing success factors in quality requirement change management is needed today due to frequent change requests from end-users. In this research study, success factors are recognized and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that Proper Requirement Change Management, Rapid Delivery, Quality Software Product, Access to Market, Project Management, Skills and Methodologies, Low Cost/Effort Estimation, Clear Plan and Road Map, Agile Processes, Low Labor Cost, User Satisfaction, Communication/Close Coordination, Proper Scheduling and Time Constraints, Frequent Technological Changes, Robust Model, and Geographical Distribution/Cultural Differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys, and experiments, and were then scrutinized by continent, database, company size, and time period. Based on these findings, requirement changes can be implemented in a better way.
Keywords: global software development, requirement engineering, systematic literature review, success factors
Procedia PDF Downloads 197
7502 The Dynamics of Algeria’s Natural Gas Exports to Europe: Evidence from ARDL Bounds Testing Approach with Breakpoints
Authors: Hicham Benamirouche, Oum Elkheir Moussi
Abstract:
The purpose of the study is to examine the dynamics of Algeria’s natural gas exports through the Autoregressive Distributed Lag (ARDL) bounds testing approach with breakpoints. The analysis was carried out for the period from 1967 to 2015. Based on an imperfect substitution specification, the ARDL approach reveals a long-run equilibrium relationship between Algeria’s natural gas exports and their determinant factors (Algeria’s gas reserves, domestic gas consumption, Europe’s GDP per capita, relative prices, European gas production, and the market share of competitors). All the estimated long-run elasticities are statistically significant, with a large impact of domestic factors, which constitute the supply constraints. In the short term, the elasticities are statistically significant and almost comparable to those of the long term. Furthermore, the speed of adjustment towards long-run equilibrium is less than one year because of the limited flexibility of the long-term export contracts. Two breakpoints were estimated when domestic gas consumption is employed as a break variable: 1984 and 2010, which reflect the arbitration policy between the domestic gas market and gas exports.
Keywords: natural gas exports, elasticity, ARDL bounds testing, break points, Algeria
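For readers unfamiliar with ARDL mechanics, the long-run elasticities and speed of adjustment reported above are simple functions of the estimated short-run coefficients. The sketch below assumes an ARDL(1,1) specification with a single regressor and made-up coefficients; it is an illustration of the arithmetic, not the study's estimated model.

```python
# For an ARDL(1,1) model in logs,
#   y_t = c + a1*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t,
# the long-run elasticity of y with respect to x is (b0 + b1) / (1 - a1),
# and the error-correction (speed-of-adjustment) coefficient is -(1 - a1).
# Coefficient values in the test are illustrative.

def long_run_elasticity(a1, b0, b1):
    """Long-run elasticity implied by ARDL(1,1) short-run coefficients."""
    return (b0 + b1) / (1.0 - a1)

def speed_of_adjustment(a1):
    """Error-correction coefficient; closer to -1 means faster adjustment."""
    return -(1.0 - a1)
```

A speed-of-adjustment coefficient of -0.5, for instance, would mean half of any disequilibrium is corrected each period, consistent with adjustment completing in under one year for annual data with a larger magnitude.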
Procedia PDF Downloads 199
7501 Strengthening Islamic Banking Customer Behavioral Intention through Value and Commitment
Authors: Mornay Roberts-Lombard
Abstract:
Consumers’ perceptions of value are crucial to ensuring their future commitment and behavioral intentions. As a result, service providers, such as Islamic banks, must provide their customers with products and services that are regarded as valuable, stimulating, collaborative, and competent. Therefore, the value provided to customers must meet or surpass their expectations, which can drive customers’ commitment (affective and calculative) and eventually favorably impact their future behavioral intentions. Consequently, Islamic banks in South Africa, as a growing African market, need to obtain a better understanding of the variables that impact Islamic banking customers’ value perceptions and how these impact their future behavioral intentions. Furthermore, it is necessary to investigate how customers’ perceived value perceptions impact their affective and calculative commitment and how the latter impact their future behavioral intentions. The purpose of this study is to bridge these gaps in knowledge, as the competitiveness of the Islamic banking industry in South Africa requires a deeper understanding of the aforementioned relationships. The study was exploratory and quantitative in nature, and data was collected from 250 Islamic banking customers using self-administered questionnaires. These banking customers resided in the Gauteng province of South Africa. Exploratory factor analysis, Pearson’s coefficient analysis, and multiple regression analysis were applied to measure the proposed hypotheses developed for the study. This research will aid Islamic banks in the country in potentially strengthening customers’ future commitment (affective and calculative) and positively impact their future behavioral intentions. The findings of the study established that service quality has a significant and positive impact on perceived value. 
Moreover, it was determined that perceived value has a favorable and considerable impact on affective and calculative commitment, while calculative commitment has a beneficial impact on behavioral intention. The research informs Islamic banks of the importance of service engagement in driving customer perceived value, which stimulates the future affective and calculative commitment of Islamic bank customers in an emerging market context. Finally, the study proposes guidelines for Islamic banks to develop an enhanced understanding of the factors that impact the perceived value-commitment-behavioral intention link in a competitive Islamic banking market in South Africa.
Keywords: perceived value, affective commitment, calculative commitment, behavioural intention
Procedia PDF Downloads 79
7500 Tumor Size and Lymph Node Metastasis Detection in Colon Cancer Patients Using MR Images
Authors: Mohammadreza Hedyehzadeh, Mahdi Yousefi
Abstract:
Colon cancer is one of the most common cancers, and its prevalence is predicted to increase due to poor eating habits. Nowadays, owing to people's busy lives, the consumption of fast food is increasing; therefore, the diagnosis of this disease and its treatment are of particular importance. To determine the best treatment approach for each specific colon cancer patient, the oncologist needs to know the stage of the tumor. The most common method to determine the tumor stage is the TNM staging system. In this system, M indicates the presence of metastasis, N indicates the extent of spread to the lymph nodes, and T indicates the size of the tumor. It is clear that in order to determine all three of these parameters, an imaging method must be used, and the gold-standard imaging protocols for this purpose are CT and PET/CT. In CT imaging, due to the use of X-rays, the risk of cancer and the absorbed dose of the patient are high, while the PET/CT method suffers from a lack of access to the device due to its high cost. Therefore, in this study, we aimed to estimate the tumor size and the extent of its spread to the lymph nodes using MR images. More than 1300 MR images were collected from the TCIA portal, and in the first step (pre-processing), histogram equalization to improve image quality and resizing to obtain a uniform image size were performed. Two expert radiologists, who have worked on colon cancer cases for more than 21 years, segmented the images and extracted the tumor region. The next step is feature extraction from the segmented images, followed by classification of the data into three classes: T0N0, T3N1, and T3N2. In this article, the VGG-16 convolutional neural network has been used to perform both of the above-mentioned tasks, i.e., feature extraction and classification. This network has 13 convolution layers for feature extraction and three fully connected layers with the softmax activation function for classification.
In order to validate the proposed method, 10-fold cross validation was used in such a way that the data was randomly divided into three parts: training (70% of the data), validation (10% of the data), and the rest for testing. This was repeated 10 times; each time, the accuracy, sensitivity, and specificity of the model were calculated, and the average of the ten repetitions is reported as the result. The accuracy, specificity, and sensitivity of the proposed method on the testing dataset were 89.09%, 95.8%, and 96.4%. Compared to previous studies, the use of a safe imaging technique (MRI) and the non-use of predefined hand-crafted imaging features to determine the stage of colon cancer patients are some of the study's advantages.
Keywords: colon cancer, VGG-16, magnetic resonance imaging, tumor size, lymph node metastasis
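The reported accuracy, sensitivity, and specificity can be computed per fold from a confusion matrix in a one-vs-rest fashion. The sketch below assumes a macro average over classes; the matrix values in the usage example are illustrative counts, not the study's results.

```python
# Macro-averaged accuracy, sensitivity, and specificity from a multi-class
# confusion matrix, treating each class one-vs-rest. Illustrative sketch.

def metrics(cm):
    """cm[i][j]: count of class-i samples predicted as class j."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    acc = sum(cm[i][i] for i in range(n)) / total
    sens, spec = [], []
    for i in range(n):
        tp = cm[i][i]
        fn = sum(cm[i]) - tp                          # class i missed
        fp = sum(cm[j][i] for j in range(n)) - tp     # others called class i
        tn = total - tp - fn - fp
        sens.append(tp / (tp + fn))
        spec.append(tn / (tn + fp))
    return acc, sum(sens) / n, sum(spec) / n
```

Averaging these three quantities over the 10 cross-validation repetitions gives figures of the kind reported above.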
Procedia PDF Downloads 59
7499 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopted the Waterfall Model
Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman
Abstract:
Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under tight budgets and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to designing mid-size enterprise software in a cost-effective way by using the Waterfall model, which is one of the SDLC (Software Development Life Cycle) models. To fulfill the research objectives, in this study, we developed mid-sized enterprise software named “BSK Management System” that assists enterprise software clients with information resource management and performs complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, a Rich Picture, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system features simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.
Keywords: end-user application development, enterprise software design, information resource management, usability
Procedia PDF Downloads 438
7498 Stage-Gate Based Integrated Project Management Methodology for New Product Development
Authors: Mert Kıranç, Ekrem Duman, Murat Özbilen
Abstract:
In order to complete new product development (NPD) activities on time and within budgetary constraints, NPD managers need a well-designed methodology. This study intends to create an integrated project management methodology for those who focus on new product development projects. Within the scope of the study, four different management systems are combined: the 'Schedule-oriented Stage-Gate Method, Risk Management, Change Management and Earned Value Management'. The term new product development is quite common in many different industries, such as the defense industry, construction, health care/dental, higher education, fast-moving consumer goods, white goods, electronic devices, marketing and advertising, and software development. All product manufacturers race against each other to introduce a new product to the market. In order to produce a more competitive product, an optimum project management methodology is chosen, and this methodology is adapted to the company culture. The right methodology helps the company present the perfect product to customers at the right time. The benefits of the proposed methodology are discussed through an application by a company. As a result, how the integrated methodology improves efficiency and how it achieves project success are unfolded.
Keywords: project, project management, management methodology, new product development, risk management, change management, earned value, stage-gate
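Of the four combined systems, earned value management is the most formulaic, reducing to a few standard indicators. The sketch below shows those standard definitions; the figures in the usage example are illustrative and do not come from the study.

```python
# Standard earned value management indicators from planned value (PV),
# earned value (EV), and actual cost (AC). Example figures are illustrative.

def evm(pv, ev, ac):
    """Return the basic EVM schedule and cost indicators."""
    return {
        "SV": ev - pv,    # schedule variance (negative: behind schedule)
        "CV": ev - ac,    # cost variance (negative: over budget)
        "SPI": ev / pv,   # schedule performance index (< 1: behind schedule)
        "CPI": ev / ac,   # cost performance index (< 1: over budget)
    }
```

For example, a project with PV = 100, EV = 80, and AC = 160 is both behind schedule (SPI = 0.8) and well over budget (CPI = 0.5), which is the kind of signal a stage gate review would act on.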
Procedia PDF Downloads 312
7497 Optimization of Reliability Test Plans: Increase Wafer Fabrication Equipments Uptime
Authors: Swajeeth Panchangam, Arun Rajendran, Swarnim Gupta, Ahmed Zeouita
Abstract:
Semiconductor processing chambers tend to operate in controlled but aggressive conditions (chemistry, plasma, high temperature, etc.). Owing to this, the design of such equipment requires developing robust and reliable hardware and software. Any equipment downtime due to reliability issues can have cost implications both for customers, in terms of tool downtime (reduced throughput), and for equipment manufacturers, in terms of high warranty costs and a customer trust deficit. A thorough reliability assessment of critical parts and a plan for preventive maintenance/replacement schedules need to be completed before tool shipment. This helps to save significant warranty costs and tool downtime in the field. However, designing a proper reliability test plan that accurately demonstrates reliability targets with a proper sample size and test duration is quite challenging. This is mainly because components can fail in different failure modes that follow distributions with different Weibull beta (shape) values. Without an a priori Weibull beta for the failure mode under consideration, the plan always leads to over- or under-utilization of resources, which eventually ends up in false-positive or false-negative estimates. This paper proposes a methodology to design a reliability test plan with optimal sample size, test duration, or both (independent of the a priori Weibull beta). This methodology can be used in demonstration tests and can be extended to accelerated life tests to further decrease the sample size or test duration.
Keywords: reliability, stochastics, preventive maintenance
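A common starting point for such demonstration test plans is the classical zero-failure (success-run) sizing formula, extended for Weibull-distributed lifetimes. The sketch below shows that classical formula only, not the paper's proposed methodology, and illustrates why the result depends on the assumed Weibull beta.

```python
# Zero-failure demonstration test sizing: to demonstrate reliability R at
# confidence CL, testing each unit for `life_ratio` target-lifetimes with
# Weibull shape `beta`, the classical success-run formula gives
#   n = ln(1 - CL) / (life_ratio**beta * ln(R))
# Numbers in the examples are illustrative.
import math

def demo_sample_size(reliability, confidence, life_ratio=1.0, beta=1.0):
    """Units needed for a zero-failure reliability demonstration test."""
    n = math.log(1.0 - confidence) / (life_ratio ** beta * math.log(reliability))
    return math.ceil(n)
```

For example, demonstrating 90% reliability at 90% confidence with one target lifetime per unit requires 22 units, while extending each test to two lifetimes with an assumed beta of 2 cuts that to 6; a wrong beta assumption therefore over- or under-sizes the test, which is the problem the abstract describes.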
Procedia PDF Downloads 15
7496 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression
Authors: Anne M. Denton, Rahul Gomes, David W. Franzen
Abstract:
High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic to the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. 
The relevant length scale is taken to be half the window size over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within each region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts the slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than with existing techniques.
Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression
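A one-dimensional toy version of the additive-sums idea can make the mechanism concrete. The sketch below is our simplification, not the authors' 2-D implementation: per-window sufficient statistics are merged by simple addition at each doubling, a line (rather than a plane) is fitted, and the scale minimizing the squared standard error of the slope is reported. The synthetic profile is a noisy line of slope 0.5, so every scale shares the same landform and the largest window wins:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024
x = np.arange(N, dtype=float)
z = 0.5 * x + rng.normal(0.0, 2.0, N)   # synthetic profile: slope 0.5 plus noise

def init_sums(x, z, w=4):
    """Sufficient statistics per window at the finest level (size-4 windows)."""
    xs, zs = x.reshape(-1, w), z.reshape(-1, w)
    return {"n": np.full(len(xs), float(w)),
            "Sx": xs.sum(1), "Sz": zs.sum(1),
            "Sxx": (xs ** 2).sum(1), "Sxz": (xs * zs).sum(1),
            "Szz": (zs ** 2).sum(1)}

def aggregate(s):
    """Merge adjacent windows; every sufficient statistic simply adds."""
    return {k: v[0::2] + v[1::2] for k, v in s.items()}

def slope_and_se2(s):
    n, Sx, Sz = s["n"], s["Sx"], s["Sz"]
    sxx = s["Sxx"] - Sx ** 2 / n
    sxz = s["Sxz"] - Sx * Sz / n
    szz = s["Szz"] - Sz ** 2 / n
    b = sxz / sxx
    resid = (szz - b * sxz) / (n - 2)   # unbiased residual variance
    return b, resid / sxx               # slope and its squared standard error

s, best = init_sums(x, z), None
while len(s["n"]) >= 2:
    b, v = slope_and_se2(s)
    i = len(s["n"]) // 2                # the window covering the profile midpoint
    if best is None or v[i] < best[0]:
        best = (v[i], b[i], int(s["n"][i]))
    s = aggregate(s)

se2_min, slope, scale = best
print(round(slope, 3), scale)
```

Each doubling is a single pass over the previous level's sums, matching the logarithmic cost described above; on real DEMs with mixed landforms, different locations would settle on different scales.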
Procedia PDF Downloads 129
7495 Investigation on the Effect of Titanium (Ti) Plus Boron (B) Addition to the Mg-AZ31 Alloy in the as Cast and After Extrusion on Its Metallurgical and Mechanical Characteristics
Authors: Adnan I. O. Zaid, Raghad S. Hemeimat
Abstract:
Magnesium-aluminum alloys are versatile materials used in manufacturing a number of engineering and industrial parts in the automobile and aircraft industries due to their strength-to-weight ratio. Against these preferable characteristics, magnesium is difficult to deform at room temperature; therefore, it is alloyed with other elements, mainly aluminum and zinc, to add required properties, particularly a high strength-to-weight ratio. Mg and its alloys oxidize rapidly, so care should be taken during melting or machining, but they are not a fire hazard. Grain refinement is an important technology for improving the mechanical properties and the microstructural uniformity of alloys. Grain refinement was introduced in the early fifties, when Cibula showed that the presence of Ti, and of Ti+B, produced a great refining effect in Al; since then, it has become industrial practice to grain refine Al. Most of the published work on grain refinement has been directed toward grain refining Al and Zn alloys; however, the effect of the addition of rare earth materials on the grain size or the mechanical behavior of Mg alloys has not been previously investigated. This forms the main objective of this research work, in which the effect of Ti addition on the grain size, mechanical behavior, ductility, and the extrusion force and energy consumed in forward extrusion of the Mg-AZ31 alloy is investigated and discussed in two conditions: first in the as-cast condition, and second after extrusion. It was found that the addition of Ti to the Mg-AZ31 alloy reduced its grain size by 14%; the reduction in grain size after extrusion was much higher. The increase in Vickers hardness was 3% after the addition of Ti in the as-cast condition, and higher hardness values were achieved after extrusion.
Furthermore, an increase in the strength coefficient by 36% was achieved with the addition of Ti to the Mg-AZ31 alloy in the as-cast condition. Similarly, the work hardening index also increased, indicating an enhancement of ductility and formability. As for the extrusion process, it was found that the force and energy required for extrusion were reduced by 57% and 59%, respectively, with the addition of Ti.
Keywords: cast condition, direct extrusion, ductility, MgAZ31 alloy, super-plasticity
Procedia PDF Downloads 454
7494 Implicit Transaction Costs and the Fundamental Theorems of Asset Pricing
Authors: Erindi Allaj
Abstract:
This paper studies arbitrage pricing theory in financial markets with transaction costs. We extend the existing theory to include the more realistic possibility that the price at which investors trade depends on the traded volume. Investors in the market always buy at the ask and sell at the bid price. Transaction costs are composed of two terms: one captures the implicit transaction costs and the other the price impact. Moreover, a new definition of a self-financing portfolio is obtained. The self-financing condition suggests that continuous trading is possible, but is restricted to predictable trading strategies that have left and right limits and finite quadratic variation. That is, predictable trading strategies of infinite variation but finite quadratic variation are allowed in our setting. Within this framework, the existence of an equivalent probability measure is equivalent to the absence of arbitrage opportunities, so that the first fundamental theorem of asset pricing (FFTAP) holds. It is also proved that, when this probability measure is unique, any contingent claim in the market is hedgeable in an L2-sense. The price of any contingent claim is equal to the risk-neutral price. To better illustrate how to apply the proposed theory, we provide an example with linear transaction costs.
Keywords: arbitrage pricing theory, transaction costs, fundamental theorems of arbitrage, financial markets
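A toy numerical illustration of volume-dependent execution prices (not the paper's formal model; the half-spread and linear impact coefficient below are invented) shows how the two cost terms combine:

```python
def execution_price(mid, half_spread, impact, volume, side):
    """Per-unit price when trading `volume` units.

    Buys fill at the ask and sells at the bid (the implicit transaction
    cost), while the linear term impact * volume is a stand-in for
    volume-dependent price impact.
    """
    sign = 1 if side == "buy" else -1
    return mid + sign * (half_spread + impact * volume)

mid, hs, k, vol = 100.0, 0.05, 0.001, 1000
buy = execution_price(mid, hs, k, vol, "buy")
sell = execution_price(mid, hs, k, vol, "sell")
round_trip_cost = (buy - sell) * vol   # cost of immediately buying then selling
print(buy, sell, round_trip_cost)
```

The round trip loses money even with an unchanged mid price, which is why self-financing and arbitrage arguments must be restated once such frictions are included.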
Procedia PDF Downloads 361
7493 Basal Cell Carcinoma Excision Intraoperative Frozen Section for Tumor Clearance and Reconstructive Surgery: A Prospective Open Label Interventional Study
Authors: Moizza Tahir, Uzma Bashir, Aisha Akhtar, Zainab Ansari, Sameen Ansari, Muhammad Ali Tahir
Abstract:
The cancer burden has increased globally. Among cutaneous cancers, basal cell carcinoma (BCC) constitutes the vast majority of skin cancers, and there is a need to evaluate appropriate diagnostic, therapeutic, and prognostic approaches for them. The present study reports intraoperative frozen section (FS) histopathological clearance for excision of BCC in a tertiary care center and determines the frequency of surgical margin involvement with reference to anatomical site, size, and surgical technique. It was a prospective open label interventional study conducted at the dermatology department of a tertiary care hospital in Rawalpindi, Pakistan, in liaison with the histopathology department, from January 2023 to April 2024. A total of thirty-six (n = 36) patients between 45 and 80 years of age with basal cell carcinoma of 10-20 mm on the face were included, following inclusion and exclusion criteria, by purposive sampling. Informed consent was taken. Surgical excision was performed, and intraoperative frozen section histopathological clearance of the tumor margin was obtained from the histopathologist by telephone. Surgical reconstruction was then done. The final histopathology report was reexamined on day 10 for margin and depth clearance. Descriptive statistics were calculated for age, gender, sun exposure, reconstructive technique, anatomical site, and tumor-free margin on frozen section analysis. The chi-square test was employed to assess the statistical significance of surgical margin involvement with reference to anatomical site, size, and the decision on reconstructive surgical technique; a p value of <0.05 was considered significant. Of the 36 patients with BCC enrolled, 12 (33.3%) were male and 24 (66.6%) were female. Age ranged from 45 to 80 years, with a mean of 58.36 ± SD 7.8. The size of the BCC ranged from 10 mm to 35 mm, with a mean of 25 mm ± SD 0.63. Morphology was nodular in 18 (50%), superficial spreading in 11 (30.6%), morphoeic in 1 (2.8%), and ulcerative in 6 (16.7%) cases.
Intraoperative frozen section histopathological margin clearance with a 2-3 mm safety margin yielded a p-value of 0.51 for surgical technique, 0.24 for anatomical site, and 0.84 for size. Thus, intraoperative frozen section (FS) histopathological clearance for BCC of the face with a 2-3 mm safety margin was not significantly associated with reconstructive technique, anatomical site, or size of the BCC.
Keywords: basal cell carcinoma, tumor free margin, basal cell carcinoma and frozen section, safety margin
Procedia PDF Downloads 55
7492 Effect of Particle Size and Concentration of Pomegranate (Punica granatum l.) Peel Powder on Suppression of Oxidation of Edible Plant Oils
Authors: D. G. D. C. L. Munasinghe, M. S. Gunawardana, P. H. P. Prasanna, C. S. Ranadheera, T. Madhujith
Abstract:
Lipid oxidation is an important process that affects the shelf life of edible oils. Oxidation produces off-flavors, off-odors, and chemical compounds that lead to adverse health effects. Chemical mechanisms such as autoxidation, photo-oxidation, and thermal oxidation are responsible for lipid oxidation. Refined, bleached, and deodorized (RBD) coconut oil, virgin coconut oil (VCO), and corn oil are widely used plant oils. The pomegranate fruit is known to possess high antioxidative efficacy, and its peel has higher antioxidant activity than the aril and pulp membrane. This study attempted to examine the effect of particle size and concentration of pomegranate peel powder on the suppression of oxidation of RBD coconut oil, VCO, and corn oil. Pomegranate peel powder was incorporated into each oil sample as micro particles (< 250 µm) and nano particles (280-300 nm) at 100 ppm and 200 ppm concentrations. A control sample of each oil was prepared, devoid of pomegranate peel powder. The stability of the oils against autoxidation was evaluated by storing the samples at 60 °C for 28 days. The level of oxidation was assessed by peroxide value and thiobarbituric acid reactive substances on days 0, 1, 3, 5, 7, 14, and 28. VCO containing pomegranate particles of 280-300 nm at 200 ppm showed the highest oxidative stability, followed by RBD coconut oil and corn oil. The results revealed that pomegranate peel powder with a 280-300 nm particle size at 200 ppm concentration was best at mitigating oxidation of RBD coconut oil, VCO, and corn oil. There is huge potential for utilizing pomegranate peel powder as an antioxidant agent to reduce oxidation of edible plant oils.
Keywords: antioxidant, autoxidation, micro particles, nano particles, pomegranate peel powder
Procedia PDF Downloads 453
7491 The Influence of Ecologically-Valid High- and Low-Volume Resistance Training on Muscle Strength and Size in Trained Men
Authors: Jason Dellatolla, Scott Thomas
Abstract:
Much of the current literature pertaining to resistance training (RT) volume prescription lacks ecological validity, and very few studies investigate true high-volume ranges. Purpose: The present study sought to investigate the effects of ecologically-valid high- vs. low-volume RT on muscular size and strength in trained men. Methods: This study systematically randomized trained, college-aged men into two groups: low-volume (LV; n = 4) and high-volume (HV; n = 5). The sample size was affected by COVID-19 limitations. Subjects followed an ecologically-valid 6-week RT program targeting both muscle size and strength. RT occurred 3x/week on non-consecutive days. Over the course of six weeks, the LV and HV groups gradually progressed from 15 to 23 sets/week and from 30 to 46 sets/week of lower-body RT, respectively. Muscle strength was assessed via 3RM tests in the squat, stiff-leg deadlift (SL DL), and leg press. Muscle hypertrophy was evaluated through a combination of DXA, BodPod, and ultrasound (US) measurements. Results: Two-way repeated-measures ANOVAs indicated that strength in all 3 compound lifts increased significantly in both groups (p < 0.01); between-group differences occurred only in the squat (p = 0.02) and SL DL (p = 0.03), both of which favored the HV group. Significant pre-to-post-study increases in indicators of hypertrophy were found for lean body mass in the legs via DXA, overall fat-free mass via BodPod, and US measures of muscle thickness (MT) for the rectus femoris, vastus intermedius, vastus medialis, vastus lateralis, long head of the biceps femoris, and total MT. Between-group differences were found only for MT of the vastus medialis, favoring the HV group. Moreover, each additional weekly set of lower-body RT was associated with an average increase in MT of 0.39% in the thigh muscles. Conclusion: We conclude that ecologically-valid RT regimens significantly improve muscular strength and indicators of hypertrophy. When high-volume RT is compared to low-volume RT, it provides significantly greater gains in muscular strength but has no greater effect on hypertrophy over the course of 6 weeks in trained, college-aged men.
Keywords: ecological validity, hypertrophy, resistance training, strength
Procedia PDF Downloads 114
7490 Investigating the Minimum RVE Size to Simulate Poly (Propylene carbonate) Composites Reinforced with Cellulose Nanocrystals as a Bio-Nanocomposite
Authors: Hamed Nazeri, Pierre Mertiny, Yongsheng Ma, Kajsa Duke
Abstract:
The background of the present study is the use of environment-friendly biopolymer and biocomposite materials. Among the recently introduced biopolymers, poly (propylene carbonate) (PPC) has been gaining attention. This study focuses on the size of representative volume elements (RVE) in order to simulate PPC composites reinforced by cellulose nanocrystals (CNCs) as a bio-nanocomposite. Before manufacturing nanocomposites, numerical modeling should be implemented to explore and predict mechanical properties, which may be accomplished by creating and studying a suitable RVE. In other studies, modeling of composites with rod-shaped fillers has been reported assuming that the fillers are unidirectionally aligned. However, modeling of non-aligned filler dispersions is considerably more difficult. This study investigates the minimum RVE size to enable subsequent FEA modeling. The matrix and nano-fillers were modeled using the finite element software ABAQUS, assuming randomly dispersed fillers with a filler mass fraction of 1.5%. To simulate filler dispersion, a Monte Carlo technique was employed. The numerical simulation was implemented to find composite elastic moduli. After commencing the simulation with a single filler particle, the number of particles was increased to assess the minimum number of filler particles that satisfies the requirements for an RVE, providing the composite elastic modulus in a reliable fashion.
Keywords: biocomposite, Monte Carlo method, nanocomposite, representative volume element
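One common way to realize the Monte Carlo dispersion step is random sequential addition of non-overlapping inclusions. The sketch below is a simplified stand-in, using spherical fillers in a unit cube rather than the rod-shaped CNCs of the study, with invented radius and count:

```python
import random

def place_fillers(n_target, radius, box=1.0, max_tries=100_000, seed=1):
    """Random sequential addition of non-overlapping spheres in a cubic RVE."""
    rng = random.Random(seed)
    centers, tries = [], 0
    while len(centers) < n_target and tries < max_tries:
        tries += 1
        # draw a candidate center that keeps the whole sphere inside the box
        c = [radius + rng.random() * (box - 2 * radius) for _ in range(3)]
        if all(sum((a - b) ** 2 for a, b in zip(c, p)) >= (2 * radius) ** 2
               for p in centers):
            centers.append(c)           # accept: no overlap with placed spheres
    return centers

centers = place_fillers(n_target=50, radius=0.05)
vf = len(centers) * (4 / 3) * 3.141592653589793 * 0.05 ** 3  # volume fraction
print(len(centers), round(vf, 4))
```

Increasing `n_target` while re-solving the finite element model at each step mirrors the convergence study the abstract describes: the RVE is large enough once the computed elastic modulus stops changing with particle count.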
Procedia PDF Downloads 443
7489 Scientific and Regulatory Challenges of Advanced Therapy Medicinal Products
Authors: Alaa Abdellatif, Gabrièle Breda
Abstract:
Background. Advanced therapy medicinal products (ATMPs) are innovative therapies that mainly target orphan diseases and high unmet medical needs. ATMPs include gene therapy medicinal products (GTMP), somatic cell therapy medicinal products (CTMP), and tissue-engineered products (TEP). Since legislation opened the way in 2007, 25 ATMPs have been approved in the EU, about the same number as approved by the U.S. Food and Drug Administration. However, not all of the ATMPs that have been approved have successfully reached the market and retained their approval. Objectives. We aim to understand, in a systemic approach, all the factors limiting market access for these very promising therapies, so as to be able to overcome these problems in the future with scientific, regulatory, and commercial innovations. Further to recent reviews that focus on specific countries, products, or dimensions, we address all the challenges faced by ATMP development today. Methodology. We used mixed methods and a multi-level approach for data collection. First, we performed an updated academic literature review on ATMP development and its scientific and market access challenges (papers published between 2018 and April 2023). Second, we analyzed industry feedback from cell and gene therapy webinars and white papers published by providers and pharmaceutical industries. Finally, we established a comparative analysis of the regulatory guidelines published by the EMA and the FDA for ATMP approval. Results: A main challenge in bringing these therapies to market is the high development cost; developing ATMPs is expensive due to the need for specialized manufacturing processes. Furthermore, the regulatory pathways for ATMPs are often complex and can vary between countries, making it challenging to obtain approval and ensure compliance with different regulations.
As a result of the high costs associated with ATMPs, challenges in obtaining reimbursement from healthcare payers lead to limited patient access to these treatments. ATMPs are often developed for orphan diseases, which means that the patient population available for clinical trials is limited, making it challenging to demonstrate safety and efficacy. In addition, the complex manufacturing processes required for ATMPs can make it challenging to scale up production to meet demand, which can limit availability and increase costs. Finally, ATMPs face safety and efficacy challenges: dangerous adverse events of these therapies, such as toxicity related to the use of viral vectors or cell therapy, as well as starting-material and donor-related aspects. Conclusion. As a result of our mixed-method analysis, we found that ATMPs face a number of challenges in their development, regulatory approval, and commercialization, and that addressing these challenges requires collaboration between industry, regulators, healthcare providers, and patient groups. This first analysis will help us to address, for each challenge, proper and innovative solutions in order to increase the number of ATMPs approved and reaching patients.
Keywords: advanced therapy medicinal products (ATMPs), product development, market access, innovation
Procedia PDF Downloads 76
7488 Transportation and Urban Land-Use System for the Sustainability of Cities, a Case Study of Muscat
Authors: Bader Eddin Al Asali, N. Srinivasa Reddy
Abstract:
Cities are dynamic in nature and are characterized by concentration of people, infrastructure, services and markets, which offer opportunities for production and consumption. Often growth and development in urban areas is not systematic, and is directed by number of factors like natural growth, land prices, housing availability, job locations-the central business district (CBD’s), transportation routes, distribution of resources, geographical boundaries, administrative policies, etc. One sided spatial and geographical development in cities leads to the unequal spatial distribution of population and jobs, resulting in high transportation activity. City development can be measured by the parameters such as urban size, urban form, urban shape, and urban structure. Urban Size is the city size and defined by the population of the city, and urban form is the location and size of the economic activity (CBD) over the geographical space. Urban shape is the geometrical shape of the city over which the distribution of population and economic activity occupied. And Urban Structure is the transport network within which the population and activity centers are connected by hierarchy of roads. Among the urban land-use systems transportation plays significant role and is one of the largest energy consuming sector. Transportation interaction among the land uses is measured in Passenger-Km and mean trip length, and is often used as a proxy for measurement of energy consumption in transportation sector. Among the trips generated in cities, work trips constitute more than 70 percent. Work trips are originated from the place of residence and destination to the place of employment. To understand the role of urban parameters on transportation interaction, theoretical cities of different size and urban specifications are generated through building block exercise using a specially developed interactive C++ programme and land use transportation modeling is carried. 
The land-use transportation modeling exercise helps in understanding the role of urban parameters and also in classifying cities by their urban form, structure, and shape. Muscat, the capital city of Oman, which underwent rapid urbanization over the last four decades, is taken as a case study for this classification. A pilot survey was also carried out to capture urban travel characteristics. Analysis of the land-use transportation modeling together with field data classified Muscat as a linear city with a polycentric CBD. Conclusions are drawn and suggestions are given for policy making for the sustainability of Muscat City.
Keywords: land-use transportation, transportation modeling, urban form, urban structure, urban rule parameters
Procedia PDF Downloads 270
7487 Investigation of the Operational Principle and Flow Analysis of a Newly Developed Dry Separator
Authors: Sung Uk Park, Young Su Kang, Sangmo Kang, Young Kweon Suh
Abstract:
Industries dealing with mineral products, waste concrete (fine aggregates), optical-field waste, and construction employ separators to separate solids and classify them according to their size. Various sorting machines are used in industry, operating on electrical properties, centrifugal force, wind power, vibration, or magnetic force. This study on separators was carried out to contribute to the environmental industry. Here, we perform CFD analysis to understand the basic mechanism of the separation of waste concrete (fine aggregate) particles from air in a machine built around a rotor with blades. In the CFD work, we first performed two-dimensional particle tracking for various particle sizes for models with 1 degree, 1.5 degrees, and 2 degrees of angle between blades, to verify the boundary conditions and the rotating-domain method to be used in 3D. We then developed a 3D numerical model with ANSYS CFX to calculate the air flow and track the particles. We judged the capability of particle separation for a given size by counting the number of particles escaping from the domain toward the exit among 10 particles issued at the inlet. We confirm that particles exhibit stagnant behavior near the exit of the rotating blades, where the centrifugal force acting on the particles is in balance with the air drag force. It was also found that the minimum particle size that can be separated by the machine is determined by the particles' capability to stay at the outlet of the rotor channels.
Keywords: environmental industry, separator, CFD, fine aggregate
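The balance described in the final sentences, centrifugal force against air drag, fixes a cut size that a back-of-the-envelope Stokes-drag estimate can illustrate. The operating values below are hypothetical, not taken from the study:

```python
import math

def min_separable_diameter(mu, v_air, rho_p, omega, r):
    """Diameter at which centrifugal force balances Stokes drag.

    (pi/6) d^3 rho_p omega^2 r = 3 pi mu d v_air
    =>  d = sqrt(18 mu v_air / (rho_p omega^2 r))
    Particles larger than d are flung outward; smaller ones follow the air.
    """
    return math.sqrt(18 * mu * v_air / (rho_p * omega ** 2 * r))

d = min_separable_diameter(
    mu=1.8e-5,                        # dynamic viscosity of air, Pa*s
    v_air=2.0,                        # radial air speed through blades, m/s
    rho_p=2650.0,                     # fine-aggregate particle density, kg/m^3
    omega=2 * math.pi * 1500 / 60,    # rotor speed, 1500 rpm in rad/s
    r=0.15,                           # blade radius, m
)
print(f"cut size = {d * 1e6:.1f} um")
```

Raising the rotor speed or radius shrinks the cut size, which is consistent with the stagnant-particle behavior the simulations report at the blade exit.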
Procedia PDF Downloads 595
7486 Borrowing Performance: A Network Connectivity Analysis of Second-Tier Cities in Turkey
Authors: Eğinç Simay Ertürk, Ferhan Gezi̇ci̇
Abstract:
The decline of large cities and the rise of second-tier cities have been observed as a global trend with significant implications for economic development and urban planning. In this context, the concepts of agglomeration shadow and borrowed size have gained importance as network externalities that affect the growth and development of surrounding areas. Istanbul, Izmir, and Ankara are Turkey's most significant metropolitan cities and play a major role in the country's economy. The surrounding cities rely on these metropolitan cities for economic growth and development. However, the concentration of resources and investment in a single location can cast agglomeration shadows over the surrounding areas. On the other hand, network connectivity between metropolitan and second-tier cities can result in borrowed function and performance, enabling smaller cities to access resources, investment, and knowledge they would not otherwise have access to. The study hypothesizes that the network connectivity between second-tier and metropolitan cities in Turkey enables second-tier cities to increase their urban performance by borrowing size through these networks. Regression analysis will be used to identify the specific network connectivity parameters most strongly associated with urban performance. Network connectivity will be measured with parameters such as transportation nodes and telecommunications infrastructure, and urban performance will be measured with an index including parameters such as employment, education, and industry entrepreneurship, with data at the province level. The contribution of the study lies in its research on how networking can benefit second-tier cities in Turkey.
Keywords: network connectivity, borrowed size, agglomeration shadow, secondary cities
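A minimal sketch of how such an urban performance index could be assembled: standardize each indicator across cities and average the z-scores. The cities and figures below are invented for illustration, not data from the study:

```python
# Hypothetical indicator values per city: employment rate, education score,
# and entrepreneurship rate (units are illustrative only).
cities = {
    "City A": [62.0, 71.0, 14.0],
    "City B": [55.0, 64.0, 9.0],
    "City C": [70.0, 78.0, 21.0],
}

names = list(cities)
cols = list(zip(*cities.values()))      # one tuple per indicator

def zscores(col):
    m = sum(col) / len(col)
    sd = (sum((v - m) ** 2 for v in col) / len(col)) ** 0.5
    return [(v - m) / sd for v in col]

z = [zscores(c) for c in cols]          # standardize each indicator
index = {name: sum(z[k][i] for k in range(len(z))) / len(z)
         for i, name in enumerate(names)}
ranked = sorted(index, key=index.get, reverse=True)
print(ranked)
```

Standardizing first keeps any single indicator's units from dominating; the regression step described above would then relate this index to the connectivity measures.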
Procedia PDF Downloads 81
7485 Modelling Volatility of Cryptocurrencies: Evidence from GARCH Family of Models with Skewed Error Innovation Distributions
Authors: Timothy Kayode Samson, Adedoyin Isola Lawal
Abstract:
The past five years have shown a sharp increase in public interest in the crypto market, with its market capitalization growing from $100 billion in June 2017 to $2158.42 billion on April 5, 2022. Despite the outrageous volatility of cryptocurrencies, the use of skewed error innovation distributions in modelling the volatility behaviour of these digital currencies has not been given much research attention. Hence, this study models the volatility of the 5 largest cryptocurrencies by market capitalization (Bitcoin, Ethereum, Tether, Binance coin, and USD Coin) using four variants of GARCH models (GJR-GARCH, sGARCH, EGARCH, and APARCH) estimated with three skewed error innovation distributions (skewed normal, skewed Student-t, and skewed generalized error distributions). Daily closing prices of these currencies were obtained from the Yahoo Finance website. Findings reveal that the Binance coin reported higher mean returns than the other digital currencies, while the skewness indicates that the Binance coin, Tether, and USD coin increased more than they decreased in value within the period of study. For both Bitcoin and Ethereum, negative skewness was obtained, meaning that within the period of study, the returns of these currencies decreased more than they increased in value. Returns from these cryptocurrencies were found to be stationary but not normally distributed, with evidence of the ARCH effect. The skewness parameters in all best forecasting models were significant (p<.05), justifying the use of skewed error innovation distributions with fatter tails than the normal, Student-t, and generalized error innovation distributions. For the Binance coin, EGARCH-sstd outperformed the other volatility models, while for Bitcoin, Ethereum, Tether, and USD coin, the best forecasting models were EGARCH-sstd, APARCH-sstd, EGARCH-sged, and GJR-GARCH-sstd, respectively.
This suggests the superiority of the skewed Student-t distribution and the skewed generalized error distribution over the skewed normal distribution.
Keywords: skewed generalized error distribution, skewed normal distribution, skewed Student-t distribution, APARCH, EGARCH, sGARCH, GJR-GARCH
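Fitting EGARCH or APARCH models with skewed innovations requires a dedicated econometrics package, but the ARCH effect the authors test for can be illustrated with a plain GARCH(1,1) simulation. This is a minimal numpy sketch with invented parameters, not a re-analysis of the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_garch11(n, omega=0.05, alpha=0.10, beta=0.85):
    """Simulate r_t = sigma_t * z_t with
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    r = np.empty(n)
    var = omega / (1 - alpha - beta)     # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * r[t] ** 2 + beta * var
    return r

def autocorr(x, lag=1):
    x = x - x.mean()
    return float((x[:-lag] * x[lag:]).sum() / (x * x).sum())

r = simulate_garch11(20_000)
# Volatility clustering: returns nearly uncorrelated, squared returns are not.
print(round(autocorr(r), 3), round(autocorr(r ** 2), 3))
```

The clearly positive autocorrelation of squared returns is the ARCH effect; GARCH-family models capture it by letting today's variance feed on yesterday's shock and variance.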
Procedia PDF Downloads 119
7484 Formulation and In Vivo Evaluation of Salmeterol Xinafoate Loaded MDI for Asthma Using Response Surface Methodology
Authors: Paresh Patel, Priya Patel, Vaidehi Sorathiya, Navin Sheth
Abstract:
The aim of the present work was to fabricate a Salmeterol Xinafoate (SX) metered dose inhaler (MDI) for asthma and to evaluate SX-loaded solid lipid nanoparticles (SLNs) for pulmonary delivery. Solid lipid nanoparticles can be used to deliver particles to the lungs via MDI. A modified solvent emulsification diffusion technique was used to prepare the SX-loaded solid lipid nanoparticles, using Compritol 888 ATO as the lipid, Tween 80 as the surfactant, D-mannitol as the cryoprotecting agent, and L-leucine to improve aerosolization behaviour. A Box-Behnken design was applied with 17 runs. 3-D response surface plots and contour plots were drawn, and the optimized formulation was selected based on minimum particle size and maximum % EE. The formulations were also characterized by % yield, in vitro diffusion study, scanning electron microscopy, X-ray diffraction, DSC, and FTIR. Particle size and zeta potential were analyzed by a Zetatrac particle size analyzer, and aerodynamic properties were assessed with a cascade impactor. Pre-convulsion time was examined for the control and treatment groups and compared with the marketed product group. The MDI was evaluated by leakage test, flammability test, spray test, and content per puff. By experimental design, particle size and % EE were found to be in the ranges of 119-337 nm and 62.04-76.77%, respectively, with the solvent emulsification diffusion technique. Morphologically, the particles had a spherical shape and uniform distribution. DSC and FTIR studies showed no interaction between drug and excipients. The zeta potential showed good stability of the SLNs. The % respirable fraction was found to be 52.78%, indicating delivery to the deep parts of the lung such as the alveoli. The animal study showed that the fabricated MDI protected the lungs against histamine-induced bronchospasm in guinea pigs. The MDI showed sphericity of particles in the spray pattern, 96.34% content per puff, and was non-flammable. SLNs prepared by the solvent emulsification diffusion technique provide a desirable size for deposition into the alveoli.
This delivery platform opens up a wide range of treatment applications for pulmonary diseases like asthma via solid lipid nanoparticles.
Keywords: salmeterol xinafoate, solid lipid nanoparticles, box-behnken design, solvent emulsification diffusion technique, pulmonary delivery
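The 17-run layout follows directly from the Box-Behnken construction for three factors: every pair of factors takes its 2^2 factorial at coded levels +/-1 with the remaining factor at 0, plus replicated center points. A sketch in coded units (the factor names are our assumption, not stated in the abstract):

```python
from itertools import combinations, product

def box_behnken(k, n_center):
    """Coded Box-Behnken design: 2^2 factorial on each factor pair, other
    factors held at the 0 level, plus n_center replicated center points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * k for _ in range(n_center)]
    return runs

# 3 hypothetical factors, e.g. lipid amount, surfactant %, sonication time:
design = box_behnken(k=3, n_center=5)
print(len(design))   # 3 pairs x 4 corner runs + 5 center points = 17 runs
```

Each row is then mapped from coded levels to real factor ranges before the particle-size and % EE responses are fitted for the response surface.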
Procedia PDF Downloads 451
7483 Nonstationary Increments and Causality in the Aluminum Market
Authors: Andrew Clark
Abstract:
McCauley, Bassler, and Gunaratne show that the integrated I(d) processes used in economics and finance do not necessarily produce stationary increments, which are required to determine causality in both the short term and the long term. This paper follows their lead and shows that I(d) aluminum cash and futures log prices at daily and weekly intervals do not have stationary increments, which means prior causality studies using I(d) processes need to be re-examined. Wavelets based on undifferenced cash and futures log prices do have stationary increments and are used along with transfer entropy (versus cointegration) to measure causality. Wavelets exhibit causality at most daily time scales out to 1 year, and at weekly time scales out to 1 year and beyond. To determine stationarity, localized stationary wavelets (LSWs) are used. LSWs have the benefit, versus other means of testing for stationarity, of using multiple hypothesis tests to determine stationarity. As informational flows exist between cash and futures at daily and weekly intervals, the aluminum market is efficient. Therefore, producers and consumers of aluminum who hedge need not have a big concern about the underestimation of hedge ratios. Questions about arbitrage given efficiency are addressed in the paper.
Keywords: transfer entropy, nonstationary increments, wavelets, localized stationary wavelets
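Transfer entropy itself can be estimated, at its simplest, by discretizing the two series and counting joint symbol frequencies. The sketch below is a minimal histogram estimator on synthetic data, not the authors' wavelet-based pipeline; the coupling strength and series are invented:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Histogram estimate (bits) of T(X -> Y) = I(y_{t+1}; x_t | y_t),
    after binarizing each series at its median."""
    xd = (x > np.median(x)).astype(int)
    yd = (y > np.median(y)).astype(int)
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))          # (y_t, x_t)
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))           # (y_{t+1}, y_t)
    singles = Counter(yd[:-1])                         # y_t
    n = len(yd) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        te += (c / n) * np.log2(c * singles[y0]
                                / (pairs_yx[(y0, x0)] * pairs_yy[(y1, y0)]))
    return float(te)

rng = np.random.default_rng(7)
n = 5000
x = rng.standard_normal(n)
y = np.empty(n)
y[0] = rng.standard_normal()
y[1:] = 0.9 * x[:-1] + 0.3 * rng.standard_normal(n - 1)  # y driven by lagged x

print(round(transfer_entropy(x, y), 3), round(transfer_entropy(y, x), 3))
```

The information flow from x to y dwarfs the reverse direction, which is the asymmetry transfer entropy exploits; unlike linear cointegration-based causality tests, nothing here assumes a particular functional form for the dependence.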
Procedia PDF Downloads 202