Search results for: multivariate time series data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37861

30361 Bio-Furan Based Poly (β-Thioether Ester) Synthesized via Thiol-Michael Addition Polymerization with Tunable Structure and Properties

Authors: Daihui Zhang, Marie J. Dumont

Abstract:

A derivative of 5-hydroxymethylfurfural (HMF) was synthesized for the thiol-Michael addition reaction. The efficiency of the catalysts (base and nucleophiles) and the side reactions during the thiol-Michael addition were investigated. Dimethylphenylphosphine efficiently initiated the thiol-Michael addition polymerization for synthesizing a series of bio-based furan polymers with different structures and properties. The benzene rings or hydroxyl groups present in the polymer chains increased the glass transition temperature (Tg) of poly(β-thioether ester). Additionally, copolymers with various compositions were obtained by adding different ratios of 1,6-hexanedithiol to 1,4-benzenedithiol. 1H NMR analysis revealed that the experimental ratios of the two dithiol monomers matched well with the theoretical ratios. The occurrence of a reversible Diels-Alder reaction between furan rings and maleimide groups allowed poly(β-thioether ester) to be dynamically crosslinked. These polymers offer the potential to produce materials from biomass that have both practical mechanical properties and reprocessing ability.

Keywords: copolymers, Diels-Alder reaction, hydroxymethylfurfural, thiol-Michael addition

Procedia PDF Downloads 312
30360 The Role of Neuroserpin in Rheumatoid Arthritis Patients

Authors: Sevil Arabaci Tamer, Gonul Gurol, Ibrahim Tekeoglu, Halil Harman, Ihsan Hakki Ciftci

Abstract:

Neuroserpin (NSP) is a serine protease inhibitor and a member of the serpin family. It is expressed in the developing and adult nervous systems, and acts as an inhibitor of the protease tissue plasminogen activator (tPA) and a regulator of neuronal growth and plasticity. NSP also displays anti-inflammatory activity, but its role in rheumatoid arthritis (RA) had never been studied before. The aim of the present study was therefore to investigate the effect of neuroserpin in patients with RA. A total of 50 frozen (-20 °C) serum samples, 40 from patients with RA and 10 from healthy subjects, were enrolled prospectively. We used DAS-28 to evaluate disease activity. Clinical data were gathered from the patients' original charts. Serum neuroserpin levels were measured by enzyme-linked immunosorbent assay. Our preliminary results demonstrate, for the first time, that NSP levels differ significantly between RA patients and healthy subjects (P = 0.014), suggesting that NSP contributes to the pathological condition of RA. We therefore believe that serum NSP levels could serve as a marker in patients with RA, although other inflammatory diseases should be investigated further.

Keywords: neuroserpin, rheumatoid arthritis, tPA, tPA inhibitor

Procedia PDF Downloads 456
30359 An Evaluation of a First Year Introductory Statistics Course at a University in Jamaica

Authors: Ayesha M. Facey

Abstract:

The evaluation sought to determine the factors associated with the high failure rate among students taking a first-year introductory statistics course. By utilizing Tyler's Objective-Based Model, the main objectives were: to assess the effectiveness of the lecturer's teaching strategies; to determine the proportion of students who attend lectures and tutorials frequently and the impact of infrequent attendance on performance; to determine how the assigned activities assisted students' understanding of the course content; to ascertain the issues faced by students in understanding the course material and obtain possible solutions to those challenges; and to determine whether the learning outcomes had been achieved based on an assessment of the second in-course examination. A quantitative survey research strategy was employed, and the study population was students enrolled in semester one of the academic year 2015/2016. A convenience sampling approach was employed, resulting in a sample of 98 students. Primary data were collected using self-administered questionnaires over a one-week period. Secondary data were obtained from the results of the second in-course examination. Data were entered and analyzed in SPSS version 22, and both univariate and bivariate analyses were conducted on the information obtained from the questionnaires. Univariate analyses provided a description of the sample through means, standard deviations and percentages, while bivariate analyses were done using Spearman's rho correlation coefficient and chi-square analyses. For the secondary data, an item analysis was performed to obtain the reliability of the examination questions, the difficulty index and the discrimination index. The examination results also provided information on the students' weak areas and highlighted the learning outcomes that were not achieved.
Findings revealed that students were more likely to participate in lectures than tutorials and that attendance was high for both. There was a significant relationship between participation in lectures and performance on the examination. However, a high proportion of students had been absent from three or more tutorials as well as lectures. A higher proportion of students indicated that they only sometimes completed the assignments given in lectures, while they rarely completed tutorial worksheets. Students who completed their assignments were significantly more likely to perform well on the examination. Additionally, students faced a number of challenges in understanding the course content, and the topics of probability, the binomial distribution and the normal distribution were the most challenging. The item analysis also highlighted these topics as problem areas. Difficulties with mathematics, application and analysis were the major challenges faced by students, and most indicated that these could be alleviated if additional examples were worked in lectures and more time were given to solve questions. Analysis of the examination results showed that a number of learning outcomes were not achieved for several topics. Based on the findings, recommendations were made to adjust grade allocations, lecture delivery and methods of assessment.
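As background for the item-analysis terms above, a minimal sketch (with made-up scores, not the study's data) of how the difficulty index and discrimination index of one exam item are typically computed:

```python
# Illustrative classical item analysis: item scores are 1 (correct) / 0
# (incorrect) per student, paired with each student's total exam score.

def item_analysis(item_scores, total_scores, group_frac=0.27):
    """Return (difficulty, discrimination) for one exam item."""
    n = len(item_scores)
    # Difficulty index: proportion of all students answering correctly.
    difficulty = sum(item_scores) / n
    # Rank students by total score; compare top and bottom groups.
    order = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
    k = max(1, int(n * group_frac))
    upper = sum(item_scores[i] for i in order[:k]) / k
    lower = sum(item_scores[i] for i in order[-k:]) / k
    discrimination = upper - lower
    return difficulty, discrimination

item  = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]          # hypothetical item scores
total = [90, 85, 40, 80, 35, 30, 75, 45, 70, 25] # hypothetical exam totals
diff, disc = item_analysis(item, total)
```

A high discrimination value indicates the item separates strong from weak students; a very low difficulty index flags the "problem areas" the abstract refers to.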

Keywords: evaluation, item analysis, Tyler’s objective based model, university statistics

Procedia PDF Downloads 175
30358 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance

Authors: Rajinder Singh, Ram Valluru

Abstract:

The Chain Ladder (CL), Expected Loss Ratio (ELR) and Bornhuetter-Ferguson (BF) methods, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium- to longer-term liabilities. The relative strengths and weaknesses of the various approaches revolve around: stability in the recent loss development pattern, sufficiency and reliability of loss development data, and agreement or disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good tradeoff between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience. Further, BF relies on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period).
The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend, but levels varied substantially depending upon the economic and operational conditions during the development period spanning over many years. The proposed approach provides the ability to indirectly incorporate such exogenous factors and produce more stable loss forecasts for reserving purposes as compared to the traditional CL and BF methods.
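One way to read the proposed parametrization, sketched on synthetic data: fit a sigmoidal (logistic) curve to cumulative loss development and take its asymptote as the ultimate loss estimate. The function form and parameter names below are illustrative assumptions, not the paper's specification:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sketch: cumulative losses follow a logistic curve of the
# development period; the fitted asymptote is the ultimate loss estimate.

def logistic(t, ultimate, k, t0):
    return ultimate / (1.0 + np.exp(-k * (t - t0)))

t = np.arange(1, 13)                     # development periods (e.g. quarters)
observed = logistic(t, 100.0, 0.8, 5.0)  # synthetic cumulative losses

params, _ = curve_fit(logistic, t, observed, p0=[80.0, 0.5, 4.0])
ultimate_est = params[0]                 # projected ultimate loss
```

On real data, cohorts under different economic conditions would shift the fitted level and slope, which is how exogenous factors enter indirectly.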

Keywords: actuarial loss reserving techniques, logistic regression, parametric function, volatility

Procedia PDF Downloads 107
30357 Radiology Information System’s Mechanisms: HL7-MHS & HL7/DICOM Translation

Authors: Kulwinder Singh Mann

Abstract:

The Radiology Information System (RIS), an information system for electronic medical records, has had a positive impact in hospitals. Its objective is to make work easier, for example by allowing a physician to access a patient's data and a patient to check their bill transparently. The interoperability of the RIS with the other intra-hospital information systems it interacts with, addressing compatibility and open-architecture issues, is accomplished by two novel mechanisms. The first is a dedicated message handling system applied to the exchange of information according to the Health Level Seven (HL7) protocol specifications; it serves the transfer of medical and administrative data among the RIS applications and the data store unit. The second implements the translation of information between the formats specified by the HL7 and Digital Imaging and Communications in Medicine (DICOM) protocols, providing communication between the RIS and the Picture Archiving and Communication System (PACS), which supports the increasing incorporation of modern medical imaging equipment.
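As an illustration of the kind of message handling the first mechanism performs, here is a minimal parse of an HL7 v2 pipe-delimited message into segments and fields. This is a toy sketch; production RIS integrations use full HL7 engines:

```python
# Minimal, illustrative HL7 v2 parsing: segments are separated by carriage
# returns, fields within a segment by the '|' delimiter.

def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [field lists]}."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

msg = ("MSH|^~\\&|RIS|HOSP|PACS|HOSP|202401011200||ORM^O01|123|P|2.3\r"
       "PID|1||555^^^HOSP||DOE^JOHN\r"
       "OBR|1||789|71020^CHEST XRAY")

parsed = parse_hl7(msg)
patient_name = parsed["PID"][0][5]   # field 5 of the PID segment
```

A translation layer like the abstract's second mechanism would then map fields such as the PID patient name into the corresponding DICOM attributes.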

Keywords: RIS, PACS, HIS, HL7, DICOM, messaging service, interoperability, digital images

Procedia PDF Downloads 286
30356 Participation Motivation and Financing Approach of Small and Medium Enterprises in Mergers and Acquisitions in Vietnam from the Viewpoint of Intermediaries

Authors: Nguyen Thi Hoang Hieu

Abstract:

Mergers and Acquisitions (M&A) activities have become increasingly popular in both developed and developing countries, and M&A is an attractive topic for researchers in sectors such as business, economics and finance. However, the activities of Small and Medium Enterprises (SMEs) in M&A have long been insufficiently studied to provide a complete picture of what is really happening, particularly in a developing market like Vietnam. This study employs semi-structured in-depth interviews with experts who have worked for years in the M&A sector to explore the participation motivation of both the buy side and the sell side of M&A activities. In addition, through the interviews, the study attempts to explain how firms finance their M&A deals. The collected data are then content-analyzed against the study's expectations based on the reviewed theory and practice. Limitations and recommendations are also given in the hope that M&A performance in Vietnam can be improved in the future.

Keywords: mergers and acquisitions, Vietnam, small and medium enterprises, content analysis, semi-structured in-depth interview

Procedia PDF Downloads 245
30355 Bulk Viscous Bianchi Type V Cosmological Model with Time Dependent Gravitational Constant and Cosmological Constant in General Relativity

Authors: Reena Behal, D. P. Shukla

Abstract:

In this paper, we investigate a bulk viscous Bianchi type V cosmological model with time-dependent gravitational constant and cosmological constant in General Relativity by assuming ξ(t) = ξ₀pᵐ, where ξ₀ and m are constants. We also assume a variation law for the Hubble parameter, H(R) = a(R⁻ⁿ + 1), where a > 0 and n > 1 are constants. Two universe models were obtained, and their physical behavior is discussed. When n = 1, the universe starts from a singular state, whereas when n = 0, the cosmology follows a non-singular state. The presence of bulk viscosity increases the value of the matter density.
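The assumed variation law can be integrated in closed form, which makes the singular versus non-singular behavior explicit. This is a sketch under the stated law only, not the paper's full field-equation solution:

$$H = \frac{\dot R}{R} = a\left(R^{-n} + 1\right) \;\Rightarrow\; \dot R = a\left(R^{\,1-n} + R\right).$$

Substituting $u = R^{n}$ gives $\dot u = n a (1+u)$, so

$$R(t) = \left[\left(1 + R_0^{\,n}\right)e^{nat} - 1\right]^{1/n} \qquad (n > 0),$$

which vanishes at a finite past time, i.e. a singular origin (for $n = 1$, $R(t) = (1+R_0)e^{at} - 1$). For $n = 0$, $H = 2a$ is constant and $R(t) = R_0 e^{2at}$ never vanishes, giving the non-singular case.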

Keywords: bulk viscous Bianchi type V cosmological model, Hubble parameter, gravitational constant, cosmological constant

Procedia PDF Downloads 156
30354 Application of Discrete-Event Simulation in Health Technology Assessment: A Cost-Effectiveness Analysis of Alzheimer’s Disease Treatment Using Real-World Evidence in Thailand

Authors: Khachen Kongpakwattana, Nathorn Chaiyakunapruk

Abstract:

Background: Decision-analytic models for Alzheimer's disease (AD) have advanced to discrete-event simulation (DES), in which individual-level modelling of disease progression across continuous severity spectra, and the incorporation of key parameters such as treatment persistence, become feasible. This study aimed to apply DES to perform a cost-effectiveness analysis of treatment for AD in Thailand. Methods: A dataset of Thai patients with AD, representing unique demographic and clinical characteristics, was bootstrapped to generate a baseline cohort of patients. Each patient was cloned and assigned to donepezil, galantamine, rivastigmine, memantine or no treatment. Throughout the simulation period, the model randomly assigned each patient to discrete events including hospital visits, treatment discontinuation and death. Correlated changes in cognitive and behavioral status over time were developed using patient-level data. Treatment effects were obtained from the most recent network meta-analysis. Treatment persistence, mortality and predictive equations for functional status, costs (Thai baht (THB) in 2017) and quality-adjusted life years (QALYs) were derived from country-specific real-world data. The time horizon was 10 years, with a discount rate of 3% per annum. Cost-effectiveness was evaluated against the willingness-to-pay (WTP) threshold of 160,000 THB/QALY gained (4,994 US$/QALY gained) in Thailand. Results: Under a societal perspective, only the prescription of donepezil to AD patients across all disease-severity levels was found to be cost-effective. Compared to untreated patients, patients receiving donepezil incurred discounted additional costs of 2,161 THB but experienced a discounted gain of 0.021 QALYs, resulting in an incremental cost-effectiveness ratio (ICER) of 138,524 THB/QALY (4,062 US$/QALY).
Moreover, providing early treatment with donepezil to mild AD patients further reduced the ICER to 61,652 THB/QALY (1,808 US$/QALY). However, the dominance of donepezil appeared to wane when delayed treatment was given to a subgroup of moderate and severe AD patients [ICER: 284,388 THB/QALY (8,340 US$/QALY)]. Introducing a treatment stopping rule for a mild AD cohort when the Mini-Mental State Exam (MMSE) score falls below 10 did not deteriorate the cost-effectiveness of donepezil at the current treatment persistence level. On the other hand, none of the AD medications was cost-effective when considered under a healthcare perspective. Conclusions: DES greatly enhances the real-world representativeness of decision-analytic models for AD. Under a societal perspective, treatment with donepezil improves patients' quality of life and is considered cost-effective when used to treat AD patients across all disease-severity levels in Thailand. The optimal treatment benefits are observed when donepezil is prescribed from the early course of AD. Given healthcare budget constraints in Thailand, implementing donepezil coverage may be most feasible when starting with mild AD patients, along with the introduced stopping rule.
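The decision rule behind these results is the standard ICER comparison against the WTP threshold. A minimal sketch using the increments quoted above (the published ICER differs slightly from the raw quotient because the quoted increments are rounded):

```python
# ICER = incremental cost / incremental QALYs; a strategy is deemed
# cost-effective when the ICER falls below the willingness-to-pay threshold.

def icer(delta_cost, delta_qaly):
    return delta_cost / delta_qaly

WTP_THB = 160_000                                 # Thai threshold, THB/QALY
ratio = icer(delta_cost=2_161, delta_qaly=0.021)  # increments from the abstract
cost_effective = ratio < WTP_THB
```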

Keywords: Alzheimer's disease, cost-effectiveness analysis, discrete event simulation, health technology assessment

Procedia PDF Downloads 110
30353 Synthesis of Novel Nanostructure Copper(II) Metal-Organic Complex for Photocatalytic Degradation of Remdesivir Antiviral COVID-19 from Aqueous Solution: Adsorption Kinetic and Thermodynamic Studies

Authors: Sam Bahreini, Payam Hayati

Abstract:

The metal-organic coordination compound [Cu(L)₄(SCN)₂] was synthesized using ultrasonic irradiation, and its photocatalytic performance for the degradation of Remdesivir (RS) under sunlight irradiation was systematically explored for the first time in this study. The physicochemical properties of the synthesized photocatalyst were investigated using Fourier-transform infrared spectroscopy (FT-IR), field emission scanning electron microscopy (FE-SEM), powder X-ray diffraction (PXRD), energy-dispersive X-ray analysis (EDX), thermal gravimetric analysis (TGA) and diffuse reflectance spectroscopy (DRS). Systematic examinations were carried out by changing the irradiation time, temperature, solution pH value, contact time, RS concentration and catalyst dosage. The photodegradation kinetic profiles were fitted with pseudo-first-order, pseudo-second-order and intraparticle diffusion models; the results showed that photodegradation on the [Cu(L)₄(SCN)₂] catalyst follows a pseudo-first-order kinetic model. The bandgap of the fabricated [Cu(L)₄(SCN)₂] nanostructure was determined to be 2.60 eV using the Kubelka-Munk formula applied to the diffuse reflectance spectra. The decrease in chemical oxygen demand (COD) (from 70.5 mg L⁻¹ to 36.4 mg L⁻¹) under optimal conditions confirmed mineralization of the RS drug. The values of ΔH° and ΔS° were negative, implying that the adsorption process is spontaneous and more favorable at lower temperatures.
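A quick synthetic-data sketch of the pseudo-first-order fit mentioned above: if ln(C₀/Cₜ) is linear in time t, its slope is the apparent rate constant k. The values below are illustrative, not the study's measurements:

```python
import numpy as np

# Pseudo-first-order model: ln(C0 / Ct) = k * t, so a linear regression of
# ln(C0/Ct) against t recovers the apparent rate constant as the slope.

t = np.array([0.0, 10, 20, 30, 40, 50])   # irradiation time, min (assumed)
k_true = 0.05                              # assumed rate constant, 1/min
C0 = 20.0                                  # assumed initial RS conc., mg/L
Ct = C0 * np.exp(-k_true * t)              # concentrations over time

slope, intercept = np.polyfit(t, np.log(C0 / Ct), 1)
```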

Keywords: photocatalytic degradation, COVID-19, density functional theory (DFT), molecular electrostatic potential (MEP)

Procedia PDF Downloads 149
30352 IT Workforce Enablement: How Cloud Computing Changes the Competence Mix of the IT Workforce

Authors: Dominik Krimpmann

Abstract:

Cloud computing has provided the impetus for change in the demand, sourcing and consumption of IT-enabled services. The technology developed from an emerging trend into a 'must-have'. Many organizations harnessed the quick wins of cloud computing within the last five years but have now reached a plateau when it comes to sustainable savings and performance. This study aims to investigate what is needed, from an organizational perspective, to make cloud computing a sustainable success. The study was carried out in Germany among senior IT professionals, both in management and delivery positions. Our research shows that IT executives must be prepared to realign their IT workforce to sustain the advantage of cloud computing today and in the near future. While new roles will undoubtedly emerge, roles alone cannot ensure the success of cloud deployments. What is needed is a change in the IT workforce's business behaviour, or put more simply, the ways in which IT personnel work. The study gives clear guidance on which dimensions of employees' working behaviour need to be adapted. The practical implications are drawn from a series of semi-structured interviews, resulting in a high-level workforce enablement plan. Lastly, it elaborates on tools and gives clear guidance on the pitfalls that might arise along the proposed workforce enablement process.

Keywords: cloud computing, organization design, organizational change, workforce enablement

Procedia PDF Downloads 294
30351 Impact of Fermentation Time and Microbial Source on Physicochemical Properties, Total Phenols and Antioxidant Activity of Finger Millet Malt Beverage

Authors: Henry O. Udeh, Kwaku G. Duodu, Afam I. O. Jideani

Abstract:

Finger millet (FM) [Eleusine coracana] is considered a potential ''super grain'' by the United States National Academies and one of the most nutritious of all the major cereals. The regular consumption of FM-based diets has been associated with a reduced risk of diabetes, cataract and gastrointestinal tract disorders. Hypoglycaemic, hypocholesterolaemic, anticataractogenic and other health-improvement properties have been reported. This study examined the effect of fermentation time and microbial source on the physicochemical properties, phenolic compounds and antioxidant activity of two finger millet malt flours. Sorghum was used as an external reference. The grains were malted, mashed and fermented using the grain microflora and Lactobacillus fermentum. The phenolic compounds of the resulting beverage were identified and quantified using ultra-performance liquid chromatography (UPLC) coupled with mass spectrometry (MS). A fermentation-time-dependent decrease in the pH and viscosity of the beverages, with a corresponding increase in sugar content, was noted. The phenolic compounds found in the FM beverages were protocatechuic acid, catechin and epicatechin. A decrease in the total phenolics of the beverages was observed with increased fermentation time. The beverages exhibited 2,2-diphenyl-1-picrylhydrazyl and 2,2′-azinobis-3-ethylbenzthiazoline-6-sulfonic acid radical scavenging action and iron-reducing activity, which were significantly (p < 0.05) reduced at 96 h of fermentation for both microbial sources. The 24 h fermented beverages retained a higher amount of total phenolics and had higher antioxidant activity compared to the other fermentation periods. The study demonstrates that FM could be utilised as a functional grain in the production of a non-alcoholic beverage with important phenolic compounds for health promotion and wellness.

Keywords: antioxidant activity, Eleusine coracana, fermentation, phenolic compounds

Procedia PDF Downloads 94
30350 Image Rotation Using an Augmented 2-Step Shear Transform

Authors: Hee-Choul Kwon, Heeyong Kwon

Abstract:

Image rotation is one of the main pre-processing steps in image processing and image pattern recognition. It is conventionally implemented as a rotation matrix multiplication, which requires many floating-point arithmetic operations and trigonometric calculations and therefore takes a long time to execute. There has thus been a need for a high-speed image rotation algorithm that avoids these two major time-consuming operations. However, such fast rotation approaches have a drawback: the rotated image suffers distortions. We solved this problem using an augmented two-step shear transform. We compare the presented algorithm with conventional rotation on images of various sizes. Experimental results show that the presented algorithm is superior to conventional rotation.
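For background, a 2-D rotation can be decomposed into shear matrices; the classic identity uses three shears (the paper's augmented two-step variant modifies this idea, which the abstract does not detail). The decomposition can be verified numerically:

```python
import numpy as np

# Classic three-shear decomposition of a 2-D rotation:
#   R(theta) = Sx(-tan(theta/2)) @ Sy(sin(theta)) @ Sx(-tan(theta/2))
# Shears move whole rows/columns, so image rotation built on shears avoids
# per-pixel trigonometry inside the resampling loop.

theta = np.deg2rad(30)
t, s = np.tan(theta / 2), np.sin(theta)

shear_x = np.array([[1.0, -t], [0.0, 1.0]])   # horizontal shear
shear_y = np.array([[1.0, 0.0], [s, 1.0]])    # vertical shear

rotation = shear_x @ shear_y @ shear_x        # compose the three shears
expected = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
```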

Keywords: high-speed rotation operation, image rotation, transform matrix, image processing, pattern recognition

Procedia PDF Downloads 260
30349 Response Surface Methodology to Supercritical Carbon Dioxide Extraction of Microalgal Lipids

Authors: Yen-Hui Chen, Terry Walker

Abstract:

As the world experiences an energy crisis, investing in sustainable energy resources is a pressing mission for many countries. Microalgae-derived biodiesel has attracted intensive attention as an important biofuel, and the lipid of the microalga Chlorella protothecoides is recognized as a renewable source for microalgae-derived biodiesel production. Supercritical carbon dioxide (SC-CO₂) is a promising green solvent that may potentially substitute for organic solvents in lipid extraction; however, the efficiency of SC-CO₂ extraction may be affected by many variables, including temperature, pressure and extraction time, individually or in combination. In this study, response surface methodology (RSM) was used to optimize these process parameters for C. protothecoides lipid yield by SC-CO₂ extraction. A second-order polynomial model provided a good fit (R-squared value of 0.94) for the lipid yield. The linear and quadratic terms of temperature, pressure and extraction time, as well as the interaction between temperature and pressure, showed significant effects on lipid yield during extraction. The optimal conditions predicted from the model were a temperature of 59 °C, a pressure of 350.7 bar and an extraction time of 2.8 hours. Under these conditions, the experimental lipid yield (25%) was close to the predicted value. The principal fatty acid methyl esters (FAMEs) of the C. protothecoides lipid-derived biodiesel were oleic acid methyl ester (60.1%), linoleic acid methyl ester (18.6%) and palmitic acid methyl ester (11.4%), which made up more than 90% of the total FAMEs. In summary, this study indicated that RSM was useful for optimizing the SC-CO₂ extraction of C. protothecoides lipid, and the second-order polynomial model predicted and described the lipid yield very well. In addition, C. protothecoides lipid extracted by SC-CO₂ is suggested as a potential candidate for microalgae-derived biodiesel production.
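The second-order polynomial fit at the heart of RSM can be sketched on synthetic data; the coded factors and coefficient values below are illustrative assumptions, not the study's:

```python
import numpy as np

# RSM sketch: fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to coded factors (e.g. x1 = temperature, x2 = pressure) by least squares.

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)     # coded temperature levels (synthetic)
x2 = rng.uniform(-1, 1, 30)     # coded pressure levels (synthetic)
true_b = np.array([25.0, 2.0, 1.5, -3.0, -2.0, 0.8])   # assumed coefficients

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
y = X @ true_b                  # noiseless synthetic lipid yield (%)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Setting the gradient of the fitted quadratic to zero then locates the predicted optimum, which is how point estimates like 59 °C and 350.7 bar arise.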

Keywords: Chlorella protothecoides, microalgal lipids, response surface methodology, supercritical carbon dioxide extraction

Procedia PDF Downloads 427
30348 The Effects and Interactions of Synthesis Parameters on Properties of Mg Substituted Hydroxyapatite

Authors: S. Sharma, U. Batra, S. Kapoor, A. Dua

Abstract:

In this study, the effects and interactions of reaction time and capping agent assistance during the sol-gel synthesis of magnesium-substituted hydroxyapatite nanopowder (MgHA) on the hydroxyapatite (HA) to β-tricalcium phosphate (β-TCP) ratio, the Ca/P ratio and the mean crystallite size were examined experimentally as well as through statistical analysis. MgHA nanopowders were synthesized by the sol-gel technique at room temperature using aqueous solutions of calcium nitrate tetrahydrate, magnesium nitrate hexahydrate and potassium dihydrogen phosphate as starting materials. The reaction time for the sol-gel synthesis was varied between 15 and 60 minutes. Two process routes were followed, with and without the addition of triethanolamine (TEA) to the solutions. The elemental compositions of the as-synthesized powders were determined using X-ray fluorescence (XRF) spectroscopy. The functional groups present in the as-synthesized MgHA nanopowders were established through Fourier transform infrared spectroscopy (FTIR). The amounts of phases present, the Ca/P ratio and the mean crystallite sizes of the MgHA nanopowders were determined using X-ray diffraction (XRD). The HA content in the biphasic mixture of HA and β-TCP and the Ca/P ratio in the as-synthesized MgHA nanopowders increased effectively with the reaction time of the sols (p < 0.0001, two-way ANOVA); however, these were independent of TEA addition (p > 0.15, two-way ANOVA). The MgHA nanopowders synthesized with TEA assistance exhibited a 14 nm lower crystallite size (p < 0.018, two-sample t-test) compared to the powder synthesized without TEA assistance.
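An illustrative version of the crystallite-size comparison (synthetic values, not the study's measurements), using Welch's two-sample t-test of the kind cited above:

```python
import numpy as np
from scipy import stats

# Hypothetical crystallite sizes (nm) with vs. without TEA assistance;
# Welch's t-test (equal_var=False) does not assume equal group variances.

with_tea    = np.array([28.0, 30.5, 27.2, 29.8, 31.0])
without_tea = np.array([42.1, 44.0, 43.5, 41.8, 45.2])

t_stat, p_value = stats.ttest_ind(with_tea, without_tea, equal_var=False)
significant = p_value < 0.05
```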

Keywords: capping agent, hydroxyapatite, regression analysis, sol-gel, 2-sample t-test, two-way analysis of variance (ANOVA)

Procedia PDF Downloads 351
30347 Study of Land Use Changes around an Archaeological Site Using Satellite Imagery Analysis: A Case Study of Hathnora, Madhya Pradesh, India

Authors: Pranita Shivankar, Arun Suryawanshi, Prabodhachandra Deshmukh, S. V. C. Kameswara Rao

Abstract:

Many undesirable and significant changes in landscapes and in regions in the vicinity of historically important structures occur as impacts of anthropogenic activities over a period of time. A better understanding of such influences, using recently developed satellite remote sensing techniques, helps in planning strategies for minimizing the negative impacts on the existing environment. In 1982, a fossilized hominid skull cap was discovered at a site located along the northern bank of the east-west flowing river Narmada in the village of Hathnora. Close to the same site, Late Acheulian and Middle Palaeolithic tools have been discovered in the immediately overlying pebbly gravel, suggesting that the 'Narmada skull' may be from the Middle Pleistocene age. Reviews of recent research relevant to hominid remains from Late Acheulian and Middle Palaeolithic sites all over the world suggest succession and contemporaneity of cultures there, enhancing the importance of Hathnora as a rare, precious site. In this context, maximum likelihood classification using digital interpretation techniques was carried out for this study area using satellite imagery from Landsat ETM+ for the year 2006 and Landsat 8 (OLI and TIRS) for the year 2016. The overall accuracy of the Land Use Land Cover (LULC) classification of the 2016 imagery was around 77.27% based on ground truth data. The significant reduction in the main river course and in agricultural activities, and the increase in the built-up area observed in the remote sensing data analysis, are undoubtedly the outcome of human encroachment in the vicinity of this eminent heritage site.
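The overall accuracy figure quoted above is computed from a confusion matrix of ground-truth versus predicted classes. A sketch with synthetic counts, chosen only so the result lands near the reported ~77.27%:

```python
import numpy as np

# Rows: ground-truth LULC classes; columns: predicted classes. Overall
# accuracy is the fraction of correctly classified samples (the diagonal).

confusion = np.array([[30,  3,  2],
                      [ 4, 25,  3],
                      [ 2,  6, 13]])

overall_accuracy = np.trace(confusion) / confusion.sum()
```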

Keywords: cultural succession, digital interpretation, Hathnora, Homo sapiens, Late Acheulian, Middle Palaeolithic

Procedia PDF Downloads 153
30346 Cigarette Smoke Detection Based on YOLOV3

Authors: Wei Li, Tuo Yang

Abstract:

In order to satisfy the real-time and accuracy requirements of cigarette smoke detection in complex scenes, a cigarette smoke detection technique based on the combination of deep learning and color features is proposed. Firstly, based on the color features of cigarette smoke, the suspicious cigarette smoke areas in the image are extracted. Secondly, considering the efficiency of cigarette smoke detection and the problem of network overfitting, a network model for cigarette smoke detection was designed according to the YOLOV3 algorithm to reduce the false detection rate. The experimental results show that the method is feasible and effective, and the accuracy of cigarette smoke detection reaches 99.13%, which satisfies the requirements of real-time cigarette smoke detection in complex scenes.
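A hypothetical sketch of the color-feature pre-selection stage (the threshold values are assumptions, not the paper's): smoke pixels are roughly grayish, i.e. low channel spread at moderate brightness, so candidate regions can be masked cheaply before the YOLOv3 pass:

```python
import numpy as np

# Assumed color heuristic: smoke is gray-ish, so per-pixel channel spread
# (max - min of R,G,B) is small while brightness sits in a middle band.

def smoke_mask(rgb):
    """Boolean mask of pixels whose color is plausibly smoke."""
    rgb = rgb.astype(float)
    brightness = rgb.mean(axis=-1)
    spread = rgb.max(axis=-1) - rgb.min(axis=-1)   # low for gray pixels
    return (brightness > 100) & (brightness < 240) & (spread < 30)

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (180, 182, 185)   # grayish pixel -> smoke candidate
img[1, 1] = (200, 30, 30)     # saturated red -> rejected

mask = smoke_mask(img)
```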

Keywords: deep learning, computer vision, cigarette smoke detection, YOLOV3, color feature extraction

Procedia PDF Downloads 69
30345 Analysis of the Impact of Refractivity on Ultra High Frequency Signal Strength over Gusau, North West, Nigeria

Authors: B. G. Ayantunji, B. Musa, H. Mai-Unguwa, L. A. Sunmonu, A. S. Adewumi, L. Sa'ad, A. Kado

Abstract:

For a reliable and efficient communication system, both terrestrial and satellite, surface refractivity is critical in the planning and design of radio links. This study analyzed the impact of atmospheric parameters on Ultra High Frequency (UHF) signal strength over Gusau, North West, Nigeria. The analysis exploited meteorological data measured simultaneously with UHF signal strength for the month of June 2017, using a Davis Vantage Pro2 automatic weather station and UHF signal strength measuring devices, respectively. The instruments were situated on the premises of Federal University, Gusau (6° 78' N, 12° 13' E). The refractivity values were computed using the ITU-R model. The results show that refractivity attained a maximum value of 366.28 at 2200 hr and a minimum value of 350.66 at 2100 hr local time. The correlation between signal strength and refractivity is 0.350; with humidity it is 0.532, and with temperature it is negative at -0.515.
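The ITU-R surface refractivity expression the study relies on can be sketched as follows (the example readings are illustrative, not the Gusau data):

```python
# ITU-R radio refractivity: N = (77.6 / T) * (P + 4810 * e / T),
# with T the absolute temperature (K), P the total atmospheric pressure (hPa)
# and e the water-vapour partial pressure (hPa). N is dimensionless (N-units).

def refractivity(T_kelvin, P_hpa, e_hpa):
    return (77.6 / T_kelvin) * (P_hpa + 4810.0 * e_hpa / T_kelvin)

# Example weather-station readings (illustrative):
N = refractivity(T_kelvin=300.0, P_hpa=1010.0, e_hpa=25.0)
```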

Keywords: refractivity, UHF (ultra high frequency) signal strength, free space, automatic weather station

Procedia PDF Downloads 179
30344 Landing Performance Improvement Using Genetic Algorithm for Electric Vertical Take Off and Landing Aircrafts

Authors: Willian C. De Brito, Hernan D. C. Munoz, Erlan V. C. Carvalho, Helder L. C. De Oliveira

Abstract:

In order to improve commute times for short trips and relieve traffic in large cities, a new transport category has been the subject of research and new designs worldwide. The air taxi market promises to change the way people live and commute by using vehicles with the ability to take off and land vertically, providing passenger transport equivalent to a car, with mobility within and between large cities. Today's civil air transport remains costly and accounts for 2% of man-made CO₂ emissions. Taking advantage of this scenario, many companies have developed their own Vertical Take-Off and Landing (VTOL) designs, seeking to meet comfort, safety, low-cost and flight-time requirements in a sustainable way. Thus, the use of green power supplies, especially batteries, and fully electric power plants is the most common choice for these emerging aircraft. However, it is still a challenge to find a feasible way to handle the use of batteries rather than conventional petroleum-based fuels. Batteries are heavy and have an energy density still below that of gasoline, diesel or kerosene. Therefore, despite all the clear advantages, all-electric aircraft (AEA) still have low flight autonomy and high operational cost, since the batteries must be recharged or replaced. In this sense, this paper addresses a way to optimize the energy consumption in a typical mission of an air taxi aircraft. The approach and landing procedure was chosen as the subject of optimization by a genetic algorithm, while the final program can be adapted for take-off and flight level changes as well. Data from a real tilt-rotor aircraft with a fully electric power plant were used to fit the derived dynamic equations of motion. Although a tilt-rotor design is used as a proof of concept, the optimization can be adapted to other design concepts, even those with independent motors for the hover and cruise flight phases.
For a given trajectory, the best set of control variables are calculated to provide the time history response for aircraft´s attitude, rotors RPM and thrust direction (or vertical and horizontal thrust, for independent motors designs) that, if followed, results in the minimum electric power consumption through that landing path. Safety, comfort and design constraints are assumed to give representativeness to the solution. Results are highly dependent on these constraints. For the tested cases, performance improvement ranged from 5 to 10% changing initial airspeed, altitude, flight path angle, and attitude.
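
The genetic search over a discretized landing path can be sketched as follows. This is a minimal illustration, not the authors' implementation: the cost function is an assumed surrogate for electric power draw, and the constants (sink-rate limit, population size, penalty weights) are illustrative assumptions.

```python
import random

random.seed(0)  # reproducibility for this illustration

N_STEPS = 20      # discretized points along the approach path
POP, GENS = 40, 60

def energy_cost(descent_rates):
    """Toy surrogate: integrated power plus penalties for comfort limits."""
    cost = 0.0
    for i, v in enumerate(descent_rates):
        cost += 1.0 + 0.5 * v * v                 # hover power + rate term
        if v > 3.0:                               # comfort: max sink rate, m/s
            cost += 50.0 * (v - 3.0)
        if i > 0 and abs(v - descent_rates[i - 1]) > 1.0:  # smoothness
            cost += 20.0
    return cost

def make_individual():
    return [random.uniform(0.0, 4.0) for _ in range(N_STEPS)]

def crossover(a, b):
    cut = random.randrange(1, N_STEPS)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1):
    return [min(4.0, max(0.0, v + random.gauss(0, 0.3)))
            if random.random() < rate else v for v in ind]

def optimize():
    """Elitist GA: keep the best quarter, breed the rest from it."""
    pop = [make_individual() for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=energy_cost)
        elite = pop[:POP // 4]
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(POP - len(elite))]
        pop = elite + children
    return min(pop, key=energy_cost)

best_profile = optimize()  # descent-rate schedule with lowest surrogate cost
```

In the paper's setting the decision variables would be attitude, rotor RPM, and thrust direction rather than a single descent-rate schedule, but the penalty-based handling of safety and comfort constraints carries over directly.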

Keywords: air taxi travel, all electric aircraft, batteries, energy consumption, genetic algorithm, landing performance, optimization, performance improvement, tilt rotor, VTOL design

Procedia PDF Downloads 99
30343 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics

Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee

Abstract:

Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucination", the generation of outputs not grounded in the input data, which hinders their adoption in production. A common practice to mitigate hallucination is a Retrieval Augmented Generation (RAG) system that grounds LLM responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts via vector similarity between the user's query and the documents, and then generates a response based not only on the model's pre-trained knowledge but also on the specific information in the retrieved documents. However, RAG is not suitable for tabular data and subsequent data analysis tasks, for reasons including information loss, data format, and the retrieval mechanism. In this study, we explore a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, convert them into executable segments of code, and, in the final step, generate the complete response from the output of the executed code. When deployed as a beta version on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it provided market insights and data visualizations with high accuracy and extensive coverage, abstracting the complexities for real estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding market understanding and enhancement without the need for programming skills.
The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
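
The planner → code-generator → executor pattern described above can be sketched in a few lines. The LLM calls are stubbed with canned outputs here; in a real system each stub would be a model call. All names (`plan_task`, `generate_code`, `listings`) and the sample data are illustrative assumptions, not part of the DataSense implementation.

```python
def plan_task(question):
    """Planner agent: decompose an analytical question into sub-tasks.
    A real planner would prompt an LLM; here the plan is canned."""
    return ["filter rows for the requested city",
            "compute the average price of the remaining rows"]

def generate_code(subtasks):
    """Code-generation agent: turn the plan into executable Python.
    A real agent would prompt an LLM with the plan and the table schema."""
    return (
        "rows = [r for r in listings if r['city'] == city]\n"
        "result = sum(r['price'] for r in rows) / len(rows)\n"
    )

def execute(code, table, **params):
    """Executor: run generated code in one restricted namespace."""
    ns = {"listings": table,
          "__builtins__": {"sum": sum, "len": len},  # minimal sandbox
          **params}
    exec(code, ns)
    return ns["result"]

# Toy tabular data standing in for a real-estate listings table
listings = [
    {"city": "Singapore", "price": 1_200_000},
    {"city": "Singapore", "price": 1_800_000},
    {"city": "Bangkok",   "price": 400_000},
]
plan = plan_task("What is the average listing price in Singapore?")
code = generate_code(plan)
answer = execute(code, listings, city="Singapore")  # 1_500_000.0
```

The key design point is that the tabular data never passes through an embedding step: the model sees only the schema and writes code against the full table, avoiding the information loss that makes RAG a poor fit for tabular analytics.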

Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru

Procedia PDF Downloads 56
30342 Rapid Strategic Consensus Building in Land Readjustment in Kabul

Authors: Nangialai Yousufzai, Eysosiyas Etana, Ikuo Sugiyama

Abstract:

The population of Kabul has been growing continually since 2001, reaching six million by 2025, due to rapid inflows from neighboring countries. As a result of this growth, the lack of living facilities supported by infrastructure services is becoming a serious social and economic problem. However, about 70% of the city is still occupied illegally, and the government has little information on infrastructure demands. Land readjustment is a powerful tool for improving this situation, because it requires little government budget in itself; instead, the method needs cooperation among stakeholders such as landowners, developers, and the local government. It is therefore becoming crucial for both government and citizens to implement land readjustment, providing tidy urban areas with sufficient public services and realizing a more livable city as a whole. Traditional land readjustment, however, tends to require a long time to reach consensus on a new plan among stakeholders. One reason is that individual land areas (land parcels) are reduced by the contribution to public space, such as roads, parks, and squares, that improves the urban environment. A second reason is that it is difficult for dwellers to imagine their new life after the readjustment, because the paper-based plan is made by an authority not for dwellers but for specialists to advance the project. This paper aims to shorten the time needed to reach consensus among stakeholders. The first improvement is using questionnaires to assess the demands and preferences of the landowners. The second is using a 3D model so that dwellers can easily visualize the new environment after the readjustment. In addition, the 3D model reflects residents' demands and preferences so that they can select a land parcel according to their own values.
The two improvements above are carried out after evaluating the total land prices of the candidate plans in order to select the one maximizing the project value. The land price forecasting formula is derived from current market prices in Kabul. Finally, it is stressed that rapid consensus building in land readjustment, utilizing ICT and open data analysis, is essential for redeveloping slums and illegally occupied areas in Kabul.

Keywords: land readjustment, consensus building, land price formula, 3D simulation

Procedia PDF Downloads 316
30341 Neuro-Connectivity Analysis Using Abide Data in Autism Study

Authors: Dulal Bhaumik, Fei Jie, Runa Bhaumik, Bikas Sinha

Abstract:

The human brain is an amazingly complex network. Aberrant activity in this network can lead to various neurological disorders such as multiple sclerosis, Parkinson’s disease, Alzheimer’s disease, and autism. fMRI has emerged as an important tool for delineating the neural networks affected by such diseases, particularly autism. In this paper, we propose mixed-effects models together with an appropriate procedure for controlling false discoveries to detect disrupted connectivities in whole-brain studies. Results are illustrated with a large data set known as the Autism Brain Imaging Data Exchange (ABIDE), which includes 361 subjects from 8 medical centers. We believe that our findings adequately address the small-sample inference problem and are thus more reliable for identifying therapeutic targets for intervention. In addition, our results can be used for early detection of subjects at high risk of developing neurological disorders.
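
One standard procedure for controlling false discoveries across the many region-pair connections tested in such a study is the Benjamini-Hochberg step-up rule (the abstract does not name the specific procedure used, so this is offered only as a representative sketch; the p-values below are toy values, not ABIDE results).

```python
def benjamini_hochberg(p_values, q=0.05):
    """Return indices of hypotheses rejected at FDR level q (BH step-up).
    Sort p-values, find the largest rank k with p_(k) <= k*q/m, and
    reject the k smallest p-values."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * q / m:
            k = rank
    return sorted(order[:k])

# Toy p-values for five region-pair connections (illustrative only)
pvals = [0.001, 0.008, 0.039, 0.041, 0.62]
disrupted = benjamini_hochberg(pvals, q=0.05)  # -> [0, 1]
```

Note that BH rejects fewer hypotheses than an uncorrected 0.05 cutoff would (connections 2 and 3 survive the raw threshold but not the FDR-adjusted one), which is exactly the protection needed when testing thousands of connections.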

Keywords: ABIDE, autism spectrum disorder, fMRI, mixed-effects model

Procedia PDF Downloads 266
30340 Enhanced Photoelectrochemical Performance of TiO₂ Nanorods: The Critical Role of Hydrothermal Reaction Time

Authors: Srijitra Khanpakdee, Teera Butburee, Jung-Ho Yun, Miaoqiang Lyu, Supphasin Thaweesak, Piangjai Peerakiatkhajohn

Abstract:

The synthesis of titanium dioxide (TiO₂) nanorods (NRs) on fluorine-doped tin oxide (FTO) glass via hydrothermal methods was investigated to determine the optimal reaction time for enhanced photocatalytic and optical performance. Reaction times of 4, 6, and 8 hours were studied. Characterization through SEM, UV-vis, XRD, FTIR, Raman spectroscopy, and photoelectrochemical (PEC) techniques revealed significant differences in the properties of the TiO₂ NRs based on the reaction duration. XRD and Raman spectroscopy analysis confirmed the formation of the rutile phase of TiO₂. As photoanodes in PEC cells, TiO₂ NRs synthesized for 4 hours exhibited the best photocatalytic activity, with the highest photocurrent density and superior charge transport properties, attributed to their densely packed vertical structure. Longer reaction times resulted in less favorable morphological and photoelectrochemical characteristics. The bandgap of the TiO₂ NRs remained consistent at around 3.06 eV, with only slight variations observed. This study highlights the critical role of reaction time in hydrothermal synthesis, identifying 4 hours as the optimal duration for producing TiO₂ NRs with superior photoelectrochemical performance. These findings provide valuable insights for optimizing TiO₂-based materials for solar energy conversion and renewable energy applications.

Keywords: titanium dioxide, nanorods, hydrothermal, photocatalytic, photoelectrochemical

Procedia PDF Downloads 11
30339 Thermal Buckling Response of Cylindrical Panels with Higher Order Shear Deformation Theory—a Case Study with Angle-Ply Laminations

Authors: Humayun R. H. Kabir

Abstract:

An analytical solution previously used for static and free-vibration response is extended to the thermal buckling response of cylindrical panels with anti-symmetric laminations. The partial differential equations governing the kinematic behavior of shells produce five coupled differential equations. The basic displacement and rotational unknowns are similar to those of first-order shear deformation theory: three displacements in space and two rotations about the in-plane axes. No drilling degree of freedom is considered. All edges are taken as completely hinged so that the panel responds to thermal loading. Two sets of double Fourier series are used in the analytical solution, selected so as to satisfy the mixed type of natural boundary conditions. Numerical results are presented for the first 10 eigenvalues and the first 10 mode shapes of the Ux, Uy, and Uz components, and are compared with a finite element based solution.
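
As a hedged illustration of what such an expansion looks like (the exact sine/cosine pairing of each unknown depends on the natural boundary conditions the paper satisfies, which the abstract does not spell out), the transverse displacement and one in-plane rotation on an a × b panel are typically expanded as:

```latex
% Illustrative double Fourier expansions; the trigonometric pairing of each
% of the five unknowns is an assumption, chosen so that a hinged edge is
% satisfied term by term.
u_z(x,y)    = \sum_{m=1}^{M}\sum_{n=1}^{N} W_{mn}\,
              \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b},
\qquad
\phi_x(x,y) = \sum_{m=1}^{M}\sum_{n=1}^{N} X_{mn}\,
              \cos\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b}
```

Substituting such series into the five coupled equations reduces the thermal buckling problem to an algebraic eigenvalue problem in the coefficients W_mn, X_mn, and their counterparts.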

Keywords: higher order shear deformation, composite, thermal buckling, angle-ply laminations

Procedia PDF Downloads 357
30338 Mean Field Model Interaction for Computer and Communication Systems: Modeling and Analysis of Wireless Sensor Networks

Authors: Irina A. Gudkova, Yousra Demigha

Abstract:

Scientific research is moving more and more towards the study of complex systems in several areas of economics, biology, physics, and computer science. In this paper, we work on complex systems in communication networks, namely Wireless Sensor Networks (WSNs), which are considered stochastic systems composed of interacting entities. Current advancements in sensing within computing and communication systems are fertile ground for research along several tracks. A detailed presentation is given of WSNs, their uses, their modeling, different problems that can occur in their application, and some solutions. The main goal of this work is to reintroduce the mean field method, since it is a powerful technique for solving this type of model, especially systems that evolve according to a Continuous Time Markov Chain (CTMC). We focus on CTMC modeling and obtain a large system of interacting Continuous Time Markov Chains over the population of entities. The main idea is to work on one entity and replace the others with an average or effective interaction. In this context, to simplify the solution, we treat the wireless sensor network as a multi-body problem and reduce it to a one-body problem. The method is applied to a WSN modeled as a Markovian queue, and the results of the technique are shown.
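
The "replace the others with an average" idea can be made concrete with a minimal sketch: instead of tracking the joint CTMC of N sensors, track only the fraction of sensors in each state and integrate the resulting deterministic ODE. The three-state sensor cycle and the transition rates below are illustrative assumptions, not the paper's model.

```python
# Assumed per-node transition rates for a toy sensor cycle
RATES = {
    ("sleep", "active"):    0.5,   # wake-up rate
    ("active", "transmit"): 0.3,   # packet-ready rate
    ("transmit", "sleep"):  1.0,   # transmission-complete rate
}
STATES = ["sleep", "active", "transmit"]

def mean_field(x0, t_end=50.0, dt=0.01):
    """Forward-Euler integration of the mean-field ODE dx/dt = x Q,
    where x[s] is the fraction of sensors in state s."""
    x = dict(x0)
    for _ in range(int(t_end / dt)):
        dx = {s: 0.0 for s in STATES}
        for (a, b), rate in RATES.items():
            flow = rate * x[a]       # probability flux from state a to b
            dx[a] -= flow
            dx[b] += flow
        for s in STATES:
            x[s] += dt * dx[s]
    return x

# Start with every sensor asleep; the fractions relax to the stationary
# distribution of the single-node chain.
x = mean_field({"sleep": 1.0, "active": 0.0, "transmit": 0.0})
```

At stationarity the fluxes balance (0.5·sleep = 0.3·active = 1.0·transmit), giving active ≈ 1/1.9 ≈ 0.526 for these rates, and the one-body ODE reaches it without ever enumerating the exponentially large joint state space.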

Keywords: Continuous-Time Markov Chain, Hidden Markov Chain, mean field method, Wireless sensor networks

Procedia PDF Downloads 141
30337 Environment Situation Analysis of Germany

Authors: K. Y. Chen, H. Chua, C. W. Kan

Abstract:

In this study, we analyze Germany’s environmental situation, such as water and air quality, and review its environmental policy. In addition, we collect yearly environmental data as well as information concerning public environmental investment. Based on the data collected, we try to find the relationship between public environmental investment and sustainable development in Germany. Furthermore, by comparing the trend of environmental quality with the state of environmental policy and investment, we draw conclusions and identify lessons that can be learned. The data revealed that Germany has a well-developed institutionalization of environmental education, and that the ecological culture at school is dynamic and continuously renewed. The booming green market in Germany is a very successful experience to learn from: it not only creates a number of job opportunities, but also helps the government improve and protect the environment. Acknowledgement: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.

Keywords: Germany, public environmental investment, environment quality, sustainable development

Procedia PDF Downloads 233
30336 Kinetic Energy Recovery System Using Spring

Authors: Mayuresh Thombre, Prajyot Borkar, Mangirish Bhobe

Abstract:

New technological advancements and the ever-growing demands of civilization are putting huge pressure on natural fuel resources, whose sustainability is under constant threat. To get the best out of an automobile, the optimum balance between performance and fuel economy is important. In the current state of the art, only one of these two aspects is considered during design and development, to the detriment of the other, as an increase in fuel economy leads to a decrease in performance and vice versa. In-depth observation of vehicle dynamics shows that a large amount of energy is lost during braking, and likewise a large amount of fuel is consumed to reclaim the initial state; this lowers the fuel efficiency obtained for the same performance. The current use of Kinetic Energy Recovery Systems is limited to sports vehicles because of their high cost. They are also temporary in nature, as power can be extracted only over a short time span, and the use of superior parts leads to high cost; the result is a concentration on performance while neglecting fuel economy. In this paper, a Kinetic Energy Recovery System that stores power and releases it while accelerating is discussed. The major storage element in this system is a flat spiral spring that stores energy through compression and torsion. The use of a spring ensures permanent storage of the energy until it is used by the driver, unlike present mechanical regeneration systems in which the stored energy decreases with time and is eventually lost. A combination of internal gears and spur gears is used to make the energy release uniform, leading to safe usage. The system can be used to improve fuel efficiency by helping to overcome the vehicle’s inertia after braking or to provide instant acceleration whenever required by the driver.
The performance characteristics of the system, including response time, mechanical efficiency, and overall increase in efficiency, are demonstrated. This technology makes KERS (Kinetic Energy Recovery Systems) more flexible and economical, allowing specific applications while at the same time increasing the time frame and ease of usage.
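
A back-of-envelope sketch of the energy balance involved makes the sizing problem concrete. Every number below (vehicle mass, braking speed, geartrain efficiency, torsional stiffness) is an illustrative assumption, not a value from the paper; the only fixed relations are the kinetic energy ½mv² and the torsional-spring energy ½kθ².

```python
import math

m = 1200.0           # assumed vehicle mass, kg
v = 60 / 3.6         # braking from an assumed 60 km/h, converted to m/s
eta = 0.7            # assumed mechanical efficiency of the geartrain

e_kinetic = 0.5 * m * v ** 2     # energy available at the wheels, J
e_stored = eta * e_kinetic       # energy actually reaching the spring, J

# Torsional spring: E = 1/2 * k * theta^2  =>  theta = sqrt(2 E / k)
k = 150.0                             # assumed torsional stiffness, N*m/rad
theta = math.sqrt(2 * e_stored / k)   # required windup angle, rad
turns = theta / (2 * math.pi)         # windup expressed in full turns
```

Even under these generous assumptions the spring must absorb on the order of 100 kJ over several full turns, which is why the gear train that meters the release uniformly is as important to the design as the spring itself.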

Keywords: electric control unit, energy, mechanical KERS, planetary gear system, power, smart braking, spiral spring

Procedia PDF Downloads 186
30335 A Prediction Model of Adopting IPTV

Authors: Jeonghwan Jeon

Abstract:

With the advent of IPTV amid fierce competition with existing broadcasting systems, predicting how widely the IPTV service will be adopted has emerged as an important issue. This paper aims to suggest a prediction model for IPTV adoption using Classification and Ranking Belief Simplex (CaRBS). A simplex plot method of representing data allows a clear visual representation of the degree to which the variables support the prediction of the objects. CaRBS is applied to survey data on IPTV adoption.

Keywords: prediction, adoption, IPTV, CaRBS

Procedia PDF Downloads 397
30334 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized, that is, information must be converted from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data, and research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 of them originated in Germany. This is an unsurprising finding, as Industry 4.0 originated as a German strategy, with strong supporting policy instruments utilized in Germany for its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved divide between the data science experts and the manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized alongside it.
A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. A gap lies between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study embedded between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory; however, only three contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 90
30333 Grammatically Coded Corpus of Spoken Lithuanian: Methodology and Development

Authors: L. Kamandulytė-Merfeldienė

Abstract:

The paper deals with the main methodological issues of the Corpus of Spoken Lithuanian, whose development began in 2006. At present, the corpus consists of 300,000 grammatically annotated word forms. The creation of the corpus consists of three main stages: collecting the data, transcribing the recorded data, and grammatical annotation. Data collection was based on the principles of balance and naturalness. The recorded speech was transcribed according to the CHAT requirements of CHILDES. The transcripts were double-checked and grammatically annotated using CHILDES. The development of the Corpus of Spoken Lithuanian has led to a constant increase in studies of spontaneous communication, and various papers have dealt with the distribution of parts of speech, the use of different grammatical forms, variation in inflectional paradigms, the distribution of fillers, the syntactic functions of adjectives, and the mean length of utterances.

Keywords: CHILDES, corpus of spoken Lithuanian, grammatical annotation, grammatical disambiguation, lexicon, Lithuanian

Procedia PDF Downloads 218
30332 Visualization of Quantitative Thresholds in Stocks

Authors: Siddhant Sahu, P. James Daniel Paul

Abstract:

Technical analysis, comprising various technical indicators, is a holistic way of representing the price movement of stocks in the market. Various forms of indicators have evolved from primitive ones over the past decades. There have been many attempts to introduce volume as a major determinant of strong patterns in market forecasting. The law of demand defines the relationship between volume and price, and most traders are familiar with the volume game. Adding the time dimension to the law of demand provides a different visualization of the theory. In attempting this, it was found that different companies have different volume thresholds in the market, and these thresholds have a significant influence on price. This article attempts to determine these thresholds for companies using three-dimensional graphs in order to optimize portfolios. It also emphasizes the importance of volume as a key factor in predicting strong price movements and bullish and bearish markets. It uses a comprehensive data set of major companies that form a major chunk of the Indian automotive sector, which is thus used as an illustration.
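
The threshold idea can be sketched without the 3D plot: bucket trading days by volume and compare the average size of the next-day price move across buckets; a jump between buckets marks a volume threshold for that stock. The data, the median split, and the 3x factor below are illustrative assumptions, not the article's method or data.

```python
# Toy daily data: (volume in shares, next-day % price change)
days = [
    (10_000, 0.2), (12_000, -0.1), (11_000, 0.3), (50_000, 2.1),
    (55_000, -1.8), (9_000, 0.1), (60_000, 2.5), (13_000, -0.2),
]

def mean_abs_move(rows):
    """Average absolute next-day move for a list of (volume, move) pairs."""
    return sum(abs(move) for _, move in rows) / len(rows)

def volume_threshold(days, factor=3.0):
    """Split days at the median volume; if the high-volume half moves more
    than `factor` times as much, report the smallest volume in that half
    as the threshold. Returns None if no threshold is detected."""
    ordered = sorted(days)            # sort by volume
    half = len(ordered) // 2
    low, high = ordered[:half], ordered[half:]
    if mean_abs_move(high) > factor * mean_abs_move(low):
        return high[0][0]             # first volume in the high-activity regime
    return None

threshold = volume_threshold(days)
```

With real data one would use finer volume buckets per company and plot price against volume and time, but the principle is the same: the regime change in move size, not the raw volume, is what identifies a company's threshold.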

Keywords: technical analysis, expert system, law of demand, stocks, portfolio analysis, Indian automotive sector

Procedia PDF Downloads 294