Search results for: standard parallel salam
5313 Optimizing Weight Loss with AI (GenAIS™): A Randomized Trial of Dietary Supplement Prescriptions in Obese Patients
Authors: Evgeny Pokushalov, Andrey Ponomarenko, John Smith, Michael Johnson, Claire Garcia, Inessa Pak, Evgenya Shrainer, Dmitry Kudlay, Sevda Bayramova, Richard Miller
Abstract:
Background: Obesity is a complex, multifactorial chronic disease that poses significant health risks. Recent advancements in artificial intelligence (AI) offer the potential for more personalized and effective dietary supplement (DS) regimens to promote weight loss. This study aimed to evaluate the efficacy of AI-guided DS prescriptions compared to standard physician-guided DS prescriptions in obese patients. Methods: This randomized, parallel-group pilot study enrolled 60 individuals aged 40 to 60 years with a body mass index (BMI) of 25 or greater. Participants were randomized to receive either AI-guided DS prescriptions (n = 30) or physician-guided DS prescriptions (n = 30) for 180 days. The primary endpoints were the percentage change in body weight and the proportion of participants achieving a ≥5% weight reduction. Secondary endpoints included changes in BMI, fat mass, visceral fat rating, systolic and diastolic blood pressure, lipid profiles, fasting plasma glucose, hsCRP levels, and postprandial appetite ratings. Adverse events were monitored throughout the study. Results: Both groups were well balanced in terms of baseline characteristics. Significant weight loss was observed in the AI-guided group, with a mean reduction of -12.3% (95% CI: -13.1 to -11.5%) compared to -7.2% (95% CI: -8.1 to -6.3%) in the physician-guided group, resulting in a treatment difference of -5.1% (95% CI: -6.4 to -3.8%; p < 0.01). At day 180, 84.7% of the AI-guided group achieved a weight reduction of ≥5%, compared to 54.5% in the physician-guided group (Odds Ratio: 4.3; 95% CI: 3.1 to 5.9; p < 0.01). Significant improvements were also observed in BMI, fat mass, and visceral fat rating in the AI-guided group (p < 0.01 for all). Postprandial appetite suppression was greater in the AI-guided group, with significant reductions in hunger and prospective food consumption, and increases in fullness and satiety (p < 0.01 for all). Adverse events were generally mild-to-moderate, with higher incidences of gastrointestinal symptoms in the AI-guided group, but these were manageable and did not impact adherence. Conclusion: The AI-guided dietary supplement regimen was more effective in promoting weight loss, improving body composition, and suppressing appetite compared to the physician-guided regimen. These findings suggest that AI-guided, personalized supplement prescriptions could offer a more effective approach to managing obesity. Further research with larger sample sizes is warranted to confirm these results and optimize AI-based interventions for weight loss.Keywords: obesity, AI-guided, dietary supplements, weight loss, personalized medicine, metabolic health, appetite suppression
Procedia PDF Downloads 7
5312 Aspects of Semantics of Standard British English and Nigerian English: A Contrastive Study
Authors: Chris Adetuyi, Adeola Adeniran
Abstract:
The concept of meaning is a complex one in language study, especially when cultural features are added. This is unavoidable because language cannot be completely separated from culture; language and culture complement each other. When two varieties of a language function side by side in a speech community, there is a tendency to view one variety against the other. There is, therefore, a need for a comparative linguistic study of such varieties. In this paper, a semantic contrastive study is made between Standard British English (SBE) and Nigerian English (NE). The study is limited to selected aspects of semantics: semantic extension (kinship terms, metaphors), semantic shift (the lexical items considered are ‘drop’, ‘befriend’, ‘dowry’ and ‘escort’), acronyms (NEPA, JAMB, NTA), linguistic borrowing or loan words (Seriki, Agbada, Eba, Dodo, Iroko), and coinages (long leg, bush meat, bottom power and juju). In examining these aspects of semantics in SBE and NE lexical terms, contrastive statements are made, and problem areas and a hierarchy of difficulties are highlighted with a view to bringing out the areas of difference between the two varieties. The study will also serve as a guide for further contrastive studies in other areas of the languages.
Keywords: aspect, British, English, Nigeria, semantics
Procedia PDF Downloads 346
5311 Recognition of a Thinly Bedded Distal Turbidite: A Case Study from a Proterozoic Delta System, Chaossa Formation, Simla Group, Western Lesser Himalaya, India
Authors: Priyanka Mazumdar, Ananya Mukhopadhyay
Abstract:
A lot of progress has been achieved in turbidite research during the last decades. However, the relationship of turbidites to delta systems still deserves further attention. This paper addresses an example of a fine-grained turbidite from a pro-deltaic deposit of a Proterozoic mixed-energy delta system exposed along the Chaossa-Baliana river section of the Chaossa Formation of the Simla Basin. Lithostratigraphic analysis of the Chaossa Formation reveals three major facies associations (prodelta deposit-FA1, delta slope deposit-FA2 and delta front deposit-FA3) based on lithofacies types, petrography and sedimentary structures. Detailed process-based facies and paleoenvironmental analysis of the study area has led to the identification of a more than 150 m thick coarsening-upwards deltaic succession composed of fine-grained turbidites overlain by delta slope deposits. Erosional features are locally common at the base of the turbidite beds and still more widespread at the top. The complete sequence has eight sub-divisions that are here termed T1 to T8. The basal subdivision (T1) comprises a massive graded unit with a sharp, scoured base, internal parallel-lamination and cross-lamination. The overlying sequence shows textural and compositional grading through alternating silt and mud laminae (T2). T2 is overlain by T3, which is characterized by climbing ripples and cross-lamination. Parallel laminae are the predominant facies attribute of T4, which caps the T3 unit. T5 has a loaded scour base and is mainly characterized by laminated silt. The topmost three divisions are graded mud (T6), ungraded mud (T7) and laminated mud (T8). The proposed sequence is analogous to the Bouma (1962) structural scheme for sandy turbidites. Repetition of partial sequences represents deposition from different stages of evolution of a large, muddy turbidity flow. Detailed facies analysis of the study area reveals that the sediments of the turbidites developed during normal regression at a stage of stable or marginally rising sea level. The thin-bedded turbidites were deposited predominantly by turbidity currents in the relatively shallower part of the Simla Basin. The fine-grained turbidites developed by resedimentation of delta-front sands and slumping of upper pro-delta muds.
Keywords: turbidites, prodelta, proterozoic, Simla Basin, Bouma sequence
Procedia PDF Downloads 269
5310 Development of Lodging Business Management Standards of Bang Khonthi Community in Samut Songkram Province
Authors: Poramet Saeng-On
Abstract:
This research aims to develop approaches to lodging business management for the Bang Khonthi community in Samut Songkram province that are appropriate to the cultural context of the community. Eight lodging business owners were interviewed. It was found that lodging businesses that are family businesses must be run with passion, a correct understanding of self, culture, nature and the Thai way of life, thoroughness, professional development, environmental concern, and partnership building with various networks at the community level as well as with the public sector and business cohorts. Public relations should be conducted through both traditional and modern media outlets, such as websites and social networks, to provide customers with convenience, security, happiness, knowledge, love and value when they travel to Bang Khonthi. This will also help operators achieve sustainability in business, in line with the 10 Home Stay Standard Thailand. Suggestions for operators are as follows: operators need to improve their public relations work; they need to use technology such as the internet in public relations; management standards must be improved; souvenir and local product shops should be arranged in the compound; product pricing must be set accordingly; operators need to join hands to help each other; the quality of business operations should be raised to meet the standards; and educational measures should be taken to reduce the impact of tourism on the community, such as efforts to reduce energy consumption.
Keywords: homestay, lodging business, management, standard
Procedia PDF Downloads 446
5309 Optimization of Hemp Fiber Reinforced Concrete for Various Environmental Conditions
Authors: Zoe Chang, Max Williams, Gautham Das
Abstract:
The purpose of this study is to evaluate the incorporation of hemp fibers (HF) in concrete. Hemp fiber reinforced concrete (HFRC) is becoming more popular as an alternative to regular mix designs. This study was done to evaluate the compressive strength of HFRC with respect to the mixing procedure. Hemp fibers were obtained from the manufacturer and hand-processed to ensure uniformity in width and length. The fibers were added to the concrete in both wet and dry mixes to investigate and optimize the mix design process. Results indicated that the dry mix had a compressive strength of 1157 psi compared to 985 psi for the wet mix. The dry mix compressive strength was within range of the standard mix compressive strength of 1533 psi. The statistical analysis revealed that the mix design process needs further optimization and uniformity concerning the addition of HF. Regression analysis revealed that the standard mix design had a coefficient of 0.9 compared to 0.375 for the dry mix, indicating variation in the mixing process. While completing the dry mix, the addition of plain hemp fibers caused them to intertwine, creating lumps and inconsistency. However, during the wet mixing process, combining water and hemp fibers before incorporation allowed the fibers to disperse uniformly within the mix; hence the regression analysis indicated a better coefficient of 0.55. This study concludes that HFRC is a viable alternative to regular mixes; however, more research surrounding its characteristics needs to be conducted.
Keywords: hemp fibers, hemp reinforced concrete, wet & dry, freeze thaw testing, compressive strength
Procedia PDF Downloads 200
5308 Comparison of the Yumul Faces Anxiety Scale to the Categorization Scale, the Numerical Verbal Rating Scale, and the State-Trait Anxiety Inventory for Preoperative Anxiety Evaluation
Authors: Ofelia Loani Elvir Lazo, Roya Yumul, David Chernobylsky, Omar Durra
Abstract:
Background: It is crucial to detect a patient’s existing anxiety in order to assist patients in the perioperative setting, where anxiety is typically caused by fear associated with surgical and anesthetic complications. However, the current gold standard for assessing patient anxiety, the STAI, is problematic to use in the preoperative setting, given the duration and concentration required to complete the 40-item questionnaire. Our primary aim in the study is to investigate the correlation of the Yumul Visual Facial Anxiety Scale (VFAS) and Numerical Verbal Rating Scale (NVRS) to the State-Trait Anxiety Inventory (STAI) to determine the optimal anxiety scale to use in the perioperative setting. Methods: A clinical study of patients undergoing various surgeries was conducted utilizing each of the preoperative anxiety scales. Inclusion criteria included patients undergoing elective surgeries, while exclusion criteria included patients with anesthesia contraindications, inability to comprehend instructions, impaired judgement, substance abuse history, and those pregnant or lactating. 293 patients were analyzed in terms of demographics, anxiety scale survey results, and anesthesia data, with Spearman coefficients, chi-squared analysis, and Fisher’s exact test utilized for comparative analysis. Results: Statistical analysis showed that VFAS had a higher correlation to STAI than NVRS (rs=0.66, p<0.0001 vs. rs=0.64, p<0.0001). The combined VFAS-Categorization Scores showed the highest correlation with the gold standard (rs=0.72, p<0.0001). Subgroup analysis showed similar results. STAI evaluation time (247.7 ± 54.81 sec) far exceeds VFAS (7.29 ± 1.61 sec), NVRS (7.23 ± 1.60 sec), and Categorization scales (7.29 ± 1.99 sec). Patients preferred VFAS (54.4%), Categorization (11.6%), and NVRS (8.8%). Anesthesiologists preferred VFAS (63.9%), NVRS (22.1%), and Categorization Scales (14.0%). Of note, the top five causes of preoperative anxiety were determined to be waiting (56.5%), pain (42.5%), family concerns (40.5%), and lack of information about the surgery (40.1%) or anesthesia (31.6%). Conclusions: Both the VFAS and Categorization tests take significantly less time than the STAI, which is critical in the preoperative setting. The combined VFAS-Categorization Score (VCS) demonstrates the highest correlation to the gold standard, STAI. Among both patients and anesthesiologists, VFAS was the most preferred scale. This forms the basis of the Yumul Faces Anxiety Scale, designed for quick quantification and assessment in the preoperative setting while maintaining a high correlation to the gold standard. Additional studies using the formulated Yumul Faces Anxiety Scale are merited.
Keywords: numerical verbal anxiety scale, preoperative anxiety, state-trait anxiety inventory, visual facial anxiety scale
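For readers reproducing this kind of scale comparison, a minimal Python sketch of the rank-correlation step (Spearman) between a candidate scale and the STAI; the paired scores below are made up for illustration and are not the study data.

import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired scores, one row per patient: a 0-10 candidate-scale score and a STAI total.
vfas = np.array([2, 5, 7, 3, 9, 4, 6, 8, 1, 5])
stai = np.array([28, 41, 55, 33, 62, 35, 47, 58, 25, 44])

rho, p_value = spearmanr(vfas, stai)   # rank-based correlation, appropriate for ordinal scales
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")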
Procedia PDF Downloads 117
5307 Network Word Discovery Framework Based on Sentence Semantic Vector Similarity
Authors: Ganfeng Yu, Yuefeng Ma, Shanliang Yang
Abstract:
Word discovery is a key problem in text information retrieval technology. Methods for new word discovery tend to operate at the word level, since they generally obtain new words by analyzing the words themselves. With the popularity of social networks, individual netizens and online self-media have generated a wide variety of network texts for the convenience of online life, including network words that are far from standard Chinese expression. How to detect network words is one of the important goals in the field of text information retrieval today. In this paper, we integrate a word embedding model and clustering methods to propose a network word discovery framework based on sentence semantic similarity (S³-NWD) to detect network words effectively from a corpus. The framework constructs sentence semantic vectors through a distributed representation model, uses the similarity of the sentence semantic vectors to determine the semantic relationship between sentences, and finally realizes network word discovery by means of semantic replacement between sentences. The experiments verify that the framework not only discovers network words quickly but also recovers the standard-word meaning of the discovered network words, which reflects the effectiveness of our work.
Keywords: text information retrieval, natural language processing, new word discovery, information extraction
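A minimal Python sketch of the core idea (sentence vectors plus cosine similarity under a candidate/standard-word substitution); the embedding library, model name, and example sentences are illustrative assumptions, not the authors' implementation.

import numpy as np
from sentence_transformers import SentenceTransformer  # assumed embedding library

# Assumed model choice; the paper's distributed representation model is not specified here.
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sentence_with_candidate = "这家店的东西太好吃了，yyds"        # contains a candidate network word
sentence_with_standard = "这家店的东西太好吃了，永远的神"      # same sentence with a standard expression

v1, v2 = model.encode([sentence_with_candidate, sentence_with_standard])
# High similarity after the substitution suggests the candidate behaves like the standard
# expression, which is the "semantic replacement" signal described in the abstract.
print(f"semantic similarity after replacement: {cosine(v1, v2):.3f}")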
Procedia PDF Downloads 95
5306 Comparative Analysis of Various Waste Oils for Biodiesel Production
Authors: Olusegun Ayodeji Olagunju, Christine Tyreesa Pillay
Abstract:
Biodiesel from waste sources is regarded as an economical and highly viable fuel alternative to depleting fossil fuels. In this work, biodiesel was produced by the transesterification method from three different sources of waste cooking oil collected from cafeterias, all of them vegetable-based. The free fatty acid contents (% FFA) of the feedstocks were determined successfully by the titration method. The results for sources 1, 2, and 3 were 0.86 %, 0.54 % and 0.20 %, respectively. The three variables considered in this process were temperature, reaction time, and catalyst concentration within the following ranges: 50 °C – 70 °C, 30 min – 90 min, and 0.5 % – 1.5 % catalyst. The produced biodiesel was characterized using ASTM standard methods for biodiesel property testing to determine the fuel properties, including kinematic viscosity, specific gravity, flash point, pour point, cloud point, and acid number. The results obtained indicate that the biodiesel yield from source 3 was greater than that from the other sources. All produced biodiesel fuel properties are within the standard biodiesel fuel specification ASTM D6751. The optimum yields of biodiesel were 98.76%, 96.4%, and 94.53% from source 3, source 2, and source 1, respectively, at optimum operating conditions of 65 °C, a 90-minute reaction time, and 0.5 wt% potassium hydroxide.
Keywords: waste cooking oil, biodiesel, free fatty acid content, potassium hydroxide catalyst, optimization analysis
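A minimal Python sketch of the usual acid-value/%FFA calculation behind the titration step; the sample mass, titre volume, KOH normality, and the convention of expressing %FFA as oleic acid are illustrative assumptions, not values from the paper.

# %FFA from KOH titration, expressed as oleic acid (a common convention for vegetable oils).
KOH_MOLAR_MASS = 56.1     # g/mol
OLEIC_MOLAR_MASS = 282.5  # g/mol

def percent_ffa(titre_ml, koh_normality, sample_mass_g):
    # Acid value in mg KOH per g of oil.
    acid_value = (titre_ml * koh_normality * KOH_MOLAR_MASS) / sample_mass_g
    # %FFA as oleic acid (equivalent to acid value divided by about 1.99).
    return acid_value * OLEIC_MOLAR_MASS / (KOH_MOLAR_MASS * 10.0)

# Illustrative numbers only:
print(round(percent_ffa(titre_ml=1.7, koh_normality=0.1, sample_mass_g=10.0), 2), "% FFA")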
Procedia PDF Downloads 77
5305 Determining Components of Deflection of the Vertical in Owerri West Local Government, Imo State Nigeria Using Least Square Method
Authors: Chukwu Fidelis Ndubuisi, Madufor Michael Ozims, Asogwa Vivian Ndidiamaka, Egenamba Juliet Ngozi, Okonkwo Stephen C., Kamah Chukwudi David
Abstract:
Deflection of the vertical is a quantity used in reducing geodetic measurements related to geoidal networks to the ellipsoidal plane, and it is essential in geoid modeling processes. Computing the deflection-of-the-vertical components of a point in a given area is necessary for evaluating the standard errors along the north-south and east-west directions. A combined approach to determining the deflection-of-the-vertical components provides improved results but is labour intensive without an appropriate method. The least square method makes use of redundant observations in modeling a given set of problems that obey certain geometric conditions. This research work is aimed at computing the deflection-of-the-vertical components for Owerri West Local Government Area of Imo State using a geometric method as the field technique. In this method, a combination of Global Positioning System observations in static mode and precise leveling was utilized: the geodetic coordinates of points established within the study area were determined by GPS observation and the orthometric heights through precise leveling. By least squares, using a Matlab programme, the estimated deflection-of-the-vertical components for the common station were -0.0286 and -0.0001 arc seconds for the north-south and east-west components, respectively. The associated standard errors of the processed vectors of the network were computed. The computed standard errors of the north-south and east-west components were 5.5911e-005 and 1.4965e-004 arc seconds, respectively. Therefore, adding the derived deflection-of-the-vertical components to the ellipsoidal model will yield high observational accuracy, since an ellipsoidal model alone is not tenable for high-quality work owing to its large observational error. It is therefore important to include the determined deflection-of-the-vertical components for Owerri West Local Government in Imo State, Nigeria.
Keywords: deflection of vertical, ellipsoidal height, least square, orthometric height
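A minimal Python sketch of the least-squares idea in the GPS/leveling (geometric) approach: fit the geoid undulation N = h − H as a plane over the stations and take the deflection components as the negative horizontal gradients of N. The station coordinates and undulations below are hypothetical, and this simplified plane fit is not the authors' Matlab adjustment.

import numpy as np

# Hypothetical observations: easting/northing (m) and geoid undulation N = h_ellipsoidal - H_orthometric (m).
east  = np.array([0.0, 1200.0, 2500.0, 3100.0, 4000.0])
north = np.array([0.0,  800.0, 1600.0, 2900.0, 3800.0])
N     = np.array([22.531, 22.528, 22.524, 22.519, 22.515])

# Least-squares plane N = a + b*east + c*north (design matrix A, unknowns x).
A = np.column_stack([np.ones_like(east), east, north])
x, residuals, rank, sv = np.linalg.lstsq(A, N, rcond=None)
a, b, c = x

# Deflection components are the negative horizontal gradients of N (slope in rad -> arc seconds).
RAD_TO_ARCSEC = 206264.806
eta = -b * RAD_TO_ARCSEC   # east-west component
xi  = -c * RAD_TO_ARCSEC   # north-south component
print(f"xi (N-S) = {xi:.4f} arcsec, eta (E-W) = {eta:.4f} arcsec")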
Procedia PDF Downloads 209
5304 Taking Risks to Get Pleasure: Reproductive Health Behaviour of Early Adolescents in Pantura Line, Indonesia
Authors: Juariah Salam Suryadi
Abstract:
The north coast (Pantura) line is known as a high-risk area for reproductive health. Along the line there are many food stalls and entertainment venues whose function changes at night into areas of sexual transactions. This business corridor also facilitates the circulation and transaction of drugs and other substances of abuse. These environmental conditions can influence the adolescents who live in the area, because adolescents are characteristically highly curious and still searching for their identities. Therefore, the purposes of this study were to explore the reproductive health behaviour of early adolescents living along the Pantura line and to suggest interventions based on the adolescents' reproductive health conditions. This study was conducted in November 2016 among the seventh-grade students of Pusakajaya Junior High Schools 1 and 2, Subang District. The number of respondents was 269 students (male = 135, female = 134). The students were interviewed using a semi-structured questionnaire. Some teachers were also interviewed to complement the data. The quantitative data were analyzed with univariate analysis, while content analysis was used for the qualitative data. The findings showed that 85.2% of male students were smokers, most of whom had started smoking in elementary school. About 25.2% of male students often drank alcohol, and all of them had started drinking in elementary school. About 21.5% of male students had ever used drugs or other substances of abuse. 54.6% of the students, most of them female, confessed to having a lover. The sexual behaviours ever engaged in with their lovers were holding hands (37.4%), kissing (4%) and embracing (6.8%). Although all of the students claimed to have never had sexual intercourse, 5.9% of them said that they had friends who had. Most of the students also had friends with negative characteristics: their friends were smokers (82.2%), drinkers (53.2%) and drug abusers (42%). Most of the students acknowledged that they took these risk behaviours to get pleasure with their peers. Information from the teachers indicated that the main problems of male students were smoking and drug and substance abuse, while sexuality issues, including unwanted pregnancies, were the reproductive problems of many female students. It is therefore recommended to enhance adolescents' understanding of the risks of unhealthy behaviour through continuing reproductive health education, both in school and out of school. Policy support to create a positive, adolescent-friendly social environment is also suggested.
Keywords: reproductive health, behaviour, early adolescents, pantura line
Procedia PDF Downloads 289
5303 Analyzing Students' Writing in an English Code-Mixing Context in Nepali: An Ecological and Systematic Functional Approach
Authors: Binod Duwadi
Abstract:
This article examines the language and literacy practices of English code-mixing in Nepalese classrooms. Situating the study within an ecological framework, a systemic functional linguistics (SFL) approach was used to analyze students' writing in two Nepalese schools. Data collection included interviews with teachers, classroom observations, instructional materials, and focal students’ writing samples. Data analyses revealed vastly different language ecologies between the schools owing to sharp socioeconomic stratification, the structural organization of the schools, and the pervasiveness of standard language ideology, which stigmatizes English code-mixing (ECM) and privileges Standard English in schools. Functional analysis of students’ writing showed that the nature of the writing tasks at the schools created different affordances for exploiting lexicogrammatical choices for meaning making, enhancing them in the case of one school but severely restricting them in the case of the other, thereby perpetuating the academic disadvantage of code-mixing speakers. Recommendations for structural and attitudinal changes through teacher training, and for the implementation of approaches that engage students' bidialectal competence for learning, are made as important first steps towards addressing educational inequities in Nepalese schools.
Keywords: code-mixing, ecological perspective, systematic functional approach, language and identity
Procedia PDF Downloads 124
5302 Modified CUSUM Algorithm for Gradual Change Detection in a Time Series Data
Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin
Abstract:
The main objective in a change detection problem is to develop algorithms for efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series data. In this paper, we present a modified cumulative sum (MCUSUM) algorithm to detect the start and end of a time-varying linear drift in the mean value of time series data, based on a likelihood ratio test procedure. The design, implementation and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures. An approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart designed for abrupt shift detection using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart for detection of a gradual change in mean. The algorithm is then applied to randomly generated time series data with a gradual linear trend in mean to demonstrate its usefulness.
Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test
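For orientation, a minimal Python sketch of the standard one-sided CUSUM detector that the paper uses as its baseline (not the authors' MCUSUM); the simulated drift, reference value k, and threshold h are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n = 400
# Simulated series: zero mean, then a gradual linear drift starting at t = 200 (illustrative).
x = rng.normal(0.0, 1.0, n)
x[200:] += np.linspace(0.0, 2.0, n - 200)

mu0, k, h = 0.0, 0.25, 5.0   # in-control mean, reference value, decision threshold (assumed)
s = 0.0
alarm = None
for t, xt in enumerate(x):
    s = max(0.0, s + (xt - mu0) - k)   # one-sided (upper) CUSUM recursion
    if s > h:
        alarm = t
        break

print("change signalled at index:", alarm)   # expected some time after t = 200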
Procedia PDF Downloads 298
5301 Conceptualizing the Cyber Insecurity Risk in the Ethics of Automated Warfare
Authors: Otto Kakhidze, Hoda Alkhzaimi, Adam Ramey, Nasir Memon
Abstract:
This paper provides an alternative, cyber-security-based conceptual framework for the ethics of automated warfare. The large body of work produced on fully or partially autonomous warfare systems tends to overlook malicious security factors, such as the possibility of technical attacks on these systems, when it comes to moral and legal decision-making. The argument provides a risk-oriented justification for why technical malicious risks cannot be dismissed in legal, ethical and policy considerations when warfare models are being implemented and deployed. The assumptions of the paper are supported by a broader model that incorporates the perspective of technological vulnerabilities through the lenses of game theory, just war theory, and standard and non-standard defense ethics. The paper argues that a conventional risk-benefit analysis that does not consider ethical factors is insufficient for making legal and policy decisions on automated warfare. This approach will provide the substructure for security and defense experts as well as legal scholars, ethicists and decision theorists to work towards common justificatory grounds that accommodate the technical security concerns overlooked in current legal and policy models.
Keywords: automated warfare, ethics of automation, inherent hijacking, security vulnerabilities, risk, uncertainty
Procedia PDF Downloads 357
5300 Comparison Between Two Techniques (Extended Source to Surface Distance & Field Alignment) Of Craniospinal Irradiation (CSI) In the Eclipse Treatment Planning System
Authors: Naima Jannat, Ariful Islam, Sharafat Hossain
Abstract:
Due to the involvement of the large target volume, Craniospinal Irradiation makes it challenging to achieve a uniform dose, and it requires different isocenters. This isocentric junction needs to shift after every five fractions to overcome the possibility of hot and cold spots. This study aims to evaluate the Planning Target Volume coverage & sparing Organ at Risk between two techniques and shows that the Field Alignment Technique does not need replanning and resetting. Planning method for Craniospinal Irradiation by Eclipse treatment planning system Field Alignment and Extended Source to Surface Distance technique was developed where 36 Gy in 20 Fraction at the rate of 1.8 Gy was prescribed. The patient was immobilized in the prone position. In the Field Alignment technique, the plan consists of half beam blocked parallel opposed cranium and a single posterior cervicospine field was developed by sharing the same isocenter, which obviates divergence matching. Further, a single field was created to treat the remaining lumbosacral spine. Matching between the inferior diverging edge of the cervicospine field and the superior diverging edge of a lumbosacral field, the field alignment option was used, which automatically matches the field edge divergence as per the field alignment rule in Eclipse Treatment Planning System where the couch was set to 2700. In the Extended Source to Surface Distance technique, two parallel opposed fields were created for the cranium, and a single posterior cervicospine field was created where the Source to Surface Distance was from 120-140 cm. Dose Volume Histograms were obtained for each organ contoured and for each technique used. In all, the patient’s maximum dose to Planning Target Volume is higher for the Extended Source to Surface Distance technique to Field Alignment technique. The dose to all surrounding structures was increased with the use of a single Extended Source to Surface Distance when compared to the Field Alignment technique. The average mean dose to Eye, Brain Steam, Kidney, Oesophagus, Heart, Liver, Lung, and Ovaries were respectively (58% & 60 %), (103% & 98%), (13% & 15%), (10% & 63%), (12% & 16%), (33% & 30%), (14% & 18%), (69% & 61%) for Field Alignment and Extended Source to Surface Distance technique. However, the clinical target volume at the spine junction site received a less homogeneous dose with the Field Alignment technique as compared to Extended Source to Surface Distance. We conclude that, although the use of a single field Extended Source to Surface Distance delivered a more homogenous, but its maximum dose is higher than the Field Alignment technique. Also, a huge advantage of the Field Alignment technique for Craniospinal Irradiation is that it doesn’t need replanning and resetting up of patients after every five fractions and 95% prescribed dose was received by more than 95% of the Planning Target Volume in all the plane with the acceptable hot spot.Keywords: craniospinalirradiation, cranium, cervicospine, immobilize, lumbosacral spine
Procedia PDF Downloads 115
5299 Mind Your Product-Market Strategy on Selecting Marketing Inputs: An Uncertainty Approach in Indian Context
Authors: Susmita Ghosh, Bhaskar Bhowmick
Abstract:
Market is an important factor for start-ups to look into during decision-making in product development and related areas. Emerging country markets are more uncertain in terms of information availability and institutional supports. The literature review of market uncertainty reveals the need for identifying factors representing the market uncertainty. This paper identifies factors for market uncertainty using Exploratory Factor Analysis (EFA) and confirms the number of factor retention using an alternative factor retention criterion, ‘Parallel Analysis’. 500 entrepreneurs, engaged in start-ups from all over India participated in the study. This paper concludes with the factor structure of ‘market uncertainty’ having dimensions of uncertainty in industry orientation, uncertainty in customer orientation and uncertainty in marketing orientation.Keywords: uncertainty, market, orientation, competitor, demand
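For readers unfamiliar with the retention criterion named above, a minimal Python sketch of Horn's parallel analysis on simulated data; the sample size, item count, and responses are illustrative stand-ins, not the study's survey data or factor structure.

import numpy as np

rng = np.random.default_rng(1)
n_obs, n_items, n_sims = 500, 12, 200   # 500 respondents, 12 survey items (illustrative sizes)

# Stand-in for the real survey responses:
data = rng.normal(size=(n_obs, n_items))

def eigenvalues(matrix):
    # Eigenvalues of the item correlation matrix, largest first.
    return np.sort(np.linalg.eigvalsh(np.corrcoef(matrix, rowvar=False)))[::-1]

observed = eigenvalues(data)

# Eigenvalues from random data of the same shape; keep the 95th percentile per component.
random_eigs = np.array([eigenvalues(rng.normal(size=(n_obs, n_items))) for _ in range(n_sims)])
threshold = np.percentile(random_eigs, 95, axis=0)

# Retain factors whose observed eigenvalue exceeds the random-data benchmark.
n_factors = int(np.sum(observed > threshold))
print("factors retained by parallel analysis:", n_factors)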
Procedia PDF Downloads 590
5298 Methods Used to Achieve Airtightness of 0.07 Ach@50Pa for an Industrial Building
Authors: G. Wimmers
Abstract:
The University of Northern British Columbia needed a new laboratory building for the Master of Engineering in Integrated Wood Design Program and its new Civil Engineering Program. Since the University is committed to reducing its environmental footprint and because the Master of Engineering Program is actively involved in research of energy efficient buildings, the decision was made to request the energy efficiency of the Passive House Standard in the Request for Proposals. The building is located in Prince George in Northern British Columbia, a city located at the northern edge of climate zone 6 with an average low between -8 and -10.5 in the winter months. The footprint of the building is 30m x 30m with a height of about 10m. The building consists of a large open space for the shop and laboratory with a small portion of the floorplan being two floors, allowing for a mezzanine level with a few offices as well as mechanical and storage rooms. The total net floor area is 1042m² and the building’s gross volume 9686m³. One key requirement of the Passive House Standard is the airtight envelope with an airtightness of < 0.6 ach@50Pa. In the past, we have seen that this requirement can be challenging to reach for industrial buildings. When testing for air tightness, it is important to test in both directions, pressurization, and depressurization, since the airflow through all leakages of the building will, in reality, happen simultaneously in both directions. A specific detail or situation such as overlapping but not sealed membranes might be airtight in one direction, due to the valve effect, but are opening up when tested in the opposite direction. In this specific project, the advantage was the overall very compact envelope and the good volume to envelope area ratio. The building had to be very airtight and the details for the windows and doors installation as well as all transitions from walls to roof and floor, the connections of the prefabricated wall panels and all penetrations had to be carefully developed to allow for maximum airtightness. The biggest challenges were the specific components of this industrial building, the large bay door for semi-trucks and the dust extraction system for the wood processing machinery. The testing was carried out in accordance with EN 132829 (method A) as specified in the International Passive House Standard and the volume calculation was also following the Passive House guideline resulting in a net volume of 7383m3, excluding all walls, floors and suspended ceiling volumes. This paper will explore the details and strategies used to achieve an airtightness of 0.07 ach@50Pa, to the best of our knowledge the lowest value achieved in North America so far following the test protocol of the International Passive House Standard and discuss the crucial steps throughout the project phases and the most challenging details.Keywords: air changes, airtightness, envelope design, industrial building, passive house
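For context on the headline figure, a minimal Python sketch of how an n50 (ach@50Pa) value follows from blower-door flows and the net volume quoted above; only the 7383 m3 net volume is taken from the abstract, and the flow numbers are illustrative assumptions.

# Air change rate at 50 Pa = measured leakage flow at 50 Pa / net internal volume.
net_volume_m3 = 7383.0            # net volume per the Passive House calculation quoted above

# Hypothetical pressurization and depressurization flows at 50 Pa (m3/h); both directions are tested.
q50_pressurize_m3h = 540.0
q50_depressurize_m3h = 495.0

q50_mean = (q50_pressurize_m3h + q50_depressurize_m3h) / 2.0
ach50 = q50_mean / net_volume_m3
print(f"n50 = {ach50:.2f} 1/h")   # 0.07 1/h corresponds to roughly 520 m3/h for this volume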
Procedia PDF Downloads 148
5297 Urban Analysis of the Old City of Oran and Its Building after an Earthquake
Authors: A. Zatir, A. Mokhtari, A. Foufa, S. Zatir
Abstract:
The city of Oran, like any other region of northern Algeria, is subject to frequent seismic activity. The study presented in this work is based on an analysis of the urban and architectural context of the city of Oran before the earthquake of 1790, and then attempts to deduce the differences between the old city before and after the earthquake. A specific objective of the analysis is to tap into the seismic history of the city of Oran in parallel with its urban history. The example of the citadel of Oran indicates that the constructions present on the site of the old citadel may contain elements of resistance in the face of seismic effects. In-city observations of these structures showed the ingenuity of the techniques used by the ancient builders, including the good performance of domes and arches in resisting seismic forces.
Keywords: earthquake, citadel, performance, traditional techniques, constructions
Procedia PDF Downloads 304
5296 Modern Methods of Technology and Organization of Production of Construction Works during the Implementation of Construction 3D Printers
Authors: Azizakhanim Maharramli
Abstract:
The gradual transition from entrenched traditional technology and organization of construction production to innovative additive construction technology inevitably meets technological, technical, organizational, labour and, finally, social difficulties. The nodal method chosen here helps to eliminate the above difficulties by combining it with some of the usual methods of construction, and it counters the widespread perception in world practice that the labour force will be subjected to a strong wave of reduction. The nodal method of additive technology will create favourable conditions for an optimal distribution of labour across facilities, owing to the consistent performance of homogeneous work and the introduction of both additive and traditional technology into construction production.
Keywords: parallel method, sequential method, stream method, combined method, nodal method
Procedia PDF Downloads 94
5295 Quantum Cryptography: Classical Cryptography Algorithms’ Vulnerability State as Quantum Computing Advances
Authors: Tydra Preyear, Victor Clincy
Abstract:
Quantum computing presents many computational advantages over classical computing methods due to the utilization of quantum mechanics. The capability of this computing infrastructure poses threats to standard cryptographic systems such as RSA and AES, which are designed for classical computing environments. This paper discusses the impact that quantum computing has on cryptography, while focusing on the evolution from classical cryptographic concepts to quantum and post-quantum cryptographic concepts. Standard Cryptography is essential for securing data by utilizing encryption and decryption methods, and these methods face vulnerability problems due to the advancement of quantum computing. In order to counter these vulnerabilities, the methods that are proposed are quantum cryptography and post-quantum cryptography. Quantum cryptography uses principles such as the uncertainty principle and photon polarization in order to provide secure data transmission. In addition, the concept of Quantum key distribution is introduced to ensure more secure communication channels by distributing cryptographic keys. There is the emergence of post-quantum cryptography which is used for improving cryptographic algorithms in order to be more secure from attacks by classical and quantum computers. Throughout this exploration, the paper mentions the critical role of the advancement of cryptographic methods to keep data integrity and privacy safe from quantum computing concepts. Future research directions that would be discussed would be more effective cryptographic methods through the advancement of technology.Keywords: quantum computing, quantum cryptography, cryptography, data integrity and privacy
Procedia PDF Downloads 25
5294 Efficient DCT Architectures
Authors: Mr. P. Suryaprasad, R. Lalitha
Abstract:
This paper presents area- and delay-efficient architectures for the implementation of the one-dimensional and two-dimensional discrete cosine transform (DCT). The architectures support different transform lengths (4, 8, 16, and 32). DCT blocks are used in various video coding standards for image compression. The 2D-DCT calculation exploits the separability property of the 2D-DCT, such that the whole architecture is divided into two 1D-DCT calculations connected by a transpose buffer. Based on the existing 1D-DCT architecture, two different types of 2D-DCT architectures, folded and parallel, are implemented. Both structures use the same transpose buffer. The proposed transpose buffer occupies less area and achieves higher speed than the existing transpose buffer. Hence the area, power and delay of both 2D-DCT architectures are reduced.
Keywords: transposition buffer, video compression, discrete cosine transform, high efficiency video coding, two dimensional picture
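A minimal software sketch (Python) of the separability idea behind the hardware: a row-wise 1D DCT, a transpose (the role of the transpose buffer), and a second 1D DCT. The 8x8 block and the use of scipy are illustrative; this is the mathematical principle, not the paper's hardware design.

import numpy as np
from scipy.fft import dct

def dct_2d_separable(block):
    # Row-wise 1D DCT, then a "transpose buffer", then a 1D DCT over the former columns.
    rows = dct(block, type=2, norm="ortho", axis=1)
    transposed = rows.T                      # plays the role of the transpose buffer
    return dct(transposed, type=2, norm="ortho", axis=1).T

block = np.random.default_rng(0).random((8, 8))
reference = dct(dct(block, norm="ortho", axis=0), norm="ortho", axis=1)  # direct 2D computation
print(np.allclose(dct_2d_separable(block), reference))  # True: separability holds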
Procedia PDF Downloads 521
5293 Analytical Formulae for the Approach Velocity Head Coefficient
Authors: Abdulrahman Abdulrahman
Abstract:
Critical depth meters, such as the broad-crested weir, the Venturi flume and the combined control flume, are standard devices for measuring flow in open channels. The discharge relation for these devices cannot be solved directly; it needs an iterative process to account for the approach velocity head. In this paper, an analytical solution was developed to calculate the discharge in a combined critical-depth meter, namely a hump combined with a lateral contraction in a rectangular channel with subcritical approach flow, including energy losses. Analytical formulae were also derived for the approach velocity head coefficient for different types of critical depth meters. The solution was derived by solving a standard cubic equation, considering energy loss, on the basis of a trigonometric identity. The advantage of this technique is that it avoids the iterative process normally adopted in measuring flow with these devices. Numerical examples are chosen to demonstrate the proposed solution.
Keywords: broad crested weir, combined control meter, control structures, critical flow, discharge measurement, flow control, hydraulic engineering, hydraulic structures, open channel flow
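A minimal Python sketch of the general technique named above (a cubic solved by the trigonometric identity), applied here to the plain specific-energy equation of a rectangular channel; the unit discharge and energy values are illustrative, and the authors' formulation with energy losses and lateral contraction is not reproduced.

import math

def cubic_real_roots_trig(a, b, c):
    """Real roots of y^3 + a*y^2 + b*y + c = 0 via the trigonometric identity
    (valid when all three roots are real, as for the specific-energy cubic)."""
    p = b - a * a / 3.0
    q = 2.0 * a**3 / 27.0 - a * b / 3.0 + c
    m = 2.0 * math.sqrt(-p / 3.0)
    theta = math.acos(3.0 * q / (p * m)) / 3.0
    return sorted(m * math.cos(theta - 2.0 * math.pi * k / 3.0) - a / 3.0 for k in range(3))

g = 9.81
q_w = 1.5   # unit discharge, m^2/s (illustrative)
E = 1.2     # specific energy, m (illustrative)
# Depths satisfy y^3 - E*y^2 + q_w^2/(2g) = 0; no iteration is needed.
roots = cubic_real_roots_trig(a=-E, b=0.0, c=q_w**2 / (2.0 * g))
positive = [round(r, 3) for r in roots if r > 0]
print("supercritical / subcritical depths (m):", positive)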
Procedia PDF Downloads 274
5292 AC Voltage Regulators Using Single Phase Matrix Converter
Authors: Nagaraju Jarugu, B. R. Narendra
Abstract:
This paper focuses on boost rectification by a single phase matrix converter with a reduced number of switches. The conventional matrix converter consists of 4 bidirectional switches, i.e., 8 IGBTs/MOSFETs with anti-parallel diodes. In the proposed matrix converter, only six switches are used. The switch commutation arrangements are also worked out in this paper. The SPMC topology has many advantages, such as minimal use of passive devices. It is very flexible and can be used to realize many types of converters. The gate pulses to the switches are provided by PWM techniques. The duty ratio of the switches, based on the Pulse Width Modulation (PWM) technique, was used to produce the output waveform of the circuit, simply by turning the switches ON and OFF. Simulation results using MATLAB/Simulink are provided to validate the feasibility of the proposed method.
Keywords: single phase matrix converter, reduced switches, AC voltage regulators, boost rectifier operation
Procedia PDF Downloads 1188
5291 Using High Performance Computing for Online Flood Monitoring and Prediction
Authors: Stepan Kuchar, Martin Golasowski, Radim Vavrik, Michal Podhoranyi, Boris Sir, Jan Martinovic
Abstract:
The main goal of this article is to describe the online flood monitoring and prediction system Floreon+ primarily developed for the Moravian-Silesian region in the Czech Republic and the basic process it uses for running automatic rainfall-runoff and hydrodynamic simulations along with their calibration and uncertainty modeling. It takes a long time to execute such process sequentially, which is not acceptable in the online scenario, so the use of high-performance computing environment is proposed for all parts of the process to shorten their duration. Finally, a case study on the Ostravice river catchment is presented that shows actual durations and their gain from the parallel implementation.Keywords: flood prediction process, high performance computing, online flood prediction system, parallelization
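A minimal Python sketch of the parallelization pattern described above, running independent rainfall-runoff simulations concurrently; the function body, parameter sets, and worker count are illustrative stand-ins, not the Floreon+ code.

from concurrent.futures import ProcessPoolExecutor
import time

def run_simulation(params):
    # Stand-in for one rainfall-runoff / hydrodynamic model run with a given parameter set.
    time.sleep(0.1)                      # pretend this is the expensive model
    return {"params": params, "peak_flow_m3s": 100.0 + 5.0 * params["roughness"]}

if __name__ == "__main__":
    # For example, one run per calibration or uncertainty-ensemble member.
    parameter_sets = [{"roughness": r / 10.0} for r in range(32)]
    with ProcessPoolExecutor(max_workers=8) as pool:   # workers map naturally to HPC cores
        results = list(pool.map(run_simulation, parameter_sets))
    print(len(results), "ensemble members completed")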
Procedia PDF Downloads 492
5290 Nudge Plus: Incorporating Reflection into Behavioural Public Policy
Authors: Sanchayan Banerjee, Peter John
Abstract:
Nudge plus is a modification of the toolkit of behavioural public policy. It incorporates an element of reflection, the plus, into the delivery of a nudge, either blended in or made proximate. Nudge plus builds on recent work combining heuristics and deliberation. It may be used to design pro-social interventions that help preserve the autonomy of the agent. The argument turns on seminal work on dual systems, which presents a subtler relationship between fast and slow thinking than commonly assumed in the classic literature in behavioural public policy. We review classic and recent work on dual processes to show that a hybrid is more plausible than the default interventionist or parallel competitive framework. We define nudge plus, set out what reflection could entail, provide examples, outline causal mechanisms, and draw testable implications.
Keywords: nudge, nudge plus, think, dual process theory
Procedia PDF Downloads 192
5289 A LED Warning Vest as Safety Smart Textile and Active Cooperation in a Working Group for Building a Normative Standard
Authors: Werner Grommes
Abstract:
The institute of occupational safety and health works in a working group for building a normative standard for illuminated warning vests and did a lot of experiments and measurements as basic work (cooperation). Intelligent car headlamps are able to suppress conventional warning vests with retro-reflective stripes as a disturbing light. Illuminated warning vests are therefore required for occupational safety. However, they must not pose any danger to the wearer or other persons. Here, the risks of the batteries (lithium types), the maximum brightness (glare) and possible interference radiation from the electronics on the implant carrier must be taken into account. The all-around visibility, as well as the required range, play an important role here. For the study, many luminance measurements of already commercially available LEDs and electroluminescent warning vests, as well as their electromagnetic interference fields and aspects of electrical safety, were measured. The results of this study showed that LED lighting is all far too bright and causes strong glare. The integrated controls with pulse modulation and switching regulators cause electromagnetic interference fields. Rechargeable lithium batteries can explode depending on the temperature range. Electroluminescence brings even more hazards. A test method was developed for the evaluation of visibility at distances of 50, 100, and 150 m, including the interview of test persons. A measuring method was developed for the detection of glare effects at close range with the assignment of the maximum permissible luminance. The electromagnetic interference fields were tested in the time and frequency ranges. A risk and hazard analysis were prepared for the use of lithium batteries. The range of values for luminance and risk analysis for lithium batteries were discussed in the standards working group. These will be integrated into the standard. This paper gives a brief overview of the topics of illuminated warning vests, which takes into account the risks and hazards for the vest wearer or othersKeywords: illuminated warning vest, optical tests and measurements, risks, hazards, optical glare effects, LED, E-light, electric luminescent
Procedia PDF Downloads 113
5288 A Case of Prosthetic Vascular-Graft Infection Due to Mycobacterium fortuitum
Authors: Takaaki Nemoto
Abstract:
Case presentation: A 69-year-old Japanese man presented with a low-grade fever and fatigue that had persisted for one month. The patient had an aortic dissection on the aortic arch 13 years prior, an abdominal aortic aneurysm seven years prior, and an aortic dissection on the distal aortic arch one year prior, which were all treated with artificial blood-vessel replacement surgery. Laboratory tests revealed an inflammatory response (CRP 7.61 mg/dl), high serum creatinine (Cr 1.4 mg/dL), and elevated transaminase (AST 47 IU/L, ALT 45 IU/L). The patient was admitted to our hospital on suspicion of prosthetic vascular graft infection. Following further workups on the inflammatory response, an enhanced chest computed tomography (CT) and a non-enhanced chest DWI (MRI) were performed. The patient was diagnosed with a pulmonary fistula and a prosthetic vascular graft infection on the distal aortic arch. After admission, the patient was administered Ceftriaxion and Vancomycine for 10 days, but his fever and inflammatory response did not improve. On day 13 of hospitalization, a lung fistula repair surgery and an omental filling operation were performed, and Meropenem and Vancomycine were administered. The fever and inflammatory response continued, and therefore we took repeated blood cultures. M. fortuitum was detected in a blood culture on day 16 of hospitalization. As a result, we changed the treatment regimen to Amikacin (400 mg/day), Meropenem (2 g/day), and Cefmetazole (4 g/day), and the fever and inflammatory response began to decrease gradually. We performed a test of sensitivity for Mycobacterium fortuitum, and found that the MIC was low for fluoroquinolone antibacterial agent. The clinical course was good, and the patient was discharged after a total of 8 weeks of intravenous drug administration. At discharge, we changed the treatment regimen to Levofloxacin (500 mg/day) and Clarithromycin (800 mg/day), and prescribed these two drugs as a long life suppressive therapy. Discussion: There are few cases of prosthetic vascular graft infection caused by mycobacteria, and a standard therapy remains to be established. For prosthetic vascular graft infections, it is ideal to provide surgical and medical treatment in parallel, but in this case, surgical treatment was difficult and, therefore, a conservative treatment was chosen. We attempted to increase the treatment success rate of this refractory disease by conducting a susceptibility test for mycobacteria and treating with different combinations of antimicrobial agents, which was ultimately effective. With our treatment approach, a good clinical course was obtained and continues at the present stage. Conclusion: Although prosthetic vascular graft infection resulting from mycobacteria is a refractory infectious disease, it may be curative to administer appropriate antibiotics based on the susceptibility test in addition to surgical treatment.Keywords: prosthetic vascular graft infection, lung fistula, Mycobacterium fortuitum, conservative treatment
Procedia PDF Downloads 156
5287 Application of Fuzzy Logic to Design and Coordinate Parallel Behaviors for a Humanoid Mobile Robot
Authors: Nguyen Chan Hung, Mai Ngoc Anh, Nguyen Xuan Ha, Tran Xuan Duc, Dang Bao Lam, Nguyen Hoang Viet
Abstract:
This paper presents the design and implementation of a navigation controller for a humanoid mobile robot platform operating in indoor office environments. In order to fulfil the requirement of recognizing and approaching humans to provide service while avoiding random obstacles, a behavior-based fuzzy logic controller was designed to coordinate multiple behaviors simultaneously. Experiments in a real office environment showed that the fuzzy controller deals well with complex scenarios without colliding with random objects or humans.
Keywords: behavior control, fuzzy logic, humanoid robot, mobile robot
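A minimal Python sketch of behavior-based fuzzy coordination (an approach-person behavior and an avoid-obstacle behavior blended by fuzzy activation levels); the membership shapes, gains, and blending rule are illustrative assumptions, not the controller described in the paper.

def tri(x, a, b, c):
    # Triangular membership function.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_navigation_command(person_bearing_deg, obstacle_distance_m):
    # Behavior 1: approach the person -> steer toward the bearing.
    approach_turn = 0.5 * person_bearing_deg
    # Behavior 2: avoid the obstacle -> turn away from the person's side and slow down.
    avoid_turn = -40.0 if person_bearing_deg >= 0 else 40.0

    # Fuzzy activation levels (illustrative membership shapes).
    obstacle_near = tri(obstacle_distance_m, 0.0, 0.3, 1.0)
    obstacle_far = 1.0 - obstacle_near

    # Weighted (defuzzified) blend of the two parallel behaviors.
    turn_deg = obstacle_far * approach_turn + obstacle_near * avoid_turn
    speed_ms = 0.6 * obstacle_far            # slows as obstacles get close
    return turn_deg, speed_ms

print(fuzzy_navigation_command(person_bearing_deg=20.0, obstacle_distance_m=0.5))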
Procedia PDF Downloads 419
5286 Ternary Content Addressable Memory Cell with a Leakage Reduction Technique
Authors: Gagnesh Kumar, Nitin Gupta
Abstract:
Ternary Content Addressable Memory cells are popular mainly in network routers for packet forwarding and packet classification, but they are also useful in a variety of other applications that require high-speed table look-up. The main TCAM design challenge is to decrease the power consumption associated with the large amount of parallel active circuitry without compromising speed or memory density. Furthermore, as the channel length decreases, leakage power becomes more significant and can even dominate dynamic power at smaller technology nodes. In this paper, we propose a TCAM design technique, called the Virtual Power Supply technique, that reduces the leakage by a substantial amount.
Keywords: match line (ML), search line (SL), ternary content addressable memory (TCAM), leakage power (LP)
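A minimal behavioral model in Python of what a TCAM look-up does (ternary value/mask entries compared in parallel, with priority on a hit); this only illustrates the function of the memory, not the proposed cell or its leakage-reduction circuitry, and the table entries are illustrative.

# Each entry stores a (value, mask) pair; a mask bit of 0 means "don't care" for that position.
TCAM_TABLE = [
    (0b1010_0000, 0b1111_0000, "forward to port 1"),   # matches 1010xxxx
    (0b1010_1100, 0b1111_1111, "forward to port 2"),   # exact match
    (0b0000_0000, 0b0000_0000, "default route"),       # matches anything
]

def tcam_lookup(key):
    # In hardware every row compares simultaneously on its match line;
    # here we emulate that and apply priority (lowest index wins).
    for value, mask, action in TCAM_TABLE:
        if (key & mask) == (value & mask):
            return action
    return None

print(tcam_lookup(0b1010_0111))   # "forward to port 1"
print(tcam_lookup(0b1010_1100))   # still "forward to port 1": the wildcard entry has higher priority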
Procedia PDF Downloads 299
5285 Onco@Home: Comparing the Costs, Revenues, and Patient Experience of Cancer Treatment at Home with the Standard of Care
Authors: Sarah Misplon, Wim Marneffe, Johan Helling, Jana Missiaen, Inge Decock, Dries Myny, Steve Lervant, Koen Vaneygen
Abstract:
The aim of this study was twofold. First, we investigated whether the current funding from the national health insurance (NHI) of home hospitalization (HH) for oncological patients is sufficient in Belgium. Second, we compared patient’s experiences and preferences of HH to the standard of care (SOC). Two HH models were examined in three Belgian hospitals and three home nursing organizations. In a first HH model, the blood draw and monitoring prior to intravenous therapy were performed by a trained home nurse at the patient’s home the day before the visit to the day hospital. In a second HH model, the administration of two subcutaneous treatments was partly provided at home instead of in the hospital. Therefore, we conducted (1) a bottom-up micro-costing study to compare the costs and revenues for the providers (hospitals and home care organizations), and (2) a cross-sectional survey to compare patient’s experiences and preferences of the SOC group and the HH group. Our results show that HH patients prefer HH and none of them wanted to return to SOC, although the satisfaction of patients was not significantly different between the two categories. At the same time, we find that costs associated to HH are higher overall. Comparing revenues with costs, we conclude that the current funding from NHI of HH for oncological patients is insufficient.Keywords: cost analysis, health insurance, preference, home hospitalization
Procedia PDF Downloads 122
5284 A New Asset: The Role of Money in the Evolution of 20th Century Street Art
Authors: Eileen Kim
Abstract:
As socioeconomic disparities grew in New York during the 1970s, artists represented new values that came with the times. Street art, in particular, was birthed from a distinctly urban, fringe setting to ultimately become one of the most lucrative forms of art today. Examining the economic and psychological reasons behind the rise of street art, this paper delves into the development of the art market as a parallel insight into human behaviors and economic models such as supply and demand. The purpose of this study is to show the role of the increasingly divided socioeconomic classes and the rise of art collecting as an asset-building form. This study concludes that the iconography and market value of street art represented distinct values that came from a series of intertwined social matters such as racial tensions and revolutions in industrial innovation.Keywords: art industry, cultural representation, ethnicity, markets, public property, social classes, street art
Procedia PDF Downloads 230