
kobart-news

Usage

Python Code

from transformers import PreTrainedTokenizerFast, BartForConditionalGeneration
# Load Model and Tokenizer
tokenizer = PreTrainedTokenizerFast.from_pretrained("ainize/kobart-news")
model = BartForConditionalGeneration.from_pretrained("ainize/kobart-news")
# Encode Input Text (a Korean news article used as the summarization example)
input_text = '국내 전반적인 경기침체로 상가 건물주의 수익도 전국적인 감소세를 보이고 있는 것으로 나타났다. 수익형 부동산 연구개발기업 상가정보연구소는 한국감정원 통계를 분석한 결과 전국 중대형 상가 순영업소득(부동산에서 발생하는 임대수입, 기타수입에서 제반 경비를 공제한 순소득)이 1분기 ㎡당 3만4200원에서 3분기 2만5800원으로 감소했다고 17일 밝혔다. 수도권, 세종시, 지방광역시에서 순영업소득이 가장 많이 감소한 지역은 3분기 1만3100원을 기록한 울산으로, 1분기 1만9100원 대비 31.4% 감소했다. 이어 대구(-27.7%), 서울(-26.9%), 광주(-24.9%), 부산(-23.5%), 세종(-23.4%), 대전(-21%), 경기(-19.2%), 인천(-18.5%) 순으로 감소했다. 지방 도시의 경우도 비슷했다. 경남의 3분기 순영업소득은 1만2800원으로 1분기 1만7400원 대비 26.4% 감소했으며 제주(-25.1%), 경북(-24.1%), 충남(-20.9%), 강원(-20.9%), 전남(-20.1%), 전북(-17%), 충북(-15.3%) 등도 감소세를 보였다. 조현택 상가정보연구소 연구원은 "올해 내수 경기의 침체된 분위기가 유지되며 상가, 오피스 등을 비롯한 수익형 부동산 시장의 분위기도 경직된 모습을 보였고 오피스텔, 지식산업센터 등의 수익형 부동산 공급도 증가해 공실의 위험도 늘었다"며 "실제 올 3분기 전국 중대형 상가 공실률은 11.5%를 기록하며 1분기 11.3% 대비 0.2% 포인트 증가했다"고 말했다. 그는 "최근 소셜커머스(SNS를 통한 전자상거래), 음식 배달 중개 애플리케이션, 중고 물품 거래 애플리케이션 등의 사용 증가로 오프라인 매장에 영향을 미쳤다"며 "향후 지역, 콘텐츠에 따른 상권 양극화 현상은 심화될 것으로 보인다"고 덧붙였다.'
input_ids = tokenizer.encode(input_text, return_tensors="pt")
# Generate Summary Text Ids
summary_text_ids = model.generate(
    input_ids=input_ids,
    bos_token_id=model.config.bos_token_id,
    eos_token_id=model.config.eos_token_id,
    length_penalty=2.0,
    max_length=142,
    min_length=56,
    num_beams=4,
)
# Decode Generated Summary Text
print(tokenizer.decode(summary_text_ids[0], skip_special_tokens=True))
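
If you want to summarize more than one article, the calls above can be wrapped in a small helper. The sketch below simply reuses the model, tokenizer, and generation settings already loaded above; the summarize function name and its default arguments are illustrative and not part of the original example.

def summarize(text, max_length=142, min_length=56):
    # Tokenize the article and generate a summary with beam search,
    # using the same generation settings as the example above
    input_ids = tokenizer.encode(text, return_tensors="pt")
    summary_ids = model.generate(
        input_ids=input_ids,
        bos_token_id=model.config.bos_token_id,
        eos_token_id=model.config.eos_token_id,
        length_penalty=2.0,
        max_length=max_length,
        min_length=min_length,
        num_beams=4,
    )
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)

print(summarize(input_text))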

API and Demo

You can experience this model through ainize-api and ainize-demo.
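
If you would rather call a hosted endpoint than load the model locally, an HTTP request would look roughly like the sketch below. The endpoint URL and the JSON field names are placeholders, not the actual Ainize API schema; check the ainize-api page for the real route and request format.

import requests

# NOTE: placeholder endpoint and payload - the real Ainize API route and
# field names may differ; see the ainize-api link above
API_URL = "https://example.endpoint/ainize/kobart-news/summarize"  # hypothetical
response = requests.post(API_URL, json={"text": input_text}, timeout=30)
response.raise_for_status()
print(response.json())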
