Kineto Profiler for ipex 2.5.10+xpu with llm #777

Open
lostkingdom4 opened this issue Jan 25, 2025 · 0 comments
Describe the issue

I was trying to use the Kineto profiler with IPEX LLM installed from source. I ran the profiler with the following code:

import torch
import torch.nn as nn
from transformers.models.roberta.modeling_roberta import RobertaSelfAttention
import intel_extension_for_pytorch as ipex
from intel_extension_for_pytorch.transformers.models.xpu.optimize_transformers.modules.bert import NewIPEXBertSelfAttention
from transformers import RobertaConfig

from datetime import datetime
import os
import logging

from torch.profiler import profile, ProfilerActivity
import time

from newipexatten_run import GraphModule

# Set default dtype
torch.set_default_dtype(torch.float16)


def main():
    config = RobertaConfig()

    attention_layer = RobertaSelfAttention(config)

    for param in attention_layer.parameters():
        param.requires_grad = False

    attention_layer = attention_layer.eval().to('xpu')

    # Test inputs
    batch_size = 2
    seq_length = 10
    hidden_size = config.hidden_size

    with torch.no_grad():
        hidden_states = torch.rand(batch_size, seq_length, hidden_size).to('xpu').detach()
        attention_mask = torch.ones(batch_size, 1, 1, seq_length).to('xpu').detach()
        # Warm up before profiling so the profiled pass excludes one-time setup costs
        for _ in range(10):
            attention_layer(
                hidden_states=hidden_states,
                attention_mask=attention_mask,
                head_mask=None,
                encoder_hidden_states=None,
                encoder_attention_mask=None,
                past_key_value=None,
                output_attentions=False,
            )

        # Profile a single forward pass on both CPU and XPU
        with profile(activities=[ProfilerActivity.CPU, ProfilerActivity.XPU]) as prof:
            start_time = time.time()
            outputs = attention_layer(
                        hidden_states=hidden_states,
                        attention_mask=attention_mask,
                        head_mask=None,
                        encoder_hidden_states=None,
                        encoder_attention_mask=None,
                        past_key_value=None,
                        output_attentions=False,
                    )
            end_time = time.time()
        print(f"forward pass latency: {end_time - start_time:.6f} s")
        print(prof.key_averages().table())


if __name__ == "__main__":
    main()

I got the following error:

Traceback (most recent call last):
  File "/home/ubuntu/project/Data_gen/roberta.py", line 362, in <module>
    main()
  File "/home/ubuntu/project/Data_gen/roberta.py", line 83, in main
    with profile(activities=[ProfilerActivity.CPU, ProfilerActivity.XPU])as prof:
  File "/home/ubuntu/.conda/envs/llm2/lib/python3.10/site-packages/torch/profiler/profiler.py", line 744, in __enter__
    self.start()
  File "/home/ubuntu/.conda/envs/llm2/lib/python3.10/site-packages/torch/profiler/profiler.py", line 754, in start
    self._transit_action(ProfilerAction.NONE, self.current_action)
  File "/home/ubuntu/.conda/envs/llm2/lib/python3.10/site-packages/torch/profiler/profiler.py", line 793, in _transit_action
    action()
  File "/home/ubuntu/.conda/envs/llm2/lib/python3.10/site-packages/torch/profiler/profiler.py", line 168, in prepare_trace
    self.profiler._prepare_trace()
  File "/home/ubuntu/.conda/envs/llm2/lib/python3.10/site-packages/torch/autograd/profiler.py", line 331, in _prepare_trace
    _prepare_profiler(self.config(), self.kineto_activities)
RuntimeError: Fail to enable Kineto Profiler on XPU due to error code: 200

Does building from source enable the Kineto profiler on XPU by default, or does it require an extra build option?
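
In case it helps triage, here is a minimal sketch I would use to check what the build reports. It relies only on the documented torch.profiler.supported_activities() and torch.xpu.is_available() APIs; the CPU-only run is there just to confirm the failure is specific to the XPU side, not the profiler as a whole:

import torch
from torch.profiler import profile, supported_activities, ProfilerActivity

# Does this build advertise XPU profiling support at all?
print("XPU device available:", torch.xpu.is_available())
print("Supported profiler activities:", supported_activities())

# A CPU-only profile should succeed even without Kineto XPU support,
# which would isolate the failure to the XPU plugin.
with profile(activities=[ProfilerActivity.CPU]) as prof:
    torch.randn(1024, 1024) @ torch.randn(1024, 1024)
print(prof.key_averages().table(sort_by="cpu_time_total", row_limit=5))
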

Thanks!
